dataloadgen

v0.0.5 Published: Aug 6, 2023 License: MIT Imports: 6 Imported by: 16

README

dataloadgen

dataloadgen is an implementation of a pattern popularized by Facebook's Dataloader.

It works as follows:

  • A Loader object is created per graphql request.
  • Each of many concurrently executing graphql resolver functions calls Load() on the Loader object with a different key. Let's say K1, K2, K3
  • Each call to Load() with a new key is delayed slightly (a few milliseconds) so that the Loader can load them together.
  • The customizable fetch function of the loader takes a list of keys and loads data for all of them in a single batched request to the data storage layer. It might send [K1,K2,K3] and get back [V1,V2,V3].
  • The Loader takes care of sending the right result to the right caller, and each result is cached for the duration of the graphql request (see the batching sketch after the usage example below).

Usage:

go get github.com/vikstrous/dataloadgen

See the usage example in the documentation:

package main

import (
	"context"
	"fmt"
	"strconv"

	"github.com/vikstrous/dataloadgen"
)

// fetchFn is shown as a function here, but it might work better as a method
func fetchFn(keys []string) (ret []int, errs []error) {
	for _, key := range keys {
		num, err := strconv.ParseInt(key, 10, 32)
		ret = append(ret, int(num))
		errs = append(errs, err)
	}
	return
}

func main() {
	ctx := context.Background()
	// Per-request setup code:
	loader := dataloadgen.NewLoader(fetchFn)
	// In every graphql resolver:
	result, err := loader.Load(ctx, "1")
	if err != nil {
		panic(err)
	}
	fmt.Println(result)
}
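
To see the batching in action, here is a minimal sketch in which three goroutines stand in for concurrently executing resolvers. Assuming their Load calls land within the wait window, the loader collects the keys and calls the fetch function once with all of them.

package main

import (
	"context"
	"fmt"
	"strconv"
	"sync"

	"github.com/vikstrous/dataloadgen"
)

func main() {
	ctx := context.Background()
	loader := dataloadgen.NewLoader(func(keys []string) ([]int, []error) {
		// With batching, this is typically called once with all collected keys.
		fmt.Println("fetch called with:", keys)
		ret := make([]int, len(keys))
		errs := make([]error, len(keys))
		for i, key := range keys {
			n, err := strconv.Atoi(key)
			ret[i] = n
			errs[i] = err
		}
		return ret, errs
	})

	// Three "resolvers" load different keys concurrently.
	var wg sync.WaitGroup
	for _, key := range []string{"1", "2", "3"} {
		wg.Add(1)
		go func(k string) {
			defer wg.Done()
			v, err := loader.Load(ctx, k)
			if err != nil {
				fmt.Println("error loading", k, err)
				return
			}
			fmt.Println(k, "->", v)
		}(key)
	}
	wg.Wait()
}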

Comparison to others

dataloaden uses code generation and has similar performance. dataloader does not use code generation, but it has much worse performance and is more difficult to use.

Benchmarks show that this package is faster than both of the above and I find it easier to use.

BenchmarkDataloader/caches-8                             4152324               270.3 ns/op           168 B/op          5 allocs/op
BenchmarkDataloader/random_spread-8                      1000000              1281 ns/op             626 B/op         11 allocs/op
BenchmarkDataloader/concurently-8                          33159             55575 ns/op           32649 B/op        160 allocs/op
BenchmarkDataloader/all_in_one_request-8                   10000           7556166 ns/op         2574411 B/op      60032 allocs/op

BenchmarkDataloaden/caches-8                            17960090                67.73 ns/op           24 B/op          1 allocs/op
BenchmarkDataloaden/random_spread-8                      1223949               955.0 ns/op           279 B/op          5 allocs/op
BenchmarkDataloaden/concurently-8                          27093             43594 ns/op            2867 B/op         76 allocs/op
BenchmarkDataloaden/all_in_one_request-8                   10000           1410499 ns/op          487876 B/op      10007 allocs/op

BenchmarkDataloadgen/caches-8                           22032517                53.61 ns/op            8 B/op          0 allocs/op
BenchmarkDataloadgen/random_spread-8                     2558128               483.7 ns/op           287 B/op          4 allocs/op
BenchmarkDataloadgen/concurently-8                         31900             34903 ns/op            2906 B/op         71 allocs/op
BenchmarkDataloadgen/all_in_one_request-8                  10000           1032841 ns/op          573619 B/op          7 allocs/op

Documentation

Index

Examples

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type ErrorSlice added in v0.0.5

type ErrorSlice []error

ErrorSlice represents a list of errors that contains at least one error

func (ErrorSlice) Error added in v0.0.5

func (e ErrorSlice) Error() string

Error implements the error interface

type Loader

type Loader[KeyT comparable, ValueT any] struct {
	// contains filtered or unexported fields
}

Loader batches and caches requests

Example
package main

import (
	"context"
	"fmt"
	"strconv"
	"time"

	"github.com/vikstrous/dataloadgen"
)

func main() {
	ctx := context.Background()
	loader := dataloadgen.NewLoader(func(keys []string) (ret []int, errs []error) {
		for _, key := range keys {
			num, err := strconv.ParseInt(key, 10, 32)
			ret = append(ret, int(num))
			errs = append(errs, err)
		}
		return
	},
		dataloadgen.WithBatchCapacity(1),
		dataloadgen.WithWait(16*time.Millisecond),
	)
	one, err := loader.Load(ctx, "1")
	if err != nil {
		panic(err)
	}
	fmt.Println(one)
}
Output:

1

func NewLoader

func NewLoader[KeyT comparable, ValueT any](fetch func(keys []KeyT) ([]ValueT, []error), options ...Option) *Loader[KeyT, ValueT]

NewLoader creates a new Loader given a fetch function. The batch wait time and capacity can be adjusted with Options.

func (*Loader[KeyT, ValueT]) Clear

func (l *Loader[KeyT, ValueT]) Clear(key KeyT)

Clear the value at key from the cache, if it exists

func (*Loader[KeyT, ValueT]) Load

func (l *Loader[KeyT, ValueT]) Load(ctx context.Context, key KeyT) (ValueT, error)

Load a ValueT by key, batching and caching will be applied automatically

func (*Loader[KeyT, ValueT]) LoadAll

func (l *Loader[KeyT, ValueT]) LoadAll(ctx context.Context, keys []KeyT) ([]ValueT, error)

LoadAll fetches many keys at once. It will be broken into appropriate sized sub batches depending on how the loader is configured
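
A minimal sketch of LoadAll, assuming the same numeric fetch function as in the README example; with the default unbounded batch capacity, all three keys are fetched in one batch and the values come back in key order.

package main

import (
	"context"
	"fmt"
	"strconv"

	"github.com/vikstrous/dataloadgen"
)

func main() {
	loader := dataloadgen.NewLoader(func(keys []string) (ret []int, errs []error) {
		for _, key := range keys {
			n, err := strconv.Atoi(key)
			ret = append(ret, n)
			errs = append(errs, err)
		}
		return
	})

	values, err := loader.LoadAll(context.Background(), []string{"1", "2", "3"})
	if err != nil {
		panic(err)
	}
	fmt.Println(values) // [1 2 3]
}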

func (*Loader[KeyT, ValueT]) LoadAllThunk

func (l *Loader[KeyT, ValueT]) LoadAllThunk(ctx context.Context, keys []KeyT) func() ([]ValueT, error)

LoadAllThunk returns a function that when called will block waiting for a ValueT. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*Loader[KeyT, ValueT]) LoadThunk

func (l *Loader[KeyT, ValueT]) LoadThunk(ctx context.Context, key KeyT) func() (ValueT, error)

LoadThunk returns a function that when called will block waiting for a ValueT. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.
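
A minimal sketch of the thunk pattern, assuming a trivial identity fetch: both loads are queued up front without blocking, and each thunk blocks only when it is called.

package main

import (
	"context"
	"fmt"

	"github.com/vikstrous/dataloadgen"
)

func main() {
	ctx := context.Background()
	loader := dataloadgen.NewLoader(func(keys []string) ([]string, []error) {
		return keys, make([]error, len(keys)) // identity fetch, no errors
	})

	// Queue both loads without blocking.
	thunkA := loader.LoadThunk(ctx, "a")
	thunkB := loader.LoadThunk(ctx, "b")

	// ...queue loads on other data loaders here...

	// Calling the thunks blocks until the batched fetch has completed.
	a, _ := thunkA()
	b, _ := thunkB()
	fmt.Println(a, b) // a b
}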

func (*Loader[KeyT, ValueT]) Prime

func (l *Loader[KeyT, ValueT]) Prime(key KeyT, value ValueT) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key) and then call loader.Prime(key, value).)
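
A minimal sketch of priming, assuming string keys and int values; a primed key is served from the cache, so the fetch function is not invoked for it.

package main

import (
	"context"
	"fmt"

	"github.com/vikstrous/dataloadgen"
)

func main() {
	ctx := context.Background()
	loader := dataloadgen.NewLoader(func(keys []string) ([]int, []error) {
		// Not invoked in this sketch, because the only key loaded is primed.
		return make([]int, len(keys)), make([]error, len(keys))
	})

	// Seed the cache so Load("42") returns immediately.
	loader.Prime("42", 42)
	v, err := loader.Load(ctx, "42")
	fmt.Println(v, err) // 42 <nil>

	// To replace a cached value, clear the key first, then prime again.
	loader.Clear("42")
	loader.Prime("42", 100)
}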

type Option added in v0.0.2

type Option func(*loaderConfig)

Option allows for configuration of loader fields.

func WithBatchCapacity added in v0.0.2

func WithBatchCapacity(c int) Option

WithBatchCapacity sets the batch capacity. Default is 0 (unbounded)

func WithTracer added in v0.0.4

func WithTracer(tracer trace.Tracer) Option

func WithWait added in v0.0.2

func WithWait(d time.Duration) Option

WithWait sets the amount of time to wait before triggering a batch. Default duration is 16 milliseconds.
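
A configuration sketch combining the options above. It assumes that trace.Tracer is the OpenTelemetry tracer interface and that the batch capacity caps how many keys are collected before a batch is dispatched.

package main

import (
	"strconv"
	"time"

	"github.com/vikstrous/dataloadgen"
	"go.opentelemetry.io/otel"
)

func main() {
	loader := dataloadgen.NewLoader(
		func(keys []string) (ret []int, errs []error) {
			for _, key := range keys {
				n, err := strconv.Atoi(key)
				ret = append(ret, n)
				errs = append(errs, err)
			}
			return
		},
		// Dispatch a batch once 100 keys have been collected (assumption)...
		dataloadgen.WithBatchCapacity(100),
		// ...or after 2ms of waiting, instead of the 16ms default.
		dataloadgen.WithWait(2*time.Millisecond),
		// Record traces with a tracer from the global OpenTelemetry provider (assumption).
		dataloadgen.WithTracer(otel.Tracer("dataloadgen")),
	)
	_ = loader
}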

Directories

Path Synopsis
benchmark module
