dataloader

package v0.0.0-...-4fdfd55
Published: Sep 1, 2019 License: MIT Imports: 7 Imported by: 0

README

Dataloader

Adding a dataloader
  1. Add an appropriate generation header in loaders.go (see existing examples in the file, and the sketch below)
  2. Run go generate dataloader/loaders.go
  3. Add the new loader to the loader constructor function (NewLoaders)
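
For reference, a generation header is a go:generate directive naming the loader type, key type, and value type. The sketch below is only a guess in the style of github.com/vektah/dataloaden, whose generated API matches the methods documented here; the exact generator invocation and the fully qualified key/value types are assumptions, so copy an existing header from loaders.go rather than this line verbatim.

package dataloader

// Hypothetical header: replace the example module path with this repository's
// module path and match the invocation used by the other headers in loaders.go.
//go:generate go run github.com/vektah/dataloaden CardLoader github.com/example/app/globalid.ID *github.com/example/app/model.Card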

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type AnonymousAliasLoader

type AnonymousAliasLoader struct {
	// contains filtered or unexported fields
}

AnonymousAliasLoader batches and caches requests

func (*AnonymousAliasLoader) Clear

func (l *AnonymousAliasLoader) Clear(key globalid.ID)

Clear the value at key from the cache, if it exists

func (*AnonymousAliasLoader) Load

func (l *AnonymousAliasLoader) Load(key globalid.ID) (*model.AnonymousAlias, error)

Load an anonymousAlias by key; batching and caching will be applied automatically

func (*AnonymousAliasLoader) LoadAll

func (l *AnonymousAliasLoader) LoadAll(keys []globalid.ID) ([]*model.AnonymousAlias, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured
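
For illustration, a bulk-load sketch. The results and errors come back as parallel slices; treating them as aligned one entry per input key is an assumption based on the matching slice shapes in the signature, and the surrounding function and the assumed imports of dataloader, model, and globalid are hypothetical.

func loadAliases(l *dataloader.AnonymousAliasLoader, ids []globalid.ID) []*model.AnonymousAlias {
	aliases, errs := l.LoadAll(ids)
	out := make([]*model.AnonymousAlias, 0, len(aliases))
	for i, alias := range aliases {
		// Skip keys whose fetch failed (assumes errs is aligned with ids).
		if i < len(errs) && errs[i] != nil {
			continue
		}
		out = append(out, alias)
	}
	return out
}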

func (*AnonymousAliasLoader) LoadThunk

func (l *AnonymousAliasLoader) LoadThunk(key globalid.ID) func() (*model.AnonymousAlias, error)

LoadThunk returns a function that when called will block waiting for an anonymousAlias. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.
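
A minimal sketch of that pattern, using two of the loaders documented below; the surrounding function and the assumed imports of dataloader, model, and globalid are hypothetical.

func resolveUserAndCard(users *dataloader.UserLoader, cards *dataloader.CardLoader, userID, cardID globalid.ID) (*model.User, *model.Card, error) {
	// Register both keys without blocking; each call joins its loader's current batch.
	userThunk := users.LoadThunk(userID)
	cardThunk := cards.LoadThunk(cardID)

	// Calling a thunk blocks until the corresponding batched fetch completes.
	user, err := userThunk()
	if err != nil {
		return nil, nil, err
	}
	card, err := cardThunk()
	if err != nil {
		return nil, nil, err
	}
	return user, card, nil
}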

func (*AnonymousAliasLoader) Prime

func (l *AnonymousAliasLoader) Prime(key globalid.ID, value *model.AnonymousAlias) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)
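
A small sketch of the Clear-then-Prime overwrite described above; the surrounding function is hypothetical, and the imports of dataloader, model, and globalid are assumed.

func cacheAlias(l *dataloader.AnonymousAliasLoader, key globalid.ID, alias *model.AnonymousAlias) {
	// Prime returns false and leaves the cache untouched if key is already present.
	if !l.Prime(key, alias) {
		// Force-overwrite: clear the stale entry, then prime the new value.
		l.Clear(key)
		l.Prime(key, alias)
	}
}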

type CardLoader

type CardLoader struct {
	// contains filtered or unexported fields
}

CardLoader batches and caches requests

func (*CardLoader) Clear

func (l *CardLoader) Clear(key globalid.ID)

Clear the value at key from the cache, if it exists

func (*CardLoader) Load

func (l *CardLoader) Load(key globalid.ID) (*model.Card, error)

Load a card by key; batching and caching will be applied automatically

func (*CardLoader) LoadAll

func (l *CardLoader) LoadAll(keys []globalid.ID) ([]*model.Card, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*CardLoader) LoadThunk

func (l *CardLoader) LoadThunk(key globalid.ID) func() (*model.Card, error)

LoadThunk returns a function that when called will block waiting for a card. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*CardLoader) Prime

func (l *CardLoader) Prime(key globalid.ID, value *model.Card) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type DBTimeLoader

type DBTimeLoader struct {
	// contains filtered or unexported fields
}

DBTimeLoader batches and caches requests

func (*DBTimeLoader) Clear

func (l *DBTimeLoader) Clear(key globalid.ID)

Clear the value at key from the cache, if it exists

func (*DBTimeLoader) Load

func (l *DBTimeLoader) Load(key globalid.ID) (*model.DBTime, error)

Load a dBTime by key; batching and caching will be applied automatically

func (*DBTimeLoader) LoadAll

func (l *DBTimeLoader) LoadAll(keys []globalid.ID) ([]*model.DBTime, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*DBTimeLoader) LoadThunk

func (l *DBTimeLoader) LoadThunk(key globalid.ID) func() (*model.DBTime, error)

LoadThunk returns a function that when called will block waiting for a dBTime. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*DBTimeLoader) Prime

func (l *DBTimeLoader) Prime(key globalid.ID, value *model.DBTime) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type FeedEntriesByIDLoader

type FeedEntriesByIDLoader struct {
	// contains filtered or unexported fields
}

FeedEntriesByIDLoader batches and caches requests

func (*FeedEntriesByIDLoader) Clear

func (l *FeedEntriesByIDLoader) Clear(key globalid.ID)

Clear the value at key from the cache, if it exists

func (*FeedEntriesByIDLoader) Load

func (l *FeedEntriesByIDLoader) Load(key globalid.ID) (*model.FeedEntriesByID, error)

Load a feedEntriesByID by key; batching and caching will be applied automatically

func (*FeedEntriesByIDLoader) LoadAll

func (l *FeedEntriesByIDLoader) LoadAll(keys []globalid.ID) ([]*model.FeedEntriesByID, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*FeedEntriesByIDLoader) LoadThunk

func (l *FeedEntriesByIDLoader) LoadThunk(key globalid.ID) func() (*model.FeedEntriesByID, error)

LoadThunk returns a function that when called will block waiting for a feedEntriesByID. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*FeedEntriesByIDLoader) Prime

func (l *FeedEntriesByIDLoader) Prime(key globalid.ID, value *model.FeedEntriesByID) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type IDTimeRangeKey

type IDTimeRangeKey struct {
	// contains filtered or unexported fields
}

func NewIDTimeRangeKey

func NewIDTimeRangeKey(id globalid.ID, from, to time.Time) IDTimeRangeKey
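
IDTimeRangeKey is the key type used by UserEngagementLoader below, pairing an ID with a from/to time range. A minimal usage sketch; the surrounding function and the assumed imports of dataloader, model, globalid, and time are hypothetical.

func loadEngagement(l *dataloader.UserEngagementLoader, userID globalid.ID, from, to time.Time) (*model.UserEngagement, error) {
	// Build the composite key, then load through the batching/caching loader.
	key := dataloader.NewIDTimeRangeKey(userID, from, to)
	return l.Load(key)
}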

type InviteLoader

type InviteLoader struct {
	// contains filtered or unexported fields
}

InviteLoader batches and caches requests

func (*InviteLoader) Clear

func (l *InviteLoader) Clear(key globalid.ID)

Clear the value at key from the cache, if it exists

func (*InviteLoader) Load

func (l *InviteLoader) Load(key globalid.ID) (*model.Invite, error)

Load an invite by key; batching and caching will be applied automatically

func (*InviteLoader) LoadAll

func (l *InviteLoader) LoadAll(keys []globalid.ID) ([]*model.Invite, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*InviteLoader) LoadThunk

func (l *InviteLoader) LoadThunk(key globalid.ID) func() (*model.Invite, error)

LoadThunk returns a function that when called will block waiting for an invite. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*InviteLoader) Prime

func (l *InviteLoader) Prime(key globalid.ID, value *model.Invite) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type Loaders

type Loaders struct {
	UserByID             *UserLoader
	AliasByID            *AnonymousAliasLoader
	CardByID             *CardLoader
	InvitesByID          *InviteLoader
	UserEngagementLoader *UserEngagementLoader
	LastActiveAtByID     *DBTimeLoader
}

func NewLoaders

func NewLoaders(store *store.Store, wait time.Duration) Loaders
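
A minimal construction sketch, assuming loaders are built per request; the import paths, the store setup, and the placeholder ID are hypothetical, and treating wait as the window each loader uses to collect keys before issuing a batched fetch is an assumption.

package main

import (
	"time"

	// Hypothetical import paths; substitute this repository's module path.
	"github.com/example/app/dataloader"
	"github.com/example/app/globalid"
	"github.com/example/app/store"
)

func main() {
	var st *store.Store // in practice, a fully configured store
	loaders := dataloader.NewLoaders(st, 2*time.Millisecond)

	var userID globalid.ID // placeholder key
	user, err := loaders.UserByID.Load(userID)
	_, _ = user, err
}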

type ReactionSliceLoader

type ReactionSliceLoader struct {
	// contains filtered or unexported fields
}

ReactionSliceLoader batches and caches requests

func (*ReactionSliceLoader) Clear

func (l *ReactionSliceLoader) Clear(key globalid.ID)

Clear the value at key from the cache, if it exists

func (*ReactionSliceLoader) Load

func (l *ReactionSliceLoader) Load(key globalid.ID) ([]model.Reaction, error)

Load a reaction slice by key; batching and caching will be applied automatically

func (*ReactionSliceLoader) LoadAll

func (l *ReactionSliceLoader) LoadAll(keys []globalid.ID) ([][]model.Reaction, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*ReactionSliceLoader) LoadThunk

func (l *ReactionSliceLoader) LoadThunk(key globalid.ID) func() ([]model.Reaction, error)

LoadThunk returns a function that when called will block waiting for a reaction slice. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*ReactionSliceLoader) Prime

func (l *ReactionSliceLoader) Prime(key globalid.ID, value []model.Reaction) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type UserEngagementLoader

type UserEngagementLoader struct {
	// contains filtered or unexported fields
}

UserEngagementLoader batches and caches requests

func (*UserEngagementLoader) Clear

func (l *UserEngagementLoader) Clear(key IDTimeRangeKey)

Clear the value at key from the cache, if it exists

func (*UserEngagementLoader) Load

func (l *UserEngagementLoader) Load(key IDTimeRangeKey) (*model.UserEngagement, error)

Load a userEngagement by key; batching and caching will be applied automatically

func (*UserEngagementLoader) LoadAll

func (l *UserEngagementLoader) LoadAll(keys []IDTimeRangeKey) ([]*model.UserEngagement, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*UserEngagementLoader) LoadThunk

func (l *UserEngagementLoader) LoadThunk(key IDTimeRangeKey) func() (*model.UserEngagement, error)

LoadThunk returns a function that when called will block waiting for a userEngagement. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*UserEngagementLoader) Prime

func (l *UserEngagementLoader) Prime(key IDTimeRangeKey, value *model.UserEngagement) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)

type UserLoader

type UserLoader struct {
	// contains filtered or unexported fields
}

UserLoader batches and caches requests

func (*UserLoader) Clear

func (l *UserLoader) Clear(key globalid.ID)

Clear the value at key from the cache, if it exists

func (*UserLoader) Load

func (l *UserLoader) Load(key globalid.ID) (*model.User, error)

Load a user by key; batching and caching will be applied automatically

func (*UserLoader) LoadAll

func (l *UserLoader) LoadAll(keys []globalid.ID) ([]*model.User, []error)

LoadAll fetches many keys at once. The keys will be broken into appropriately sized sub-batches depending on how the loader is configured

func (*UserLoader) LoadThunk

func (l *UserLoader) LoadThunk(key globalid.ID) func() (*model.User, error)

LoadThunk returns a function that when called will block waiting for a user. This method should be used if you want one goroutine to make requests to many different data loaders without blocking until the thunk is called.

func (*UserLoader) Prime

func (l *UserLoader) Prime(key globalid.ID, value *model.User) bool

Prime the cache with the provided key and value. If the key already exists, no change is made and false is returned. (To forcefully prime the cache, clear the key first with loader.Clear(key), then call loader.Prime(key, value).)
