package storage

v0.2.2-alpha.0

Published: Apr 24, 2019 License: Apache-2.0 Imports: 27 Imported by: 66

Documentation

Overview

Defines an extensible storage interface. This package registers a "storage" config section that maps to the Config struct. Use NewDataStore(cfg) to initialize a DataStore with the provided config. The package provides default implementations for local, S3 (and Minio), and in-memory storage. Use NewCompositeDataStore to swap any portion of the DataStore interface with an external implementation (e.g. a cached protobuf store). The underlying storage is provided by the extensible "stow" library. You can use NewStowRawStore to create a RawStore backed by any other stow-supported backend (e.g. Azure Blob Storage).
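
A minimal sketch of going from the registered config section to a usable DataStore; this assumes the "storage" section has already been populated by the config loader and follows the in-package style of the example further below:

// Illustrative sketch only.
cfg := GetConfig()
store, err := NewDataStore(cfg, promutils.NewTestScope())
if err != nil {
	// handle initialization failure
}

// The resulting store satisfies both RawStore and ProtobufStore.
_ = store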

Index

Examples

Constants

const (
	KiB int64 = 1024
	MiB int64 = 1024 * KiB
)

Variables

var (
	ConfigSection = config.MustRegisterSection(configSectionKey, defaultConfig)
)
var ErrExceedsLimit = fmt.Errorf("limit exceeded")

Functions

func IsExceedsLimit

func IsExceedsLimit(err error) bool

Gets a value indicating whether the root cause of the error is a "limit exceeded" error.

func IsExists

func IsExists(err error) bool

Gets a value indicating whether the underlying error is an "already exists" error.

func IsNotFound

func IsNotFound(err error) bool

Gets a value indicating whether the underlying error is a Not Found error.
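
A hedged sketch of how these helpers are typically used; store, ctx, and ref are placeholders assumed to already exist:

// Illustrative only: read a reference and classify the possible errors.
if _, err := store.ReadRaw(ctx, ref); err != nil {
	switch {
	case IsNotFound(err):
		// the object does not exist
	case IsExceedsLimit(err):
		// the object is larger than the configured download limit
	}
}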

Types

type CachingConfig

type CachingConfig struct {
	// Maximum size of the cache where the Blob store data is cached in-memory
	// Refer to https://github.com/coocood/freecache to understand how to set the value
	// If not specified or set to 0, cache is not used
	// NOTE: if Object sizes are larger than 1/1024 of the cache size, the entry will not be written to the cache
	// Also refer to https://github.com/coocood/freecache/issues/17 to understand how to set the cache
	MaxSizeMegabytes int `` /* 149-byte string literal not displayed */
	// sets the garbage collection target percentage:
	// a collection is triggered when the ratio of freshly allocated data
	// to live data remaining after the previous collection reaches this percentage.
	// refer to https://golang.org/pkg/runtime/debug/#SetGCPercent
	// If not specified or set to 0, GC percent is not tweaked
	TargetGCPercent int `json:"target_gc_percent" pflag:",Sets the garbage collection target percentage."`
}

type ComposedProtobufStore

type ComposedProtobufStore interface {
	RawStore
	ProtobufStore
}

A ProtobufStore needs a RawStore to fetch the raw data. This interface provides all the necessary components to make protobuf fetching work.

type Config

type Config struct {
	Type          Type             `json:"type" pflag:",Sets the type of storage to configure [s3/minio/local/mem]."`
	Connection    ConnectionConfig `json:"connection"`
	InitContainer string           `json:"container" pflag:",Initial container to create -if it doesn't exist-.'"`
	// Caching is recommended to improve the performance of the underlying systems. It caches metadata and
	// accelerates input resolution. The cache can consume significant memory, so understand how to configure it.
	// TODO provide some default config choices
	// If this section is skipped, Caching is disabled
	Cache  CachingConfig `json:"cache"`
	Limits LimitsConfig  `json:"limits" pflag:",Sets limits for stores."`
}

A common storage config.
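
For illustration, a Config can also be constructed directly in code; the values below are placeholders, not recommendations:

// Illustrative values only.
cfg := &Config{
	Type:          TypeMinio,
	InitContainer: "my-container",
	Connection: ConnectionConfig{
		AuthType:   "accesskey",
		AccessKey:  "minio",
		SecretKey:  "miniostorage",
		Region:     "us-east-1",
		DisableSSL: true,
	},
	Cache: CachingConfig{
		MaxSizeMegabytes: 512,
		TargetGCPercent:  70,
	},
	Limits: LimitsConfig{
		GetLimitMegabytes: 2,
	},
}
_ = cfg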

func GetConfig

func GetConfig() *Config

Retrieves the current global config for storage.

func (Config) GetPFlagSet

func (cfg Config) GetPFlagSet(prefix string) *pflag.FlagSet

GetPFlagSet will return strongly typed pflags for all fields in Config and its nested types. The format of the flags is json-name.json-sub-name... etc.
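
A sketch of merging the generated flags into a pflag.FlagSet; the flag-set name and the "storage" prefix are arbitrary choices for illustration:

// Illustrative only.
flagSet := pflag.NewFlagSet("example", pflag.ExitOnError)
flagSet.AddFlagSet(GetConfig().GetPFlagSet("storage"))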

type ConnectionConfig

type ConnectionConfig struct {
	Endpoint   config.URL `json:"endpoint" pflag:",URL for storage client to connect to."`
	AuthType   string     `json:"auth-type" pflag:",Auth Type to use [iam,accesskey]."`
	AccessKey  string     `json:"access-key" pflag:",Access key to use. Only required when authtype is set to accesskey."`
	SecretKey  string     `json:"secret-key" pflag:",Secret to use when accesskey is set."`
	Region     string     `json:"region" pflag:",Region to connect to."`
	DisableSSL bool       `json:"disable-ssl" pflag:",Disables SSL connection. Should only be used for development."`
}

Defines connection configurations.

type DataReference

type DataReference string

Defines a reference to data location.

func (DataReference) Split

func (r DataReference) Split() (scheme, container, key string, err error)

Splits the data reference into parts.
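
For example, a URL-style reference splits roughly as follows (the output shown is indicative):

ref := DataReference("s3://my-container/path/to/key")
scheme, container, key, err := ref.Split()
if err == nil {
	fmt.Println(scheme, container, key) // expected: s3 my-container path/to/key
}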

func (DataReference) String

func (r DataReference) String() string

type DataStore

type DataStore struct {
	ComposedProtobufStore
	ReferenceConstructor
}

A simplified interface for accessing and storing data in one of the cloud stores. Today we rely on Stow for multi-cloud support, but this interface abstracts that away.

func NewCompositeDataStore

func NewCompositeDataStore(refConstructor ReferenceConstructor, composedProtobufStore ComposedProtobufStore) *DataStore

Composes a new DataStore.
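
A hedged sketch of swapping in a custom store while reusing URL-style reference construction; myCachedStore is a hypothetical type that implements ComposedProtobufStore and is not part of this package:

// Illustrative only: myCachedStore must implement RawStore + ProtobufStore.
var customStore ComposedProtobufStore = myCachedStore{}
store := NewCompositeDataStore(URLPathConstructor{}, customStore)
_ = store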

func NewDataStore

func NewDataStore(cfg *Config, metricsScope promutils.Scope) (s *DataStore, err error)

Creates a new Data Store with the supplied config.

Example
testScope := promutils.NewTestScope()
ctx := context.Background()
fmt.Println("Creating in memory data store.")
store, err := NewDataStore(&Config{
	Type: TypeMemory,
}, testScope.NewSubScope("exp_new"))

if err != nil {
	fmt.Printf("Failed to create data store. Error: %v", err)
}

ref, err := store.ConstructReference(ctx, DataReference("root"), "subkey", "subkey2")
if err != nil {
	fmt.Printf("Failed to construct data reference. Error: %v", err)
}

fmt.Printf("Constructed data reference [%v] and writing data to it.\n", ref)

dataToStore := "hello world"
err = store.WriteRaw(ctx, ref, int64(len(dataToStore)), Options{}, strings.NewReader(dataToStore))
if err != nil {
	fmt.Printf("Failed to write data. Error: %v", err)
}
Output:

Creating in memory data store.
Constructed data reference [/root/subkey/subkey2] and writing data to it.

type DefaultProtobufStore

type DefaultProtobufStore struct {
	RawStore
	// contains filtered or unexported fields
}

Implements ProtobufStore to marshal and unmarshal protobufs to/from a RawStore.

func NewDefaultProtobufStore

func NewDefaultProtobufStore(store RawStore, metricsScope promutils.Scope) DefaultProtobufStore

func (DefaultProtobufStore) ReadProtobuf

func (s DefaultProtobufStore) ReadProtobuf(ctx context.Context, reference DataReference, msg proto.Message) error

func (DefaultProtobufStore) WriteProtobuf

func (s DefaultProtobufStore) WriteProtobuf(ctx context.Context, reference DataReference, opts Options, msg proto.Message) error
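
A sketch of round-tripping a protobuf message; rawStore, scope, ctx, and ref are assumed to exist, msg is any concrete proto.Message, and target is a freshly allocated message of the same type:

// Illustrative only.
pbStore := NewDefaultProtobufStore(rawStore, scope)

if err := pbStore.WriteProtobuf(ctx, ref, Options{}, msg); err != nil {
	// handle write failure
}

if err := pbStore.ReadProtobuf(ctx, ref, target); err != nil {
	// handle read failure
}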

type InMemoryStore

type InMemoryStore struct {
	// contains filtered or unexported fields
}

func (*InMemoryStore) Clear

func (s *InMemoryStore) Clear(ctx context.Context) error

func (InMemoryStore) CopyRaw

func (c InMemoryStore) CopyRaw(ctx context.Context, source, destination DataReference, opts Options) error

A naive implementation of copy that reads all the data locally and then writes it to the destination. TODO: We should upstream an API change to stow to implement copy more natively, e.g. use S3 copy:

https://docs.aws.amazon.com/AmazonS3/latest/dev/CopyingObjectUsingREST.html

func (*InMemoryStore) GetBaseContainerFQN

func (s *InMemoryStore) GetBaseContainerFQN(ctx context.Context) DataReference

func (*InMemoryStore) Head

func (s *InMemoryStore) Head(ctx context.Context, reference DataReference) (Metadata, error)

func (*InMemoryStore) ReadRaw

func (s *InMemoryStore) ReadRaw(ctx context.Context, reference DataReference) (io.ReadCloser, error)

func (*InMemoryStore) WriteRaw

func (s *InMemoryStore) WriteRaw(ctx context.Context, reference DataReference, size int64, opts Options, raw io.Reader) (
	err error)

type LimitsConfig

type LimitsConfig struct {
	GetLimitMegabytes int64 `json:"maxDownloadMBs" pflag:",Maximum allowed download size (in MBs) per call."`
}

Specifies limits for the storage package.

type MemoryMetadata

type MemoryMetadata struct {
	// contains filtered or unexported fields
}

func (MemoryMetadata) Exists

func (m MemoryMetadata) Exists() bool

func (MemoryMetadata) Size

func (m MemoryMetadata) Size() int64

type Metadata

type Metadata interface {
	Exists() bool
	Size() int64
}

Placeholder for data reference metadata.

type Options

type Options struct {
	Metadata map[string]interface{}
}

Holder for recording storage options. It is used to pass Metadata (like headers for S3) as well as tags or labels for objects.
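
For example, metadata can be attached to a write as sketched below; the key shown is a placeholder, and whether a given key is honored depends on the configured backend; data, store, ctx, and ref are assumed to exist:

// Illustrative only: data is a []byte.
opts := Options{
	Metadata: map[string]interface{}{
		"owner": "some-team",
	},
}
err := store.WriteRaw(ctx, ref, int64(len(data)), opts, bytes.NewReader(data))
_ = err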

type ProtobufStore

type ProtobufStore interface {
	// Retrieves the entire blob from blobstore and unmarshals it to the passed protobuf
	ReadProtobuf(ctx context.Context, reference DataReference, msg proto.Message) error

	// Serializes and stores the protobuf.
	WriteProtobuf(ctx context.Context, reference DataReference, opts Options, msg proto.Message) error
}

Defines an interface for reading and writing protobuf messages.

type RawStore

type RawStore interface {
	// returns a FQN DataReference with the configured base init container
	GetBaseContainerFQN(ctx context.Context) DataReference

	// Gets metadata about the reference. This should generally be a lightweight operation.
	Head(ctx context.Context, reference DataReference) (Metadata, error)

	// Retrieves a byte array from the Blob store or an error
	ReadRaw(ctx context.Context, reference DataReference) (io.ReadCloser, error)

	// Stores a raw byte array.
	WriteRaw(ctx context.Context, reference DataReference, size int64, opts Options, raw io.Reader) error

	// Copies from source to destination.
	CopyRaw(ctx context.Context, source, destination DataReference, opts Options) error
}

Defines a low-level interface for accessing and storing bytes.

func NewInMemoryRawStore

func NewInMemoryRawStore(_ *Config, scope promutils.Scope) (RawStore, error)

type ReferenceConstructor

type ReferenceConstructor interface {
	// Creates a new dataReference that matches the storage structure.
	ConstructReference(ctx context.Context, reference DataReference, nestedKeys ...string) (DataReference, error)
}

Defines an interface for building data reference paths.

type StowMetadata

type StowMetadata struct {
	// contains filtered or unexported fields
}

func (StowMetadata) Exists

func (s StowMetadata) Exists() bool

func (StowMetadata) Size

func (s StowMetadata) Size() int64

type StowStore

type StowStore struct {
	stow.Container
	// contains filtered or unexported fields
}

Implements a RawStore that talks to a stow location store.

func NewStowRawStore

func NewStowRawStore(containerBaseFQN DataReference, container stow.Container, metricsScope promutils.Scope) (*StowStore, error)
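
A hedged sketch, assuming container is an already-initialized stow.Container (for example obtained via stow.Dial from the stow library) and the base FQN points at it:

// Illustrative only.
rawStore, err := NewStowRawStore(DataReference("azure://my-container"), container, promutils.NewTestScope())
if err != nil {
	// handle construction failure
}
_ = rawStore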

func (StowStore) CopyRaw

func (c StowStore) CopyRaw(ctx context.Context, source, destination DataReference, opts Options) error

A naive implementation of copy that reads all the data locally and then writes it to the destination. TODO: We should upstream an API change to stow to implement copy more natively, e.g. use S3 copy:

https://docs.aws.amazon.com/AmazonS3/latest/dev/CopyingObjectUsingREST.html

func (*StowStore) GetBaseContainerFQN

func (s *StowStore) GetBaseContainerFQN(ctx context.Context) DataReference

func (*StowStore) Head

func (s *StowStore) Head(ctx context.Context, reference DataReference) (Metadata, error)

func (*StowStore) ReadRaw

func (s *StowStore) ReadRaw(ctx context.Context, reference DataReference) (io.ReadCloser, error)

func (*StowStore) WriteRaw

func (s *StowStore) WriteRaw(ctx context.Context, reference DataReference, size int64, opts Options, raw io.Reader) error

type Type

type Type = string

Defines the storage config type.

const (
	TypeMemory Type = "mem"
	TypeS3     Type = "s3"
	TypeLocal  Type = "local"
	TypeMinio  Type = "minio"
)

type URLPathConstructor

type URLPathConstructor struct {
}

Implements ReferenceConstructor that assumes paths are URL-compatible.

func (URLPathConstructor) ConstructReference

func (URLPathConstructor) ConstructReference(ctx context.Context, reference DataReference, nestedKeys ...string) (DataReference, error)
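
For example (the joined output shown is indicative of URL-style path construction, not an exact guarantee):

ctx := context.Background()
ref, err := URLPathConstructor{}.ConstructReference(ctx, DataReference("s3://my-container/base"), "nested", "key")
if err == nil {
	fmt.Println(ref) // expected: s3://my-container/base/nested/key
}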
