s3store

package
v0.26.0

Published: Jun 28, 2020 License: MIT

README

Amazon S3 blobstore

Configuration

See https://docs.aws.amazon.com/sdk-for-go/v1/developer-guide/configuring-sdk.html#specifying-credentials for how to set up authentication against S3.
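
As a quick sanity check before enabling archival, the Go sketch below (not part of this package; the region is only an example value) verifies that the AWS SDK's default credential chain can resolve credentials.

package main

import (
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
)

func main() {
	// The SDK resolves credentials from the standard chain
	// (environment variables, shared credentials file, IAM role, ...).
	sess, err := session.NewSession(&aws.Config{Region: aws.String("us-east-1")})
	if err != nil {
		log.Fatal(err)
	}
	creds, err := sess.Config.Credentials.Get()
	if err != nil {
		log.Fatal("no AWS credentials resolved: ", err)
	}
	fmt.Println("credentials resolved via provider:", creds.ProviderName)
}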

Enable archival by using the configuration below. The region and bucket URI are required.

archival:
  history:
    status: "enabled"
    enableRead: true
    provider:
      s3store:
        region: "us-east-1"
  visibility:
    status: "enabled"
    enableRead: true
    provider:
      s3store:
        region: "us-east-1"

namespaceDefaults:
  archival:
    history:
      status: "enabled"
      URI: "s3://<bucket-name>"
    visibility:
      status: "enabled"
      URI: "s3://<bucket-name>"

Visibility query syntax

You can query the visibility store by using the tctl workflow listarchived command.

The query syntax is based on SQL.

Supported column names are:

  • WorkflowID: String
  • WorkflowTypeName: String
  • StartTime: Date
  • CloseTime: Date
  • SearchPrecision: String (one of Day, Hour, Minute, Second)

WorkflowID or WorkflowTypeName is required. If filtering by date, use StartTime or CloseTime in combination with SearchPrecision.

Searches are performed against times in the UTC timezone.

SearchPrecision specifies the granularity of the time range to search. For example, with SearchPrecision = 'Day' and a StartTime or CloseTime of 2020-01-21T00:00:00Z, all records from 2020-01-21T00:00:00Z through 2020-01-21T23:59:59Z are searched.

Limitations

  • The only supported operator is = due to how records are stored in S3.

Example

Searches for all records started on 2020-01-21 with the specified workflow ID:

./tctl --ns samples-namespace workflow listarchived -q "StartTime = '2020-01-21T00:00:00Z' AND WorkflowID='workflow-id' AND SearchPrecision='Day'"

Storage in S3

Workflow runs are stored in S3 using the following structure:

s3://<bucket-name>/<namespace-id>/
    history/<workflow-id>/<run-id>
    visibility/
        workflowTypeName/<workflow-type-name>/
            startTimeout/2020-01-21T16:16:11Z/<run-id>
            closeTimeout/2020-01-21T16:16:11Z/<run-id>
        workflowID/<workflow-id>/
            startTimeout/2020-01-21T16:16:11Z/<run-id>
            closeTimeout/2020-01-21T16:16:11Z/<run-id>
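
For example, the archived history keys for a namespace can be listed with the AWS SDK by following the prefix layout above (a sketch; the bucket name and namespace ID are placeholders):

package main

import (
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3"
)

func main() {
	sess, err := session.NewSession(&aws.Config{Region: aws.String("us-east-1")})
	if err != nil {
		log.Fatal(err)
	}
	svc := s3.New(sess)

	// Archived histories live under <namespace-id>/history/.
	// Replace the bucket and namespace ID with real values.
	out, err := svc.ListObjectsV2(&s3.ListObjectsV2Input{
		Bucket: aws.String("<bucket-name>"),
		Prefix: aws.String("<namespace-id>/history/"),
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, obj := range out.Contents {
		fmt.Println(aws.StringValue(obj.Key))
	}
}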

Using localstack for local development

  1. Install awscli (https://aws.amazon.com/cli/)
  2. Install localstack (https://github.com/localstack/localstack)
  3. Launch localstack with SERVICES=s3 localstack start
  4. Create a bucket using aws --endpoint-url=http://localhost:4572 s3 mb s3://temporal-development
  5. Configure archival and namespaceDefaults with the following configuration:

archival:
  history:
    status: "enabled"
    enableRead: true
    provider:
      s3store:
        region: "us-east-1"
        endpoint: "http://127.0.0.1:4572"
        s3ForcePathStyle: true
  visibility:
    status: "enabled"
    enableRead: true
    provider:
      s3store:
        region: "us-east-1"
        endpoint: "http://127.0.0.1:4572"
        s3ForcePathStyle: true

namespaceDefaults:
  archival:
    history:
      status: "enabled"
      URI: "s3://temporal-development"
    visibility:
      status: "enabled"
      URI: "s3://temporal-development"

Documentation

Overview

Package s3store is a generated GoMock package.

Constants

const (
	WorkflowTypeName = "WorkflowTypeName"
	WorkflowID       = "WorkflowId"
	StartTime        = "StartTime"
	CloseTime        = "CloseTime"
	SearchPrecision  = "SearchPrecision"
)

All allowed fields for filtering

const (
	PrecisionDay    = "Day"
	PrecisionHour   = "Hour"
	PrecisionMinute = "Minute"
	PrecisionSecond = "Second"
)

Precision specific values

const (
	// URIScheme is the scheme for the s3 implementation
	URIScheme = "s3"
)

Variables

This section is empty.

Functions

func NewHistoryArchiver

func NewHistoryArchiver(
	container *archiver.HistoryBootstrapContainer,
	config *config.S3Archiver,
) (archiver.HistoryArchiver, error)

NewHistoryArchiver creates a new archiver.HistoryArchiver based on s3

func NewVisibilityArchiver

func NewVisibilityArchiver(
	container *archiver.VisibilityBootstrapContainer,
	config *config.S3Archiver,
) (archiver.VisibilityArchiver, error)

NewVisibilityArchiver creates a new archiver.VisibilityArchiver based on s3
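
A minimal wiring sketch for both constructors follows. The import paths are assumptions based on this package's layout and may differ between server versions; the bootstrap containers and S3 archiver config are normally supplied by the Temporal server.

package example

import (
	// Import paths are assumptions; use the paths from your Temporal
	// server version for the archiver, config, and s3store packages.
	"github.com/temporalio/temporal/common/archiver"
	"github.com/temporalio/temporal/common/archiver/s3store"
	"github.com/temporalio/temporal/common/service/config"
)

// newS3Archivers wires both archivers from bootstrap containers and an S3
// archiver config, all of which are normally supplied by the server.
func newS3Archivers(
	historyContainer *archiver.HistoryBootstrapContainer,
	visibilityContainer *archiver.VisibilityBootstrapContainer,
	cfg *config.S3Archiver,
) (archiver.HistoryArchiver, archiver.VisibilityArchiver, error) {
	historyArchiver, err := s3store.NewHistoryArchiver(historyContainer, cfg)
	if err != nil {
		return nil, nil, err
	}
	visibilityArchiver, err := s3store.NewVisibilityArchiver(visibilityContainer, cfg)
	if err != nil {
		return nil, nil, err
	}
	return historyArchiver, visibilityArchiver, nil
}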

Types

type MockQueryParser

type MockQueryParser struct {
	// contains filtered or unexported fields
}

MockQueryParser is a mock of QueryParser interface.

func NewMockQueryParser

func NewMockQueryParser(ctrl *gomock.Controller) *MockQueryParser

NewMockQueryParser creates a new mock instance.

func (*MockQueryParser) EXPECT

func (m *MockQueryParser) EXPECT() *MockQueryParserMockRecorder

EXPECT returns an object that allows the caller to indicate expected use.

func (*MockQueryParser) Parse

func (m *MockQueryParser) Parse(query string) (*parsedQuery, error)

Parse mocks base method.

type MockQueryParserMockRecorder

type MockQueryParserMockRecorder struct {
	// contains filtered or unexported fields
}

MockQueryParserMockRecorder is the mock recorder for MockQueryParser.

func (*MockQueryParserMockRecorder) Parse

func (mr *MockQueryParserMockRecorder) Parse(query interface{}) *gomock.Call

Parse indicates an expected call of Parse.

type QueryParser

type QueryParser interface {
	Parse(query string) (*parsedQuery, error)
}

QueryParser parses a limited SQL where clause into a struct

func NewQueryParser

func NewQueryParser() QueryParser

NewQueryParser creates a new query parser for s3store
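
Since Parse returns the unexported *parsedQuery type, the mock is meant for tests inside this package. Below is a sketch of standard gomock usage, assuming it lives in an in-package _test.go file.

package s3store

import (
	"testing"

	"github.com/golang/mock/gomock"
)

// TestQueryParserMock shows standard gomock usage for MockQueryParser.
func TestQueryParserMock(t *testing.T) {
	ctrl := gomock.NewController(t)
	defer ctrl.Finish()

	parser := NewMockQueryParser(ctrl)
	// Expect exactly one Parse call with this query and stub a nil result.
	parser.EXPECT().
		Parse("WorkflowID = 'workflow-id' AND SearchPrecision = 'Day'").
		Return(nil, nil)

	if _, err := parser.Parse("WorkflowID = 'workflow-id' AND SearchPrecision = 'Day'"); err != nil {
		t.Fatal(err)
	}
}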
