pseudonymization

package
Published: Dec 19, 2023 License: Apache-2.0 Imports: 13 Imported by: 0

Documentation


Constants

This section is empty.

Variables

var ErrDataTypeMismatch = errors.New("requested TokenType not match stored TokenType")

ErrDataTypeMismatch indicates that the requested TokenType does not match the serialized TokenType of the stored value.

var ErrGenerationRandomValue = errors.New("can't generate new random value, try count exceed")

ErrGenerationRandomValue is returned when a new random value that has not been generated before cannot be produced and the number of generation attempts is exceeded.
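Both variables are sentinel errors created with errors.New, so callers can detect them with errors.Is. A minimal, hypothetical sketch (the helper name and the import path github.com/cossacklabs/acra/pseudonymization are assumptions, not part of this documentation):

import (
	"errors"
	"fmt"

	"github.com/cossacklabs/acra/pseudonymization"
)

// describeTokenizationError is a hypothetical helper showing how a caller
// might classify errors returned by tokenization operations in this package.
func describeTokenizationError(err error) string {
	switch {
	case errors.Is(err, pseudonymization.ErrDataTypeMismatch):
		return "requested TokenType does not match the stored TokenType"
	case errors.Is(err, pseudonymization.ErrGenerationRandomValue):
		return "exceeded attempts to generate a new unique random value"
	default:
		return fmt.Sprintf("unrelated error: %v", err)
	}
}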

Functions

func NewPseudoanonymizer

func NewPseudoanonymizer(storage common.TokenStorage) (common.Pseudoanonymizer, error)

NewPseudoanonymizer creates, initializes and returns a new instance of Pseudoanonymizer.
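A minimal wiring sketch, assuming the caller already has some common.TokenStorage implementation available; how that storage is constructed is outside this package, and the helper name is hypothetical:

// newAnonymizer wraps NewPseudoanonymizer; storage is any common.TokenStorage
// implementation obtained elsewhere in the application.
func newAnonymizer(storage common.TokenStorage) (common.Pseudoanonymizer, error) {
	return pseudonymization.NewPseudoanonymizer(storage)
}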

Types

type DataTokenizer

type DataTokenizer struct {
	// contains filtered or unexported fields
}

DataTokenizer tokenizes and detokenizes data buffers.

func NewDataTokenizer

func NewDataTokenizer(tokenizer common.Pseudoanonymizer) (*DataTokenizer, error)

NewDataTokenizer makes a new data buffer tokenizer based on provided pseudoanonymizer.

func (*DataTokenizer) Detokenize

func (t *DataTokenizer) Detokenize(data []byte, context common.TokenContext, setting config.ColumnEncryptionSetting) ([]byte, error)

Detokenize the data in given context with provided settings.

func (*DataTokenizer) Tokenize

func (t *DataTokenizer) Tokenize(data []byte, context common.TokenContext, setting config.ColumnEncryptionSetting) ([]byte, error)

Tokenize the data in given context with provided settings.
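A hedged round-trip sketch, assuming the common.TokenContext and config.ColumnEncryptionSetting values come from the caller's encryptor configuration; the helper name and sample value are illustrative, and standard-library imports ("bytes", "errors") are omitted:

// roundTrip tokenizes a value and then detokenizes it again with the same
// context and setting, which should restore the original bytes.
func roundTrip(anonymizer common.Pseudoanonymizer, tokenCtx common.TokenContext, setting config.ColumnEncryptionSetting) error {
	tokenizer, err := pseudonymization.NewDataTokenizer(anonymizer)
	if err != nil {
		return err
	}

	tokenized, err := tokenizer.Tokenize([]byte("sensitive value"), tokenCtx, setting)
	if err != nil {
		return err
	}

	original, err := tokenizer.Detokenize(tokenized, tokenCtx, setting)
	if err != nil {
		return err
	}
	if !bytes.Equal(original, []byte("sensitive value")) {
		return errors.New("unexpected detokenization result")
	}
	return nil
}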

type TokenEncryptor

type TokenEncryptor struct {
	// contains filtered or unexported fields
}

TokenEncryptor adds a hash prefix to an AcraStruct generated with encryptor.AcrawriterDataEncryptor.

func NewTokenEncryptor

func NewTokenEncryptor(tokenizer *DataTokenizer) (*TokenEncryptor, error)

NewTokenEncryptor returns a new TokenEncryptor.

func (*TokenEncryptor) EncryptWithClientID

func (e *TokenEncryptor) EncryptWithClientID(clientID, data []byte, setting configCE.ColumnEncryptionSetting) ([]byte, error)

EncryptWithClientID tokenizes data according to the setting.
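A sketch of running a single value through TokenEncryptor, assuming the configCE.ColumnEncryptionSetting comes from the encryptor config for the target column; the helper name is hypothetical and imports follow the same assumed paths as above:

// tokenizeForClient builds a TokenEncryptor around an existing DataTokenizer
// and tokenizes data in the scope of the given clientID.
func tokenizeForClient(tokenizer *pseudonymization.DataTokenizer, clientID, data []byte, setting configCE.ColumnEncryptionSetting) ([]byte, error) {
	encryptor, err := pseudonymization.NewTokenEncryptor(tokenizer)
	if err != nil {
		return nil, err
	}
	return encryptor.EncryptWithClientID(clientID, data, setting)
}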

type TokenProcessor

type TokenProcessor struct {
	// contains filtered or unexported fields
}

TokenProcessor implements a processor that tokenizes/detokenizes data for acra-server; it is used in the decryptor module.

func NewTokenProcessor

func NewTokenProcessor(tokenizer *DataTokenizer) (*TokenProcessor, error)

NewTokenProcessor returns a new processor.

func (*TokenProcessor) ID

func (p *TokenProcessor) ID() string

ID returns the name of the processor.

func (*TokenProcessor) OnColumn

func (p *TokenProcessor) OnColumn(ctx context.Context, data []byte) (context.Context, []byte, error)

OnColumn tokenizes data if configured by the encryptor config.
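A sketch of the OnColumn call shape, assuming ctx already carries the per-column information that acra-server's decryptor pipeline normally supplies; the helper name is hypothetical and standard-library imports ("context", "log") are omitted:

// processColumn builds a TokenProcessor around an existing DataTokenizer
// and runs one column value through it.
func processColumn(ctx context.Context, tokenizer *pseudonymization.DataTokenizer, value []byte) (context.Context, []byte, error) {
	processor, err := pseudonymization.NewTokenProcessor(tokenizer)
	if err != nil {
		return ctx, nil, err
	}
	log.Printf("using processor %q", processor.ID())
	return processor.OnColumn(ctx, value)
}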

type TokenizeQuery

type TokenizeQuery struct {
	// contains filtered or unexported fields
}

TokenizeQuery replaces tokenized data inside AcraStructs/AcraBlocks and changes WHERE conditions to support searchable tokenization.

func NewMySQLTokenizeQuery

func NewMySQLTokenizeQuery(schemaStore config.TableSchemaStore, tokenEncryptor *TokenEncryptor) *TokenizeQuery

NewMySQLTokenizeQuery returns a TokenizeQuery with a coder for MySQL.

func NewPostgresqlTokenizeQuery

func NewPostgresqlTokenizeQuery(schemaStore config.TableSchemaStore, tokenEncryptor *TokenEncryptor) *TokenizeQuery

NewPostgresqlTokenizeQuery returns a TokenizeQuery with a coder for PostgreSQL.
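A construction sketch, assuming schemaStore and tokenEncryptor are already set up elsewhere; the choice of constructor depends on the protocol of the proxied database, and the helper name is hypothetical:

// newTokenizeQueryObserver picks the SQL coder by database type; both
// constructors return a ready-to-use *TokenizeQuery.
func newTokenizeQueryObserver(useMySQL bool, schemaStore config.TableSchemaStore, tokenEncryptor *pseudonymization.TokenEncryptor) *pseudonymization.TokenizeQuery {
	if useMySQL {
		return pseudonymization.NewMySQLTokenizeQuery(schemaStore, tokenEncryptor)
	}
	return pseudonymization.NewPostgresqlTokenizeQuery(schemaStore, tokenEncryptor)
}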

func (*TokenizeQuery) ID

func (encryptor *TokenizeQuery) ID() string

ID returns the name of this QueryObserver.

func (*TokenizeQuery) OnBind

func (encryptor *TokenizeQuery) OnBind(ctx context.Context, statement sqlparser.Statement, values []base.BoundValue) ([]base.BoundValue, bool, error)

OnBind processes bound values for prepared statements.

Searchable tokenization rewrites WHERE clauses with equality comparisons like this:

WHERE column = 'value'   ===>   WHERE column = tokenize('value')

If the query is a parameterized prepared query then OnQuery() rewriting yields this:

WHERE column = $1        ===>   WHERE column = tokenize($1)

and actual "value" is passed via parameters, visible here in OnBind().

func (*TokenizeQuery) OnQuery

func (encryptor *TokenizeQuery) OnQuery(ctx context.Context, query base.OnQueryObject) (base.OnQueryObject, bool, error)

OnQuery processes query text before the database sees it.

Tokenized searchable encryption rewrites WHERE clauses with equality comparisons like this:

WHERE column = 'value'   ===>   WHERE column = tokenize('value')

If the query is a parameterized prepared query then OnQuery() rewriting yields this:

WHERE column = $1        ===>   WHERE column = tokenize($1)

and actual "value" is passed via parameters later. See OnBind() for details.
