Package obfuscate

Version: v0.0.0-...-4646cf5
Published: Oct 16, 2020 License: Apache-2.0 Imports: 17 Imported by: 1

Documentation

Overview

Package obfuscate implements quantizing and obfuscating of tags and resources for a set of spans matching certain criteria.

Index

Constants

const (
	LexError = TokenKind(57346) + iota

	ID
	Limit
	Null
	String
	DoubleQuotedString
	Number
	BooleanLiteral
	ValueArg
	ListArg
	Comment
	Variable
	Savepoint
	PreparedStatement
	EscapeSequence
	NullSafeEqual
	LE
	GE
	NE
	As
	From
	Update
	Insert
	Into
	Join
	ColonCast

	// FilteredGroupable specifies that the given token has been discarded by one of the
	// token filters and that it is groupable together with consecutive FilteredGroupable
	// tokens.
	FilteredGroupable

	// Filtered specifies that the token is a comma and was discarded by one
	// of the filters.
	Filtered

	// FilteredBracketedIdentifier specifies that we are currently discarding
	// a bracketed identifier (MSSQL).
	// See issue https://github.com/DataDog/datadog-trace-agent/issues/475.
	FilteredBracketedIdentifier
)

List of available tokens; this list has been reduced because we don't need a full-fledged tokenizer to implement a Lexer.

const EOFChar = unicode.MaxRune + 1

EOFChar is used to signal that no more characters were found by the scanner. It is an invalid rune value that cannot appear in any valid string.

Variables

This section is empty.

Functions

This section is empty.

Types

type ObfuscatedQuery

type ObfuscatedQuery struct {
	Query     string // the obfuscated SQL query
	TablesCSV string // comma-separated list of tables that the query addresses
}

ObfuscatedQuery specifies information about an obfuscated SQL query.

func (*ObfuscatedQuery) Cost

func (oq *ObfuscatedQuery) Cost() int64

Cost returns the number of bytes needed to store all the fields of this ObfuscatedQuery.

type Obfuscator

type Obfuscator struct {
	// contains filtered or unexported fields
}

Obfuscator quantizes and obfuscates spans. The obfuscator is not safe for concurrent use.

func NewObfuscator

func NewObfuscator(cfg *configdefs.ObfuscationConfig) *Obfuscator

NewObfuscator creates a new Obfuscator from the provided configuration.

func (*Obfuscator) Obfuscate

func (o *Obfuscator) Obfuscate(span *pb.Span)

Obfuscate may obfuscate the span's properties based on its type and on the Obfuscator's configuration.
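
A minimal sketch of span-level usage, assuming the trace-agent's pb.Span carries Type and Resource fields and that NewObfuscator accepts a nil config for default behavior; imports are omitted because the module's import paths are not shown here:

o := obfuscate.NewObfuscator(nil) // assumption: a nil config falls back to defaults
defer o.Stop()

// Assumed pb.Span fields (Type, Resource); the span is modified in place.
span := &pb.Span{
	Type:     "sql",
	Resource: "SELECT name FROM users WHERE id = 42",
}
o.Obfuscate(span)
// span.Resource now holds the obfuscated query, e.g. "SELECT name FROM users WHERE id = ?"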

func (*Obfuscator) ObfuscateSQLString

func (o *Obfuscator) ObfuscateSQLString(in string) (*ObfuscatedQuery, error)

ObfuscateSQLString quantizes and obfuscates the given input SQL query string. Quantization removes some elements such as comments and aliases, and obfuscation attempts to hide sensitive information in strings and numbers by redacting them.
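
A usage sketch for string-level SQL obfuscation; the outputs in the comments are illustrative, assuming default redaction behavior and that a nil config yields defaults:

o := obfuscate.NewObfuscator(nil) // assumption: a nil config falls back to defaults
defer o.Stop()

oq, err := o.ObfuscateSQLString("SELECT name FROM users WHERE id = 42 -- lookup")
if err != nil {
	log.Fatal(err) // the input could not be tokenized or obfuscated
}
fmt.Println(oq.Query)     // e.g. SELECT name FROM users WHERE id = ?
fmt.Println(oq.TablesCSV) // e.g. users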

func (*Obfuscator) SQLLiteralEscapes

func (o *Obfuscator) SQLLiteralEscapes() bool

SQLLiteralEscapes reports whether escape characters should be treated literally by the SQL obfuscator.

func (*Obfuscator) SetSQLLiteralEscapes

func (o *Obfuscator) SetSQLLiteralEscapes(ok bool)

SetSQLLiteralEscapes sets whether or not escape characters should be treated literally by the SQL obfuscator.

func (*Obfuscator) Stop

func (o *Obfuscator) Stop()

Stop cleans up after a finished Obfuscator.

type SQLTokenizer

type SQLTokenizer struct {
	// contains filtered or unexported fields
}

SQLTokenizer is the struct used to generate SQL tokens for the parser.

func NewSQLTokenizer

func NewSQLTokenizer(sql string, literalEscapes bool) *SQLTokenizer

NewSQLTokenizer creates a new SQLTokenizer for the given SQL string. The literalEscapes argument specifies whether escape characters should be treated literally or as escape sequences.

func (*SQLTokenizer) Err

func (tkn *SQLTokenizer) Err() error

Err returns the last error that the tokenizer encountered, or nil.

func (*SQLTokenizer) Reset

func (tkn *SQLTokenizer) Reset(in string)

Reset resets the underlying buffer and positions.

func (*SQLTokenizer) Scan

func (tkn *SQLTokenizer) Scan() (TokenKind, []byte)

Scan scans the tokenizer for the next token and returns the token type and the token buffer.
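
A sketch of a typical scan loop; it assumes, per the constant descriptions above, that Scan reports end of input by returning EOFChar and signals errors with LexError:

tkn := obfuscate.NewSQLTokenizer("SELECT * FROM users WHERE id = 1", false)
for {
	kind, buf := tkn.Scan()
	if kind == obfuscate.EOFChar {
		break // no more characters to scan
	}
	if kind == obfuscate.LexError {
		fmt.Println("lex error:", tkn.Err())
		break
	}
	fmt.Printf("%d %q\n", kind, buf)
}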

func (*SQLTokenizer) SeenEscape

func (tkn *SQLTokenizer) SeenEscape() bool

SeenEscape reports whether this tokenizer has seen an escape character within a scanned string.

type SyntaxError

type SyntaxError struct {
	Offset int64 // error occurred after reading Offset bytes
	// contains filtered or unexported fields
}

A SyntaxError is a description of a JSON syntax error.

func (*SyntaxError) Error

func (e *SyntaxError) Error() string

type TokenKind

type TokenKind uint32

TokenKind specifies the type of the token being scanned. It may be one of the defined constants below or in some cases the actual rune itself.
