obfuscate

package
v0.0.0-...-3257545
Published: Apr 15, 2021 License: Apache-2.0 Imports: 17 Imported by: 0

Documentation

Overview

Package obfuscate implements quantizing and obfuscating of tags and resources for a set of spans matching a certain criteria.

Constants

const (
	LexError = TokenKind(57346) + iota

	ID
	Limit
	Null
	String
	DoubleQuotedString
	Number
	BooleanLiteral
	ValueArg
	ListArg
	Comment
	Variable
	Savepoint
	PreparedStatement
	EscapeSequence
	NullSafeEqual
	LE
	GE
	NE
	As
	From
	Update
	Insert
	Into
	Join
	TableName
	ColonCast

	// FilteredGroupable specifies that the given token has been discarded by one of the
	// token filters and that it is groupable together with consecutive FilteredGroupable
	// tokens.
	FilteredGroupable

	// FilteredGroupableParenthesis is a parenthesis marked as filtered groupable. It is the
	// beginning of either a group of values ('(') or a nested query. We track it as
	// a special case for when it may start a nested query as opposed to just another
	// value group to be obfuscated.
	FilteredGroupableParenthesis

	// Filtered specifies that the token is a comma and was discarded by one
	// of the filters.
	Filtered

	// FilteredBracketedIdentifier specifies that we are currently discarding
	// a bracketed identifier (MSSQL).
	// See issue https://github.com/DataDog/datadog-trace-agent/issues/475.
	FilteredBracketedIdentifier
)

List of available tokens; this list has been reduced because we don't need a full-fledged tokenizer to implement a lexer.

const EndChar = unicode.MaxRune + 1

EndChar is used to signal that the scanner has finished reading the query. This happens when there are no more characters left in the query or when invalid encoding is discovered. EndChar is an invalid rune value that cannot be found in any valid string.

Variables

This section is empty.

Functions

This section is empty.

Types

type ObfuscatedQuery

type ObfuscatedQuery struct {
	Query     string // the obfuscated SQL query
	TablesCSV string // comma-separated list of tables that the query addresses
}

ObfuscatedQuery specifies information about an obfuscated SQL query.

func (*ObfuscatedQuery) Cost

func (oq *ObfuscatedQuery) Cost() int64

Cost returns the number of bytes needed to store all the fields of this ObfuscatedQuery.

type Obfuscator

type Obfuscator struct {
	// contains filtered or unexported fields
}

Obfuscator quantizes and obfuscates spans. The obfuscator is not safe for concurrent use.

func NewObfuscator

NewObfuscator creates a new obfuscator.

func (*Obfuscator) Obfuscate

func (o *Obfuscator) Obfuscate(span *pb.Span)

Obfuscate may obfuscate the span's properties based on its type and on the Obfuscator's configuration.
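
As a rough usage sketch (not taken from this documentation): the import paths below and the NewObfuscator argument are assumptions, since the constructor's signature is not shown above, and pb.Span is assumed to expose Type and Resource fields. Keep in mind that a single Obfuscator must not be shared across goroutines.

package main

import (
	"fmt"

	"github.com/DataDog/datadog-agent/pkg/trace/obfuscate" // assumed import path
	"github.com/DataDog/datadog-agent/pkg/trace/pb"        // assumed import path
)

func main() {
	// Assumption: NewObfuscator takes a configuration pointer and treats nil as defaults.
	o := obfuscate.NewObfuscator(nil)
	defer o.Stop() // Stop cleans up once the Obfuscator is no longer needed.

	span := &pb.Span{
		Type:     "sql",
		Resource: "SELECT name FROM users WHERE id = 42",
	}
	// The span is modified in place according to its Type.
	o.Obfuscate(span)
	fmt.Println(span.Resource) // expected along the lines of: SELECT name FROM users WHERE id = ?
}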

func (*Obfuscator) ObfuscateSQLExecPlan

func (o *Obfuscator) ObfuscateSQLExecPlan(jsonPlan string, normalize bool) (string, error)

ObfuscateSQLExecPlan obfuscates query conditions in the provided JSON-encoded execution plan. If normalize is true, cost and row estimates are also obfuscated away.
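
For illustration, continuing with the Obfuscator o from the sketch under Obfuscate above; the JSON below imitates a PostgreSQL EXPLAIN (FORMAT JSON) plan, and exactly which keys get redacted is an assumption rather than documented behavior.

plan := `{"Plan": {"Node Type": "Seq Scan", "Relation Name": "users", "Filter": "(id = 42)", "Total Cost": 4.51}}`

// normalize=true additionally obfuscates cost and row estimates.
obfuscated, err := o.ObfuscateSQLExecPlan(plan, true)
if err != nil {
	fmt.Println("could not obfuscate plan:", err)
} else {
	fmt.Println(obfuscated)
}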

func (*Obfuscator) ObfuscateSQLString

func (o *Obfuscator) ObfuscateSQLString(in string) (*ObfuscatedQuery, error)

ObfuscateSQLString quantizes and obfuscates the given input SQL query string. Quantization removes some elements such as comments and aliases, and obfuscation attempts to hide sensitive information in strings and numbers by redacting them.
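
A short sketch using the same Obfuscator o; the redacted output shown in the comments is what one would expect from the description above, not a verbatim result, and the snippet also exercises ObfuscatedQuery.Cost from earlier on this page.

oq, err := o.ObfuscateSQLString("SELECT name FROM users WHERE email = 'bob@example.com' AND id = 42")
if err != nil {
	fmt.Println("could not obfuscate query:", err)
	return
}
fmt.Println(oq.Query)     // expected along the lines of: SELECT name FROM users WHERE email = ? AND id = ?
fmt.Println(oq.TablesCSV) // expected: users
fmt.Println(oq.Cost())    // number of bytes needed to store Query and TablesCSV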

func (*Obfuscator) ObfuscateStatsGroup

func (o *Obfuscator) ObfuscateStatsGroup(b *pb.ClientGroupedStats)

ObfuscateStatsGroup obfuscates the given stats bucket group.
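
A brief sketch, assuming pb.ClientGroupedStats carries Resource and Type fields analogous to pb.Span; those field names are an assumption, as the struct is not documented here.

group := &pb.ClientGroupedStats{
	Type:     "sql", // assumed field
	Resource: "SELECT * FROM orders WHERE user_id = 7", // assumed field
}
o.ObfuscateStatsGroup(group)
// group.Resource is expected to now hold the obfuscated form of the query.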

func (*Obfuscator) QuantizeRedisString

func (*Obfuscator) QuantizeRedisString(query string) string

QuantizeRedisString returns a quantized version of a Redis query.

TODO(gbbr): Refactor this method to use the tokenizer and remove "compactWhitespaces". This method is buggy when commands contain quoted strings with newlines.
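
A tiny sketch; the quantized result shown is illustrative only, since the exact reduction rules are not spelled out in this documentation.

q := o.QuantizeRedisString("SET session:1234 \"some opaque value\"")
fmt.Println(q) // expected to keep roughly the command while dropping arguments, e.g. "SET"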

func (*Obfuscator) SQLLiteralEscapes

func (o *Obfuscator) SQLLiteralEscapes() bool

SQLLiteralEscapes reports whether escape characters should be treated literally by the SQL obfuscator.

func (*Obfuscator) SetSQLLiteralEscapes

func (o *Obfuscator) SetSQLLiteralEscapes(ok bool)

SetSQLLiteralEscapes sets whether or not escape characters should be treated literally by the SQL obfuscator.
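
The getter and setter operate on the same flag; for example:

o.SetSQLLiteralEscapes(true)
fmt.Println(o.SQLLiteralEscapes()) // true: escape characters in SQL strings are treated as literal characters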

func (*Obfuscator) Stop

func (o *Obfuscator) Stop()

Stop cleans up after a finished Obfuscator.

type SQLTokenizer

type SQLTokenizer struct {
	// contains filtered or unexported fields
}

SQLTokenizer is the struct used to generate SQL tokens for the parser.

func NewSQLTokenizer

func NewSQLTokenizer(sql string, literalEscapes bool) *SQLTokenizer

NewSQLTokenizer creates a new SQLTokenizer for the given SQL string. The literalEscapes argument specifies whether escape characters should be treated literally.

func (*SQLTokenizer) Err

func (tkn *SQLTokenizer) Err() error

Err returns the last error that the tokenizer encountered, or nil.

func (*SQLTokenizer) Reset

func (tkn *SQLTokenizer) Reset(in string)

Reset resets the underlying buffer and positions.

func (*SQLTokenizer) Scan

func (tkn *SQLTokenizer) Scan() (TokenKind, []byte)

Scan scans the tokenizer for the next token and returns the token type and the token buffer.
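
Putting the tokenizer pieces together, a hedged sketch of a scan loop (using the same assumed import as above): it stops on EndChar (the end-of-input signal described under Constants), surfaces errors via Err, and reuses the tokenizer through Reset; single-character tokens come back as the rune itself, as noted under TokenKind below. The sample queries are illustrative only.

tkn := obfuscate.NewSQLTokenizer("SELECT id FROM users WHERE name = 'bob'", false)
for {
	kind, buf := tkn.Scan()
	if kind == obfuscate.EndChar {
		break // no more characters left in the query
	}
	if kind == obfuscate.LexError {
		fmt.Println("tokenizer error:", tkn.Err())
		break
	}
	// Single-character tokens such as '(' or ',' are reported as the rune itself.
	fmt.Printf("%s -> %q\n", kind, buf)
}

// Reset lets the same tokenizer scan a fresh query.
tkn.Reset("DELETE FROM sessions WHERE expired = 1")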

func (*SQLTokenizer) SeenEscape

func (tkn *SQLTokenizer) SeenEscape() bool

SeenEscape reports whether this tokenizer has seen an escape character within a scanned string.

type SyntaxError

type SyntaxError struct {
	Offset int64 // error occurred after reading Offset bytes
	// contains filtered or unexported fields
}

A SyntaxError is a description of a JSON syntax error.

func (*SyntaxError) Error

func (e *SyntaxError) Error() string

type TokenKind

type TokenKind uint32

TokenKind specifies the type of the token being scanned. It may be one of the defined constants below or in some cases the actual rune itself.

func (TokenKind) String

func (k TokenKind) String() string
