fql

package module
v0.0.1
Published: Sep 28, 2020 License: MIT Imports: 8 Imported by: 1

README

FQL - Filtering Query Language
--------------------------------------------------------------------------------

Example: "status:active;createdAt:>d1483228800"

--------------------------------------------------------------------------------

Syntax:

A filter expression consists of one or more filter rules or filter groups separated by logical operators.
A filter group is a filter expression wrapped in parentheses "( ... )", which can be used to enforce the order of evaluation (see the example after this list).
A logical operator is one of the following:
	- ";" represents the logical AND operator.
	- "," represents the logical OR operator.

A filter rule consists of a key, the key-value separator, an optional relational operator, and a value in that order.
The key-value separator is the colon character ":".
The rule key represents a resource's field on which the filter rule should be applied. Keys may contain only alphanumeric characters and the underscore, i.e. "[_0-9a-zA-Z]".
The rule value represents the value that will be compared against a resource's field indicated by the rule key using the relational operator. The value can be one of the following:
	- "null"
	- a boolean: either "true" or "false".
	- a number: an integer or a float, optionally preceded by the unary operator "-" or "+".
	- a string: delimited by double quotes "<string>"; it may contain escaped double quotes "\"".
	- a timestamp: denoted by a leading "d" (for date) followed by a Unix timestamp, i.e. the number of seconds since the Unix Epoch on January 1st, 1970 UTC. For example, the value "d1483228800" represents "2017-01-01 00:00:00 +0000 UTC", as shown in the sketch below.
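
The mapping is plain Unix time; a quick check with Go's standard library (a stand-alone snippet, not part of this package):

package main

import (
	"fmt"
	"time"
)

func main() {
	// 1483228800 is the value carried by "d1483228800" with the leading "d" stripped.
	fmt.Println(time.Unix(1483228800, 0).UTC()) // 2017-01-01 00:00:00 +0000 UTC
}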

The relational operator is used to compare the rule value against a resource's field. If no operator is provided, the default "=" (is equal) is assumed. The relational operator can be one of the following, depending on the value:
	- ">"
	- "<"
	- ">="
	- "<="
	- "!"

NOTE: the operators ">", "<", ">=", and "<=" are NOT applicable to "null", "boolean", or "string" values. Attempting to use these operators with values of those types causes the tokenizer to return an error, as in the example below.
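
For example, with a hypothetical boolean field "verified":

	"verified:!true"   is valid (is not equal to true)
	"verified:>true"   is rejected by the tokenizer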

--------------------------------------------------------------------------------

The grammar in EBNF:

filter_expr  = ( filter_rule | filter_group ) { seq_op filter_expr } ;
filter_group = '(' filter_expr ')' ;
		
filter_rule = filter_key ':' [ rel_op ] filter_val ;
filter_key  = identifier ;
filter_val  = bool_val | num_val | time_val | text_val | null_val ;
		
bool_val  = 'true' | 'false' ;
num_val   = [ unary_op ] ( int_val | float_val ) ;
time_val  = 'd' [ unary_op ] int_val ;
text_val  = interpreted_string_lit ;
null_val  = 'null' ;
int_val   = '0' | (( digit - '0' ) { digit } ) ;
float_val = float_lit ;
		
rel_op   = '>' | '>=' | '<' | '<=' | '!' ;
seq_op   = ',' | ';' ;
unary_op = '+' | '-' ;
		
digit = '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' | '8' | '9' ;


The definitions of the following rules can be found in Go's language spec.

	identifier = .
	interpreted_string_lit = .
	float_lit = .

Documentation

Overview

Package fql implements a tokenizer for FQL.

The acronym FQL stands for Filtering Query Language, a custom syntax for describing how to filter a requested resource. The intended method for sending FQL to a server is to pass it as a query parameter value in the request's URL.

https://www.example.com/resource?f=<fqlString>
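
A minimal sketch of pulling the FQL string out of an incoming request, assuming the query parameter is named "f" as in the URL above (the handler itself is illustrative and not part of this package):

func handleResource(w http.ResponseWriter, r *http.Request) {
	fqlString := r.URL.Query().Get("f") // e.g. "status:active;createdAt:>d1483228800"
	z := fql.NewTokenizer(fqlString)
	// ... consume tokens from z as shown below ...
	_ = z
}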

Tokenization is done by creating a Tokenizer for a string that contains the FQL.

z := fql.NewTokenizer(fqlString)

Given a Tokenizer z, the FQL is tokenized by repeatedly calling z.Next(), which parses the next token and returns it, or an error. If the returned token is fql.RULE, then calling z.Rule() will return the parsed *fql.Rule value that is associated with that token.

for {
	tok, err := z.Next()
	if err != nil {
		if err != fql.EOF {
			return err
		}
		break
	}

	switch tok {
	case fql.LPAREN:
		// ...
	case fql.RPAREN:
		// ...
	case fql.AND:
		// ...
	case fql.OR:
		// ...
	case fql.RULE:
		r := z.Rule()
		// ...
	}
}

Constants

const (
	Null nullbit = true // Null represents a "null" value in a filter rule.
)

Variables

var EOF = io.EOF

EOF is the error returned by the Tokenizer when no more input is available.

The io.EOF value is re-declared here only so that client code will not have to import the "io" package when using the "fql" package.

Functions

This section is empty.

Types

type CmpOp

type CmpOp uint32

CmpOp represents a comparison operator.

const (
	CmpEq CmpOp = iota // ":" equality operator
	CmpNe              // ":!" inequality operator
	CmpGt              // ":>" greater-than operator
	CmpLt              // ":<" less-than operator
	CmpGe              // ":>=" greater-than-or-equal operator
	CmpLe              // ":<=" less-than-or-equal operator
)
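
As an illustration, client code that turns filter rules into SQL might map these operators as follows (this helper is an assumption for the example, not part of the package):

// cmpToSQL translates an fql comparison operator into its SQL counterpart.
// Illustrative only; the fql package does not provide this function.
func cmpToSQL(op fql.CmpOp) string {
	switch op {
	case fql.CmpEq:
		return "="
	case fql.CmpNe:
		return "<>"
	case fql.CmpGt:
		return ">"
	case fql.CmpLt:
		return "<"
	case fql.CmpGe:
		return ">="
	case fql.CmpLe:
		return "<="
	default:
		return ""
	}
}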

type Error

type Error struct {
	Code         ErrorCode
	Pos          int
	Key          string
	Cmp          CmpOp
	Val          string
	LastToken    Token
	CurrentToken Token
}

func (*Error) Error

func (e *Error) Error() string
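
Given a Tokenizer z (as in the Overview), a sketch of inspecting a tokenizer error in client code; it assumes malformed input surfaces as a *Error, which the type's fields suggest but the documentation does not state explicitly:

if _, err := z.Next(); err != nil && err != fql.EOF {
	var ferr *fql.Error
	if errors.As(err, &ferr) {
		// Pos points at the offending position in the input and Code
		// classifies the problem (see ErrorCode below).
		log.Printf("fql: invalid input at position %d (code %d)", ferr.Pos, ferr.Code)
	}
	return err
}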

type ErrorCode

type ErrorCode uint32

const (
	ErrExtraClosingParen ErrorCode = 1 + iota
	ErrNoClosingParen
	ErrNoClosingDoubleQuote
	ErrNoRuleValue
	ErrBadBoolean
	ErrBadNumber
	ErrBadDuration
	ErrBadNullOp
	ErrBadBooleanOp
	ErrBadKey
	ErrBadTokenSequence
)

type Rule

type Rule struct {
	Key string
	Cmp CmpOp
	Val interface{}
}

A Rule represents a filter rule and consists of a key (Key), a comparison operator (Cmp), and a value (Val).
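
Because Val is an interface{}, client code typically switches on its dynamic type. A minimal sketch, given a Tokenizer z positioned on a RULE token; the concrete types shown (bool, float64, string, time.Time, and nil for "null") are assumptions inferred from the value kinds described in the README, not a documented contract:

r := z.Rule()
switch v := r.Val.(type) {
case nil:
	// "null" rule value (the package's Null constant may be used instead; check the source)
	fmt.Printf("%s is null\n", r.Key)
case bool, string, float64:
	// boolean, string, and numeric rule values (the numeric type is an assumption)
	fmt.Printf("%s (op %v) %v\n", r.Key, r.Cmp, v)
case time.Time:
	// timestamp rule values ("d<unix>"); the use of time.Time is an assumption
	fmt.Printf("%s (op %v) %s\n", r.Key, r.Cmp, v.UTC())
default:
	fmt.Printf("%s (op %v) %v of unexpected type %T\n", r.Key, r.Cmp, v, v)
}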

type RuleError

type RuleError struct {
	Pos int
	Key string
	Cmp CmpOp
	Val string
}

TODO

type Token

type Token uint32

A Token is the set of lexical tokens of FQL.

const (
	LPAREN Token = iota // (
	RPAREN              // )
	AND                 // ;
	OR                  // ,
	RULE                // <key>:[op]<value>
)

type Tokenizer

type Tokenizer struct {
	// contains filtered or unexported fields
}

A Tokenizer returns a stream of FQL Tokens.

func NewTokenizer

func NewTokenizer(fqlString string) *Tokenizer

NewTokenizer returns a new FQL Tokenizer for the given string.

func (*Tokenizer) Next

func (t *Tokenizer) Next() (Token, error)

Next scans, parses, and returns the next token.

func (*Tokenizer) Rule

func (t *Tokenizer) Rule() *Rule

Rule returns the current parsed Rule node, or nil if there isn't one.
