Documentation
¶
Index ¶
- type Error
- type Lexer
- type Parser
- type Position
- type Scanner
- func (s *Scanner) CollectAll() (TokenSlice, error)
- func (s *Scanner) CollectN(n int) (TokenSlice, error)
- func (s *Scanner) CollectUntil(predicate func(Token) bool) (TokenSlice, error)
- func (s *Scanner) CollectWhile(predicate func(Token) bool) (TokenSlice, error)
- func (s *Scanner) NextToken() (Token, error)
- func (s *Scanner) PeekN(n int) (TokenSlice, error)
- func (s *Scanner) PeekToken() (Token, error)
- func (s *Scanner) Pos() int
- func (s *Scanner) Reset()
- func (s *Scanner) Rewind(n int)
- func (s *Scanner) SkipWhile(predicate func(Token) bool) (int, error)
- func (s *Scanner) Valid() error
- type Token
- type TokenSlice
- type TokenType
- type Value
- type Values
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Lexer ¶
type Lexer struct {
// contains filtered or unexported fields
}
Lexer performs RFC 3986 lexical analysis on query strings
type Scanner ¶
type Scanner struct {
// contains filtered or unexported fields
}
Scanner provides token-by-token access to the query string
func NewScanner ¶
NewScanner creates a new scanner for the query string
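To show the intended usage pattern, here is a minimal self-contained sketch of a scanner driven token by token, loosely shaped like the NewScanner/NextToken API above. All names (newScanner, nextToken, the token types) are illustrative, not the package's.

```go
// Minimal sketch of token-by-token scanning of a query string.
// Illustrative only; not the package's implementation.
package main

import (
	"fmt"
	"io"
	"strings"
)

type tokenType int

const (
	tokText tokenType = iota
	tokEquals
	tokAmpersand
)

type token struct {
	typ tokenType
	val string
}

type scanner struct {
	input string
	pos   int
}

func newScanner(query string) *scanner { return &scanner{input: query} }

// nextToken returns the next token, or io.EOF when the input is exhausted.
func (s *scanner) nextToken() (token, error) {
	if s.pos >= len(s.input) {
		return token{}, io.EOF
	}
	switch s.input[s.pos] {
	case '=':
		s.pos++
		return token{tokEquals, "="}, nil
	case '&':
		s.pos++
		return token{tokAmpersand, "&"}, nil
	default:
		// Text token runs until the next delimiter, or end of input.
		end := s.pos + strings.IndexAny(s.input[s.pos:], "=&")
		if end < s.pos {
			end = len(s.input)
		}
		t := token{tokText, s.input[s.pos:end]}
		s.pos = end
		return t, nil
	}
}

func main() {
	s := newScanner("a=1&b=2")
	for {
		t, err := s.nextToken()
		if err == io.EOF {
			break
		}
		fmt.Printf("%d %q\n", t.typ, t.val)
	}
}
```

The real Scanner additionally tracks positions (Pos, Rewind, Reset) and supports lookahead; this sketch only shows the consuming loop.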
func (*Scanner) CollectAll ¶
func (s *Scanner) CollectAll() (TokenSlice, error)
CollectAll reads all remaining tokens into a slice
func (*Scanner) CollectN ¶
func (s *Scanner) CollectN(n int) (TokenSlice, error)
CollectN collects exactly n tokens. It returns an error if fewer than n tokens are available
func (*Scanner) CollectUntil ¶
func (s *Scanner) CollectUntil(predicate func(Token) bool) (TokenSlice, error)
CollectUntil collects tokens until the predicate returns true. The token that matches the predicate is left unconsumed
func (*Scanner) CollectWhile ¶
func (s *Scanner) CollectWhile(predicate func(Token) bool) (TokenSlice, error)
CollectWhile collects tokens while the predicate returns true. The first token that fails the predicate is left unconsumed
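The key contract shared by CollectWhile and CollectUntil is that the stopping token is not consumed: a later call sees it again. A self-contained sketch of that peek-then-consume behavior, with a hypothetical stream type standing in for the Scanner:

```go
// Sketch of the CollectWhile/CollectUntil contract: collection stops on a
// token without consuming it. Illustrative types, not the package's.
package main

import "fmt"

type token struct{ val string }

type stream struct {
	toks []token
	pos  int
}

func (s *stream) peek() (token, bool) {
	if s.pos >= len(s.toks) {
		return token{}, false
	}
	return s.toks[s.pos], true
}

// collectWhile consumes tokens as long as pred returns true; the first
// failing token stays in the stream.
func (s *stream) collectWhile(pred func(token) bool) []token {
	var out []token
	for {
		t, ok := s.peek()
		if !ok || !pred(t) {
			return out
		}
		out = append(out, t)
		s.pos++
	}
}

// collectUntil is the complement: it stops on the first token matching pred.
func (s *stream) collectUntil(pred func(token) bool) []token {
	return s.collectWhile(func(t token) bool { return !pred(t) })
}

func main() {
	s := &stream{toks: []token{{"a"}, {"="}, {"1"}}}
	key := s.collectUntil(func(t token) bool { return t.val == "=" })
	fmt.Println(len(key)) // 1
	next, _ := s.peek()
	fmt.Println(next.val) // = (left unconsumed)
}
```

Defining collectUntil in terms of collectWhile with a negated predicate keeps the two consistent; the package's error-returning signatures additionally propagate lexing failures.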
func (*Scanner) PeekN ¶
func (s *Scanner) PeekN(n int) (TokenSlice, error)
PeekN returns the next n tokens without consuming them
type Token ¶
type Token struct {
Type TokenType
Value string // Raw value (percent-encoded if present)
Decoded string // Decoded value for percent-encoded tokens
Start Position
End Position
}
Token represents a lexical token in a query string
type TokenSlice ¶
type TokenSlice []Token
func (TokenSlice) Bytes ¶
func (ts TokenSlice) Bytes() []byte
Bytes returns the raw byte representation
func (TokenSlice) SplitSubDelimiter ¶ added in v0.4.0
func (ts TokenSlice) SplitSubDelimiter(del string) []TokenSlice
func (TokenSlice) String ¶
func (ts TokenSlice) String() string
String reconstructs the original query string. It implements the fmt.Stringer interface
func (TokenSlice) StringDecoded ¶
func (ts TokenSlice) StringDecoded() string
StringDecoded reconstructs the fully decoded query string
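SplitSubDelimiter above has no doc comment; a plausible reading, sketched here with simplified types, is partitioning a token slice on an RFC 3986 sub-delimiter such as "," and dropping the delimiter tokens themselves. This is an assumption about its semantics, not the package's documented behavior.

```go
// Hedged sketch of splitting a token slice on a sub-delimiter.
// Simplified token type for illustration.
package main

import "fmt"

type token struct{ val string }

type tokenSlice []token

// splitSubDelimiter partitions ts on tokens equal to del, excluding the
// delimiter tokens from the resulting parts.
func (ts tokenSlice) splitSubDelimiter(del string) []tokenSlice {
	var out []tokenSlice
	cur := tokenSlice{}
	for _, t := range ts {
		if t.val == del {
			out = append(out, cur)
			cur = tokenSlice{}
			continue
		}
		cur = append(cur, t)
	}
	return append(out, cur)
}

// String concatenates the raw token values, mirroring TokenSlice.String.
func (ts tokenSlice) String() string {
	s := ""
	for _, t := range ts {
		s += t.val
	}
	return s
}

func main() {
	ts := tokenSlice{{"red"}, {","}, {"green"}, {","}, {"blue"}}
	for _, part := range ts.splitSubDelimiter(",") {
		fmt.Println(part.String())
	}
}
```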
type Value ¶
type Value struct {
// the decoded value
Value string
// Whether this key was seen multiple times
HasMultiple bool
// Position information for precise error reporting
KeyPos Position
ValuePos Position
// Original Tokens for inspection
KeyTokens TokenSlice
ValueTokens TokenSlice
}
Value represents a parsed query value with metadata
type Values ¶
type Values struct {
// contains filtered or unexported fields
}
Values is a collection of parsed query parameters. It preserves insertion order and allows duplicate keys
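An order-preserving collection that allows duplicate keys (unlike a plain map) is typically a flat slice of key/value pairs. A sketch of that shape, with illustrative names since the Values fields are unexported:

```go
// Sketch of an order-preserving, duplicate-friendly collection like Values.
// Names are illustrative, not the package's API.
package main

import "fmt"

type pair struct{ key, val string }

type values struct{ pairs []pair }

func (v *values) add(key, val string) {
	v.pairs = append(v.pairs, pair{key, val})
}

// get returns the first value for key and whether it was present.
func (v *values) get(key string) (string, bool) {
	for _, p := range v.pairs {
		if p.key == key {
			return p.val, true
		}
	}
	return "", false
}

// getAll returns every value for key, in insertion order.
func (v *values) getAll(key string) []string {
	var out []string
	for _, p := range v.pairs {
		if p.key == key {
			out = append(out, p.val)
		}
	}
	return out
}

func main() {
	var v values
	v.add("tag", "go")
	v.add("q", "lexer")
	v.add("tag", "http")
	fmt.Println(v.getAll("tag")) // [go http]
}
```

A slice-backed design also explains fields like HasMultiple on Value: duplicates are kept rather than overwritten, so the parser can report that a key was seen more than once.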