Documentation ¶
Overview ¶
Package token defines the Token struct and object pool for SQL lexical analysis.
A Token is the fundamental unit produced by the GoSQLX tokenizer: it pairs a models.TokenType integer constant (e.g., models.TokenTypeSelect, models.TokenTypeIdent) with the raw literal string from the source SQL. The integer-based TokenType taxonomy covers all SQL token categories — DML keywords (SELECT, INSERT, UPDATE, DELETE), DDL keywords (CREATE, ALTER, DROP), punctuation, operators, literals, and identifiers. The legacy string-based token.Type was removed in #215; all code should use models.TokenType.
The package also provides an object pool (Get / Put) for zero-allocation token reuse in hot paths such as batch parsing or high-throughput server workloads. Every token obtained with Get must be returned with a deferred Put to avoid memory leaks.
Token Structure ¶
type Token struct {
	Type    models.TokenType // Integer token type constant (primary, O(1) comparison)
	Literal string           // Raw literal value from the SQL source
}
Basic Usage ¶
import (
	"fmt"

	"github.com/ajitpratap0/GoSQLX/pkg/models"
	"github.com/ajitpratap0/GoSQLX/pkg/sql/token"
)
tok := token.Token{Type: models.TokenTypeSelect, Literal: "SELECT"}

if tok.IsType(models.TokenTypeSelect) {
	fmt.Println("This is a SELECT token")
}

if tok.IsAnyType(models.TokenTypeSelect, models.TokenTypeInsert) {
	fmt.Println("This is a DML statement")
}
Token Pool ¶
The package provides an object pool for zero-allocation token reuse:
tok := token.Get()
defer token.Put(tok) // MANDATORY — return to pool when done
tok.Type = models.TokenTypeSelect
tok.Literal = "SELECT"
See Also ¶
- pkg/models: Core TokenType constants and the TokenTypeUnknown sentinel
- pkg/sql/keywords: Keyword-to-TokenType mapping for all SQL dialects
- pkg/sql/tokenizer: SQL lexical analysis that produces Token values
- pkg/sql/parser: Recursive descent parser that consumes Token values
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
- func Get() *Token
Types ¶
- type Token
- func (Token) HasType() bool
- func (Token) IsAnyType(...models.TokenType) bool
- func (Token) IsType(models.TokenType) bool
type Token ¶
type Token struct {
	Type    models.TokenType // Int-based token type (primary, for performance)
	Literal string           // The literal value of the token
}
Token represents a lexical token in SQL source code.
The Token struct uses the unified integer-based type system (models.TokenType) for all type identification. String-based token types have been removed as part of the token type unification (#215).
Example:
tok := Token{
	Type:    models.TokenTypeSelect,
	Literal: "SELECT",
}
if tok.IsType(models.TokenTypeSelect) {
	// Process SELECT token
}
func Get ¶
func Get() *Token
Get retrieves a Token from the pool. The token is pre-initialized with zero values. Always use defer to return the token to the pool when done.
func (Token) HasType ¶ added in v1.9.3
func (Token) HasType() bool
HasType reports whether the Type field is populated with a valid type. It returns false for TokenTypeUnknown and the zero value.
Example:
tok := Token{Type: models.TokenTypeSelect, Literal: "SELECT"}
if tok.HasType() {
	// Use fast Type-based operations
}
func (Token) IsAnyType ¶ added in v1.6.0
func (Token) IsAnyType(...models.TokenType) bool
IsAnyType reports whether the token's Type matches any of the given models.TokenType values.
Example:
tok := Token{Type: models.TokenTypeSelect, Literal: "SELECT"}

dmlKeywords := []models.TokenType{
	models.TokenTypeSelect,
	models.TokenTypeInsert,
	models.TokenTypeUpdate,
	models.TokenTypeDelete,
}
if tok.IsAnyType(dmlKeywords...) {
	fmt.Println("This is a DML statement keyword")
}
func (Token) IsType ¶ added in v1.6.0
func (Token) IsType(models.TokenType) bool
IsType reports whether the token's Type equals the given models.TokenType. This uses a fast integer comparison and is the preferred way to check token types.
Example:
tok := Token{Type: models.TokenTypeSelect, Literal: "SELECT"}
if tok.IsType(models.TokenTypeSelect) {
	fmt.Println("This is a SELECT token")
}