Documentation

Overview
Package parse implements lexical analysis and parsing for Lua source code.
The package provides two main components:
Scanner (lexer): Tokenizes Lua source into a stream of tokens. Handles all Lua lexical elements including identifiers, keywords, numbers (decimal and hex), strings (single/double quoted and multiline), operators, and comments. Also supports type annotation syntax extensions.
Parser: Consumes the token stream and produces an AST. Generated by goyacc from parser.go.y, it implements an LALR(1) parser for Lua's grammar, covering statements, expressions, and type annotations.
Key entry points:
Parse(reader, name) - parse from an io.Reader
ParseString(source, name) - parse from a string, with enhanced error messages
Errors include position information and can render Rust-style diagnostics with source context when the original source is available.
Index
Constants
const EOF = -1
const T2Colon = 57384
const T2Comma = 57382
const T3Comma = 57383
const TAnd = 57346
const TArrow = 57392
const TAs = 57371
const TAsserts = 57372
const TBang = 57394
const TBreak = 57347
const TDo = 57348
const TElse = 57349
const TElseIf = 57350
const TEnd = 57351
const TEqeq = 57378
const TExtends = 57376
const TFalse = 57352
const TFor = 57353
const TFun = 57377
const TFunction = 57354
const TGoto = 57367
const TGte = 57381
const TIdent = 57386
const TIdiv = 57391
const TIf = 57355
const TIn = 57356
const TInterface = 57369
const TIs = 57373
const TKeyof = 57375
const TLabel = 57385
const TLocal = 57357
const TLte = 57380
const TNeq = 57379
const TNil = 57358
const TNot = 57359
const TNumber = 57387
const TOr = 57360
const TQuestion = 57393
const TQuestionColon = 57395
const TReadonly = 57370
const TRepeat = 57362
const TReturn = 57361
const TShl = 57389
const TShr = 57390
const TString = 57388
const TThen = 57363
const TTrue = 57364
const TType = 57368
const TTypeof = 57374
const TUntil = 57365
const TWhile = 57366
const UNARY = 57396
Variables
This section is empty.
Functions
func FriendlyTokenName
FriendlyTokenName converts a token type to a human-readable name for error messages.
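A plausible shape for such a mapping, sketched with a few of the token constants from the index above. The returned strings are assumptions for illustration; the package's actual names may differ:

```go
package main

import "fmt"

// A few token constants, with the values listed in the index above.
const (
	EOF   = -1
	TAnd  = 57346
	TEnd  = 57351
	TEqeq = 57378
)

// friendlyTokenName sketches how a token code might be turned into a
// human-readable name for error messages. The strings returned here
// are illustrative, not the package's actual output.
func friendlyTokenName(tok int) string {
	switch tok {
	case EOF:
		return "end of file"
	case TAnd:
		return "'and'"
	case TEnd:
		return "'end'"
	case TEqeq:
		return "'=='"
	default:
		return fmt.Sprintf("token(%d)", tok)
	}
}

func main() {
	fmt.Println(friendlyTokenName(TAnd)) // 'and'
	fmt.Println(friendlyTokenName(EOF))  // end of file
}
```

An error message like "expected 'end', got '=='" reads far better than one built from the raw goyacc token numbers.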
func ParseString
ParseString parses Lua source from a string. Errors include source context for enhanced diagnostic rendering.
Types
type Error
Error represents a parse error with source location. When Source is set, Render() produces Rust-style diagnostics with context.
type Lexer
type Lexer struct {
	Stmts         []ast.Stmt
	PNewLine      bool
	Token         ast.Token
	PrevTokenType int
	PendingGT     *ast.Token // Pending '>' from split '>>'
	// contains filtered or unexported fields
}
Lexer implements the yyLexer interface for goyacc.
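The PendingGT field points at a familiar trick for closing nested generic types: when the scanner reads '>>' where a single '>' is expected (e.g. closing Map<K, List<V>>), it emits one '>' and stashes the second for the next call. A self-contained sketch of that mechanism follows; the type and the inTypeContext flag are illustrative, though tShr uses the TShr value from the index above:

```go
package main

import "fmt"

const (
	tGT  = int('>') // a single '>' token
	tShr = 57390    // TShr, the '>>' shift operator, from the index above
)

// miniLexer demonstrates only the pending-'>' mechanism.
type miniLexer struct {
	src       []byte
	pos       int
	pendingGT bool // a stashed '>' from a split '>>'
}

// next returns the next token code. In a type-annotation context,
// '>>' is split into two '>' tokens; elsewhere it stays a shift.
func (l *miniLexer) next(inTypeContext bool) int {
	if l.pendingGT {
		l.pendingGT = false
		return tGT
	}
	if l.pos >= len(l.src) {
		return -1 // EOF
	}
	c := l.src[l.pos]
	l.pos++
	if c == '>' && l.pos < len(l.src) && l.src[l.pos] == '>' {
		l.pos++
		if inTypeContext {
			l.pendingGT = true // second '>' delivered on the next call
			return tGT
		}
		return tShr // ordinary right-shift operator
	}
	return int(c)
}

func main() {
	l := &miniLexer{src: []byte(">>")}
	fmt.Println(l.next(true), l.next(true)) // 62 62 (two '>' tokens)
	l2 := &miniLexer{src: []byte(">>")}
	fmt.Println(l2.next(false)) // 57390 (one TShr token)
}
```

The stash lets the grammar stay context-free: the parser simply sees two '>' tokens, with no special '>>'-closing production required.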