parse

package
v1.5.7 Latest
Warning: This package is not in the latest version of its module.
Published: Feb 12, 2026 License: MIT Imports: 8 Imported by: 0

Documentation

Overview

Package parse implements lexical analysis and parsing for Lua source code.

The package provides two main components:

Scanner (lexer): Tokenizes Lua source into a stream of tokens. Handles all Lua lexical elements including identifiers, keywords, numbers (decimal and hex), strings (single/double quoted and multiline), operators, and comments. Also supports type annotation syntax extensions.

Parser: Consumes the token stream and produces an AST. Generated by goyacc from parser.go.y, it is an LALR(1) parser covering Lua's grammar, including statements, expressions, and type annotations.

Key entry points:

Parse(reader, name) - parse from io.Reader
ParseString(source, name) - parse from string with enhanced error messages

Errors include position information and can render Rust-style diagnostics with source context when the original source is available.
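A minimal sketch of the typical flow, assuming a hypothetical import path (the real module path is not shown on this page):

```go
package main

import (
	"fmt"
	"strings"

	// Hypothetical import path; substitute the module's actual path.
	"example.com/lua/parse"
)

func main() {
	src := "local x = 1 + 2\nprint(x)"
	// Parse accepts any io.Reader; name is used for positions in errors.
	chunk, err := parse.Parse(strings.NewReader(src), "example.lua")
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Printf("parsed %d top-level statements\n", len(chunk))
}
```

For string input, ParseString is the better entry point, since it retains the source for diagnostic rendering.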

Index

Constants

const EOF = -1
const T2Colon = 57384
const T2Comma = 57382
const T3Comma = 57383
const TAnd = 57346
const TArrow = 57392
const TAs = 57371
const TAsserts = 57372
const TBang = 57394
const TBreak = 57347
const TDo = 57348
const TElse = 57349
const TElseIf = 57350
const TEnd = 57351
const TEqeq = 57378
const TExtends = 57376
const TFalse = 57352
const TFor = 57353
const TFun = 57377
const TFunction = 57354
const TGoto = 57367
const TGte = 57381
const TIdent = 57386
const TIdiv = 57391
const TIf = 57355
const TIn = 57356
const TInterface = 57369
const TIs = 57373
const TKeyof = 57375
const TLabel = 57385
const TLocal = 57357
const TLte = 57380
const TNeq = 57379
const TNil = 57358
const TNot = 57359
const TNumber = 57387
const TOr = 57360
const TQuestion = 57393
const TQuestionColon = 57395
const TReadonly = 57370
const TRepeat = 57362
const TReturn = 57361
const TShl = 57389
const TShr = 57390
const TString = 57388
const TThen = 57363
const TTrue = 57364
const TType = 57368
const TTypeof = 57374
const TUntil = 57365
const TWhile = 57366
const UNARY = 57396

Variables

This section is empty.

Functions

func FriendlyTokenName

func FriendlyTokenName(c int) string

FriendlyTokenName converts a token type to a human-readable name for error messages.
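For instance, the token constants above can be turned into readable names (import path hypothetical; the exact strings returned are not documented here, so none are shown):

```go
// Map a token type constant to a human-readable name for error text.
name := parse.FriendlyTokenName(parse.TFunction)
raw := parse.TokenName(parse.TFunction)
fmt.Printf("friendly=%q raw=%q\n", name, raw)
```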

func Parse

func Parse(reader io.Reader, name string) (chunk []ast.Stmt, err error)

Parse reads Lua source from reader and returns the parsed statement list.

func ParseString

func ParseString(source, name string) (chunk []ast.Stmt, err error)

ParseString parses Lua source from a string. Errors include source context for enhanced diagnostic rendering.

func TokenName

func TokenName(c int) string

Types

type Error

type Error struct {
	Pos     ast.Position
	Message string
	Token   string
	Source  string
}

Error represents a parse error with source location. When Source is set, Render() produces Rust-style diagnostics with context.

func (*Error) Error

func (e *Error) Error() string

func (*Error) Render

func (e *Error) Render() string

Render formats the error with source context showing the offending line and position.
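A sketch of producing such a diagnostic, assuming ParseString returns a *Error with Source populated on failure (import path hypothetical):

```go
_, err := parse.ParseString("local = 1", "broken.lua")
if err != nil {
	// When the error is a *parse.Error with Source set, Render adds
	// the offending line and a position marker under it.
	var perr *parse.Error
	if errors.As(err, &perr) {
		fmt.Println(perr.Render())
	} else {
		fmt.Println(err)
	}
}
```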

func (*Error) String

func (e *Error) String() string

type Lexer

type Lexer struct {
	Stmts         []ast.Stmt
	PNewLine      bool
	Token         ast.Token
	PrevTokenType int
	PendingGT     *ast.Token // Pending '>' from split '>>'
	// contains filtered or unexported fields
}

Lexer implements the yyLexer interface for goyacc.

func (*Lexer) Error

func (lx *Lexer) Error(message string)

func (*Lexer) Lex

func (lx *Lexer) Lex(lval *yySymType) int

func (*Lexer) TokenError

func (lx *Lexer) TokenError(tok ast.Token, message string)

type Scanner

type Scanner struct {
	Pos ast.Position
	// contains filtered or unexported fields
}

Scanner tokenizes Lua source code into a stream of tokens.

func NewScanner

func NewScanner(reader io.Reader, source string) *Scanner

NewScanner creates a scanner for the given input with the specified source name.

func (*Scanner) Error

func (sc *Scanner) Error(tok string, msg string) *Error

func (*Scanner) Newline

func (sc *Scanner) Newline(ch int)

func (*Scanner) Next

func (sc *Scanner) Next() int

func (*Scanner) Peek

func (sc *Scanner) Peek() int

func (*Scanner) Scan

func (sc *Scanner) Scan(lexer *Lexer) (ast.Token, error)

Scan reads and returns the next token from the input.
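A token-dump loop might look like the following. It assumes that a zero-value Lexer is acceptable as Scan's context argument and that ast.Token exposes a Type field matching the constants above; both are assumptions, as neither is documented on this page (import path hypothetical):

```go
sc := parse.NewScanner(strings.NewReader("local x = 42"), "tokens.lua")
var lx parse.Lexer // assumption: zero value suffices for scanning
for {
	tok, err := sc.Scan(&lx)
	if err != nil {
		fmt.Println(err)
		break
	}
	if tok.Type == parse.EOF { // assumed field name on ast.Token
		break
	}
	fmt.Printf("%s at %v\n", parse.FriendlyTokenName(tok.Type), sc.Pos)
}
```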

func (*Scanner) TokenError

func (sc *Scanner) TokenError(tok ast.Token, msg string) *Error
