lexer

package
v1.74.0 Latest
Warning

This package is not in the latest version of its module.

Published: May 3, 2026 License: MIT Imports: 3 Imported by: 0

Documentation

Overview

Package lexer tokenizes MX Script source code into a stream of tokens that the parser can consume. It tracks line and column numbers so error messages can point at the exact location of a problem.
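The line/column bookkeeping the overview describes can be sketched in a few lines. This is a hypothetical illustration of how a scanner typically tracks positions while walking the source, not the package's actual implementation:

```go
package main

import "fmt"

// Pos is a 1-based line/column pair, mirroring the Line and Col
// fields on Token. Illustrative only.
type Pos struct{ Line, Col int }

// scan walks src rune by rune and records the position of each
// non-space rune, resetting the column on every newline.
func scan(src string) []Pos {
	var out []Pos
	line, col := 1, 1
	for _, r := range src {
		if r == '\n' {
			line++
			col = 1
			continue
		}
		if r != ' ' {
			out = append(out, Pos{line, col})
		}
		col++
	}
	return out
}

func main() {
	fmt.Println(scan("a\n b")) // "a" at 1:1, "b" at 2:2
}
```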

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Lexer

type Lexer struct {

	// CollectComments controls whether // line, # line, and /* ... */ block
	// comments are kept as TokenComment in the output, and whether \n is
	// emitted as TokenNewline. Defaults to false (the parser doesn't want
	// them); the formatter sets it to true so it can preserve comments and
	// blank lines.
	CollectComments bool
	// contains filtered or unexported fields
}
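The effect of the flag can be sketched as a filter over the token stream. This is a hypothetical model of the behavior the field comment describes, with string stand-ins for the real TokenType constants:

```go
package main

import "fmt"

// tok is an illustrative stand-in for lexer.Token.
type tok struct{ kind, lexeme string }

// emit models CollectComments: with the flag off, comment and newline
// tokens are dropped (the parser's view); with it on, they pass through
// so a formatter can reproduce the original layout.
func emit(all []tok, collectComments bool) []tok {
	var out []tok
	for _, t := range all {
		if !collectComments && (t.kind == "Comment" || t.kind == "Newline") {
			continue
		}
		out = append(out, t)
	}
	return out
}

func main() {
	src := []tok{{"Let", "let"}, {"Comment", "// note"}, {"Newline", "\n"}, {"Ident", "x"}}
	fmt.Println(len(emit(src, false)), len(emit(src, true))) // parser view vs formatter view
}
```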

func New

func New(src string) *Lexer

New returns a lexer over src with CollectComments at its default of false, so comment and newline tokens are not emitted.

func NewWithComments added in v0.20.0

func NewWithComments(src string) *Lexer

NewWithComments returns a lexer that retains comment tokens for the formatter (and any future tooling that wants source fidelity).

func (*Lexer) Tokenize

func (l *Lexer) Tokenize() ([]Token, error)

Tokenize runs the lexer over the source and returns the full token stream. It always appends a trailing TokenEOF.
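The guaranteed trailing TokenEOF acts as a sentinel: a consumer can advance unconditionally and stop on EOF instead of bounds-checking the slice at every step. A minimal sketch of that pattern, using illustrative stand-in types rather than the package's own:

```go
package main

import "fmt"

// Illustrative stand-ins for lexer.TokenType and lexer.Token.
type tokenType int

const (
	tokenEOF tokenType = iota
	tokenIdent
)

type token struct {
	Type   tokenType
	Lexeme string
}

// countIdents walks the stream without a length check; the loop is
// safe because the lexer guarantees EOF is always the last token.
func countIdents(stream []token) int {
	n := 0
	for i := 0; stream[i].Type != tokenEOF; i++ {
		if stream[i].Type == tokenIdent {
			n++
		}
	}
	return n
}

func main() {
	stream := []token{{tokenIdent, "x"}, {tokenIdent, "y"}, {tokenEOF, ""}}
	fmt.Println(countIdents(stream))
}
```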

type Token

type Token struct {
	Type   TokenType
	Lexeme string
	Line   int
	Col    int
}

func (Token) String

func (t Token) String() string

type TokenType

type TokenType int
const (
	TokenEOF TokenType = iota
	TokenIllegal

	TokenIdent
	TokenNumber
	TokenString

	TokenLet
	TokenFn
	TokenReturn
	TokenIf
	TokenElse
	TokenLoop
	TokenAs
	TokenRoute
	TokenServer
	TokenMiddleware
	TokenUse
	TokenTry
	TokenCatch
	TokenTrue
	TokenFalse
	TokenNull
	TokenImport
	TokenExport
	TokenWhile
	TokenBreak
	TokenContinue
	TokenStatic
	TokenMatch
	TokenFatArrow
	TokenSpawn
	TokenGroup

	TokenLBrace
	TokenRBrace
	TokenLParen
	TokenRParen
	TokenLBracket
	TokenRBracket
	TokenColon
	TokenComma
	TokenDot
	TokenSemicolon
	TokenAssign
	TokenEq
	TokenNotEq
	TokenLT
	TokenGT
	TokenLTEq
	TokenGTEq
	TokenPlus
	TokenMinus
	TokenStar
	TokenSlash
	TokenPercent
	TokenBang
	TokenAnd
	TokenOr
	TokenPipe // |> — value forward into a function call

	// Compound assignment — `x += 1` desugars to `x = x + 1` at parse time.
	TokenPlusEq
	TokenMinusEq
	TokenStarEq
	TokenSlashEq
	TokenNullCoalesceAssign // ??= — set only when LHS is null

	// Range operators: 1..10 (exclusive end), 1..=10 (inclusive end).
	TokenRange
	TokenRangeEq
	TokenSpread
	TokenQuestionDot  // ?.
	TokenNullCoalesce // ??
	TokenComment      // // line, # line, or /* block */ — only emitted when CollectComments is true
	TokenNewline      // line break — only emitted when CollectComments is true
)
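The compound-assignment comment above says `x += 1` desugars to `x = x + 1` at parse time, so the evaluator only ever sees plain assignment. A hypothetical sketch of that rewrite over a toy AST node (the types here are illustrative, not the package's):

```go
package main

import "fmt"

// assign is an illustrative stand-in for a parsed assignment node,
// with the right-hand side kept as a string for simplicity.
type assign struct {
	name string
	expr string
}

// desugarCompound rewrites "name op= rhs" into "name = name op rhs",
// the transformation the TokenPlusEq family of tokens triggers.
func desugarCompound(name, op, rhs string) assign {
	return assign{name: name, expr: name + " " + op + " " + rhs}
}

func main() {
	fmt.Println(desugarCompound("x", "+", "1")) // x = x + 1
}
```

TokenNullCoalesceAssign (`??=`) is the exception: it is conditional (assign only when the left-hand side is null), so it cannot be desugared to an unconditional binary expression the same way.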

func (TokenType) String

func (t TokenType) String() string
