Documentation
Overview
Package lexer implements lexical analysis for the interpreter.
Index
Constants
const EOF = -1
EOF represents the end of the input.
const MAX_DEPTH_DEFAULT = 1000
MAX_DEPTH_DEFAULT is the maximum nesting depth of the lexer; once this depth is reached, the lexer stops processing.
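The EOF sentinel is typically returned by the lexer's internal read routine once the input is exhausted. The sketch below illustrates that pattern; the `reader` type and its `next` method are hypothetical stand-ins, since the real Lexer's fields are unexported and not shown on this page:

```go
package main

import "fmt"

// EOF mirrors the package's sentinel for end of input (const EOF = -1).
const EOF = -1

// reader is a hypothetical cursor over the input, for illustration only.
type reader struct {
	input []rune
	pos   int
}

// next returns the next rune, or EOF once the input is exhausted.
func (r *reader) next() rune {
	if r.pos >= len(r.input) {
		return EOF
	}
	ch := r.input[r.pos]
	r.pos++
	return ch
}

func main() {
	r := &reader{input: []rune("ab")}
	fmt.Println(r.next(), r.next(), r.next()) // prints the code points of 'a', 'b', then -1
}
```

Using a negative rune as the sentinel is safe because all valid Unicode code points are non-negative.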
Variables
This section is empty.
Functions
func IsAlphaNumeric
IsAlphaNumeric reports whether the rune is a letter or a digit.
func IsWhitespace
IsWhitespace reports whether the rune is a whitespace character, including newlines.
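The page does not show the bodies of these helpers, but plausible implementations using the standard `unicode` package look like this (lowercase names are used to mark them as illustrative, not the package's actual code):

```go
package main

import (
	"fmt"
	"unicode"
)

// isAlphaNumeric reports whether r is a letter or a digit.
// This mirrors what the package's IsAlphaNumeric is documented to do.
func isAlphaNumeric(r rune) bool {
	return unicode.IsLetter(r) || unicode.IsDigit(r)
}

// isWhitespace reports whether r is a space, tab, carriage return, or newline.
// This mirrors what the package's IsWhitespace is documented to do.
func isWhitespace(r rune) bool {
	return r == ' ' || r == '\t' || r == '\r' || r == '\n'
}

func main() {
	fmt.Println(isAlphaNumeric('x'), isAlphaNumeric('7'), isAlphaNumeric('!')) // true true false
	fmt.Println(isWhitespace(' '), isWhitespace('\n'), isWhitespace('a'))      // true true false
}
```

Note that `unicode.IsLetter` and `unicode.IsDigit` accept the full Unicode ranges, so non-ASCII letters such as 'é' also classify as alphanumeric under this sketch.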
Types
type Lexer
type Lexer struct {
// contains filtered or unexported fields
}
A Lexer tokenizes an input string into a sequence of tokens.
func NewLexer
func NewLexer(manager *manager.ErrorManager) *Lexer
NewLexer creates a new Lexer; errors encountered during tokenization are reported through the given ErrorManager.
func (*Lexer) SetInitialState
SetInitialState sets the lexer's starting state, which is useful when tokenizing partial input.
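The pieces documented above (an EOF sentinel plus whitespace and alphanumeric classification) combine into the core loop most lexers share: skip whitespace, then accumulate runs of alphanumeric runes into tokens. The toy `scanWords` below sketches that flow; it is not this package's actual API:

```go
package main

import (
	"fmt"
	"unicode"
)

// scanWords is a toy tokenizer illustrating the typical lexer loop:
// skip whitespace, then group consecutive letters/digits into one token.
// Any other rune is emitted as a single-character token.
func scanWords(input string) []string {
	var tokens []string
	runes := []rune(input)
	i := 0
	for i < len(runes) {
		// Skip whitespace between tokens.
		if runes[i] == ' ' || runes[i] == '\t' || runes[i] == '\n' {
			i++
			continue
		}
		// Accumulate a run of letters and digits.
		start := i
		for i < len(runes) && (unicode.IsLetter(runes[i]) || unicode.IsDigit(runes[i])) {
			i++
		}
		if i == start {
			// Non-alphanumeric, non-space rune: emit it on its own.
			i++
		}
		tokens = append(tokens, string(runes[start:i]))
	}
	return tokens
}

func main() {
	fmt.Println(scanWords("let x1 = 42")) // [let x1 = 42]
}
```

A real Lexer in this package additionally tracks state (see SetInitialState), enforces MAX_DEPTH_DEFAULT, and reports malformed input through its ErrorManager rather than panicking.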