Documentation
Overview
Package lexer implements the lexical analysis stage of the FSPL compiler. Its job is to convert text into a series of tokens (lexemes), which are then passed to other parts of the compiler to be interpreted into more complex data structures. The lexer produces tokens on demand rather than reading them all in at once.
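The on-demand model described above can be sketched with a minimal, self-contained example. The names `tinyLexer` and `tinyToken` below are illustrative stand-ins, not part of this package; the point is only that each call pulls exactly one token from the input.

```go
package main

import (
	"fmt"
	"strings"
	"unicode"
)

// tinyToken is a simplified stand-in for a lexer token.
type tinyToken struct {
	Value string
}

// tinyLexer reads one token at a time from its input, mirroring the
// lazy, pull-based model described above: no token is produced until
// the consumer asks for it.
type tinyLexer struct {
	input []rune
	pos   int
}

// next returns the next whitespace-delimited word, or ok=false once
// the input is exhausted.
func (l *tinyLexer) next() (tok tinyToken, ok bool) {
	for l.pos < len(l.input) && unicode.IsSpace(l.input[l.pos]) {
		l.pos++
	}
	if l.pos >= len(l.input) {
		return tinyToken{}, false
	}
	start := l.pos
	for l.pos < len(l.input) && !unicode.IsSpace(l.input[l.pos]) {
		l.pos++
	}
	return tinyToken{Value: string(l.input[start:l.pos])}, true
}

func main() {
	lex := &tinyLexer{input: []rune("let x = 5")}
	var words []string
	for tok, ok := lex.next(); ok; tok, ok = lex.next() {
		words = append(words, tok.Value)
	}
	fmt.Println(strings.Join(words, ","))
}
```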
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
This section is empty.
Types
type Lexer
type Lexer interface {
	// Next returns the next token. If there are no more tokens, it returns
	// an EOF token. It only returns an error on EOF if the file terminated
	// unexpectedly.
	Next() (Token, error)
}
Lexer is an object capable of producing tokens.
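A typical consumer pulls tokens until it sees the EOF token, treating a non-nil error as an unexpectedly terminated file, per the contract documented for Next. The sketch below is self-contained: `sliceLexer` and `drain` are illustrative helpers, and the `Token`, `TokenKind`, and `Lexer` declarations are minimal stand-ins mirroring this package's types (in real use you would import the lexer package itself).

```go
package main

import "fmt"

// Minimal stand-ins mirroring this package's types.
type TokenKind int

const EOF TokenKind = -1

type Token struct {
	Kind  TokenKind
	Value string
}

type Lexer interface {
	Next() (Token, error)
}

// sliceLexer is an illustrative Lexer that replays a fixed token
// slice, returning an EOF token once it runs out.
type sliceLexer struct {
	toks []Token
	pos  int
}

func (l *sliceLexer) Next() (Token, error) {
	if l.pos >= len(l.toks) {
		return Token{Kind: EOF}, nil
	}
	tok := l.toks[l.pos]
	l.pos++
	return tok, nil
}

// drain pulls tokens one at a time until it sees the EOF token,
// following the contract documented for Next.
func drain(lex Lexer) ([]Token, error) {
	var out []Token
	for {
		tok, err := lex.Next()
		if err != nil {
			return nil, err // the file terminated unexpectedly
		}
		if tok.Kind == EOF {
			return out, nil
		}
		out = append(out, tok)
	}
}

func main() {
	lex := &sliceLexer{toks: []Token{{Value: "x"}, {Value: "y"}}}
	toks, _ := drain(lex)
	fmt.Println(len(toks))
}
```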
type Token
type Token struct {
	Position errors.Position // The position of the token in its file
	Kind     TokenKind       // Which kind of token it is
	Value    string          // The token's value
}
Token represents a single lexeme of an FSPL file.
type TokenKind
type TokenKind int
TokenKind is an enumeration of all tokens the FSPL compiler recognizes.
const (
	EOF TokenKind = -(iota + 1)

	// Name        Rough regex-ish description
	Ident       // [a-z][a-zA-Z0-9]*
	TypeIdent   // [A-Z][a-zA-Z0-9]*
	Int         // (0b|0x)?[0-9a-fA-F]+
	Float       // [0-9]*\.[0-9]+
	String      // \'.*\'
	Symbol      // [~!@#$%^&*-_=+\\|;,<>/?]+
	LParen      // \(
	LBrace      // \{
	LBracket    // \[
	RParen      // \)
	RBrace      // \}
	RBracket    // \]
	Colon       // :
	DoubleColon // ::
	Dot         // .
	DoubleDot   // ..
	Star        // \*
)
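For several of the kinds above, the first character of a lexeme already determines its kind: a lowercase letter begins an Ident, an uppercase letter a TypeIdent, and a digit an Int. The sketch below illustrates that distinction only; the `classify` helper is hypothetical and is not this package's actual scanner.

```go
package main

import (
	"fmt"
	"unicode"
)

// classify guesses a token kind name from a word's first rune,
// following the rough patterns listed above. It is an illustrative
// sketch covering only a few kinds, not the package's real lexer.
func classify(word string) string {
	if word == "" {
		return "EOF"
	}
	r := []rune(word)[0]
	switch {
	case unicode.IsLower(r):
		return "Ident" // [a-z][a-zA-Z0-9]*
	case unicode.IsUpper(r):
		return "TypeIdent" // [A-Z][a-zA-Z0-9]*
	case unicode.IsDigit(r):
		return "Int" // (0b|0x)?[0-9a-fA-F]+
	default:
		return "Symbol"
	}
}

func main() {
	fmt.Println(classify("token"), classify("Lexer"), classify("0x2F"), classify("+="))
}
```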