package lexer

v1.0.0
Published: Mar 27, 2020 License: BSD-3-Clause Imports: 9 Imported by: 1

Documentation

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

Lexer is used to build and store the token table

func MakeLexer

func MakeLexer(rules Rules) (*Lexer, error)

MakeLexer is the default constructor for Lexer
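
A minimal construction sketch, assuming the package is imported under its module path (not shown on this page) and that a rules file named rules.json exists on disk; MakeRules is documented further down.

package main

import (
	"log"

	"example.com/yourmodule/lexer" // placeholder import path, not given on this page
)

func main() {
	// MakeRules (documented below) returns *Rules; MakeLexer takes a Rules value.
	rules, err := lexer.MakeRules("rules.json") // hypothetical file name
	if err != nil {
		log.Fatal(err)
	}
	l, err := lexer.MakeLexer(*rules)
	if err != nil {
		log.Fatal(err)
	}
	_ = l // ready to call Tokenize
}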

func (*Lexer) Reset

func (l *Lexer) Reset()

Reset readies the lexer for tokenizing
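
This page only says that Reset readies the lexer, so the sketch below assumes it should be called before running the same Lexer over another input; it needs the io package plus this package, and Tokenize is documented next.

// Sketch: reuse one Lexer across several inputs.
// Assumption: Reset clears per-run state between calls to Tokenize.
func tokenizeAll(l *lexer.Lexer, inputs []io.Reader) ([]*lexer.TokenTable, error) {
	tables := make([]*lexer.TokenTable, 0, len(inputs))
	for _, in := range inputs {
		l.Reset() // ready the lexer for the next input
		tt, err := l.Tokenize(in)
		if err != nil {
			return nil, err
		}
		tables = append(tables, tt)
	}
	return tables, nil
}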

func (*Lexer) Tokenize

func (l *Lexer) Tokenize(in io.Reader) (*TokenTable, error)

Tokenize tokenizes all the input in the io.Reader
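
A single-input sketch using strings.NewReader as the io.Reader, with l built as in the MakeLexer example above; besides this package it needs only fmt and strings from the standard library.

// Sketch: tokenize an in-memory source and print the resulting table.
func tokenizeString(l *lexer.Lexer, src string) (*lexer.TokenTable, error) {
	tt, err := l.Tokenize(strings.NewReader(src))
	if err != nil {
		return nil, err
	}
	fmt.Println(tt) // *TokenTable has a String method, so fmt prints it directly
	return tt, nil
}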

type Rules

type Rules struct {
	Nodes        map[string]automaton.PreNode `json:"nodes"`
	TokenStrings []automaton.Token            `json:"tokens"`
	Tokens       map[automaton.Token]bool
}

Rules is used by the tokenizer to build the token table

func MakeRules

func MakeRules(filename string) (*Rules, error)

MakeRules loads rules from a JSON file to a Rules struct
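
A loading sketch; the only structure of the JSON file confirmed by this page is the top-level "nodes" and "tokens" keys implied by the Rules struct tags above, so the path handling below is an assumption and the contents of each node are defined by automaton.PreNode rather than shown here.

// Sketch: load a rules file and report what was read (needs fmt and this package).
func loadRules(path string) (*lexer.Rules, error) {
	rules, err := lexer.MakeRules(path)
	if err != nil {
		return nil, fmt.Errorf("loading rules: %w", err)
	}
	fmt.Printf("loaded %d nodes and %d token strings\n", len(rules.Nodes), len(rules.TokenStrings))
	return rules, nil
}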

type TokenTable

type TokenTable struct {
	Tokens   []automaton.Token
	Values   []string
	Lines    []uint
	LinePosI []uint
	LinePosE []uint
}

TokenTable stores information about tokens generated by Lexer
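
The slices appear to be parallel, one entry per token; the sketch below assumes that, and also assumes LinePosI and LinePosE hold a token's start and end positions within its line, which this page does not spell out.

// Sketch: walk the table under the assumptions stated above.
func dumpTokens(tt *lexer.TokenTable) {
	for i := range tt.Tokens {
		fmt.Printf("%v %q line %d pos %d-%d\n",
			tt.Tokens[i], tt.Values[i], tt.Lines[i], tt.LinePosI[i], tt.LinePosE[i])
	}
}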

func (*TokenTable) String

func (tt *TokenTable) String() string

String converts the TokenTable to a string

type Tokenizer

type Tokenizer struct {
	Nodes map[string]*automaton.Node
}

Tokenizer is the main struct used to iterate the source and assign Tokens

func (*Tokenizer) LoadRules

func (t *Tokenizer) LoadRules(rules Rules) error

LoadRules transforms PreNodes into Nodes given a set of rules
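
Tokenizer is exported alongside Lexer, so it can presumably be driven directly; a hedged sketch of loading rules into one, assuming LoadRules populates the Nodes map (how those nodes are then used to scan input is not described on this page).

// Sketch: build a Tokenizer from rules loaded with MakeRules.
func newTokenizer(path string) (*lexer.Tokenizer, error) {
	rules, err := lexer.MakeRules(path)
	if err != nil {
		return nil, err
	}
	var t lexer.Tokenizer
	if err := t.LoadRules(*rules); err != nil { // transforms PreNodes into Nodes
		return nil, err
	}
	return &t, nil
}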
