tokenizer

package
v0.0.0-...-6f0ee80
Warning: This package is not in the latest version of its module.

Published: May 25, 2016 License: MPL-2.0 Imports: 5 Imported by: 0

Documentation

Index

Constants

const (
	LPAREN = iota
	RPAREN
	LBRACE
	RBRACE
	COMMA
	DOT
	EQUAL
	PLUS
	STRING
	REGEXP
	POS
	GVAR
	LVAR
	KWD
	ID
	NAMESPACE
	OPEN
	FUNC
	TYPE
	PATH
	IMPORT
	OPTIONAL
	READ
	EOF
	ERROR
	NUM_LEXEMES
)

Variables

var LexemeName [NUM_LEXEMES]string
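The iota-based constants above double as indices into `LexemeName`, giving each tag a printable name for error messages and debugging. A minimal self-contained sketch of that pairing (how the package actually populates the array is not shown in these docs, so the entries here are illustrative):

```go
package main

import "fmt"

// A few of the lexeme tags, mirroring the package's iota-based constants.
const (
	LPAREN = iota
	RPAREN
	LBRACE
	RBRACE
	EOF
	ERROR
	NUM_LEXEMES
)

// LexemeName maps each tag to a printable name, as the package's
// [NUM_LEXEMES]string array presumably does.
var LexemeName = [NUM_LEXEMES]string{
	LPAREN: "LPAREN",
	RPAREN: "RPAREN",
	LBRACE: "LBRACE",
	RBRACE: "RBRACE",
	EOF:    "EOF",
	ERROR:  "ERROR",
}

func main() {
	fmt.Println(LexemeName[LBRACE]) // LBRACE
}
```

Because `NUM_LEXEMES` is the last constant in the iota block, the array is always sized to exactly fit every tag, and adding a tag before `NUM_LEXEMES` grows the array automatically.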

Functions

This section is empty.

Types

type Lexeme

type Lexeme int

Type tags so we know what kind of token we have.

type Token

type Token struct {
	Lexeme
	Value      string
	ExtraValue string
	LineNumber int32
}

A token has a type (aka lexeme), a value, and a line number.
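Because the `Lexeme` field is embedded, a `Token` can be compared against a tag directly via `tok.Lexeme`. A self-contained sketch of constructing and printing a token (the output format of the real `Inspect` method is not documented, so the one below is a plausible stand-in):

```go
package main

import "fmt"

type Lexeme int

const (
	STRING Lexeme = iota
	ID
)

var LexemeName = map[Lexeme]string{STRING: "STRING", ID: "ID"}

// Token mirrors the package's struct: the embedded Lexeme tags the
// kind of token, Value holds the matched text, and LineNumber records
// where it was read.
type Token struct {
	Lexeme
	Value      string
	ExtraValue string
	LineNumber int32
}

// Inspect returns a human-readable description of the token.
// The real method's format is not documented; this is illustrative.
func (t *Token) Inspect() string {
	return fmt.Sprintf("%s %q (line %d)", LexemeName[t.Lexeme], t.Value, t.LineNumber)
}

func main() {
	tok := &Token{Lexeme: ID, Value: "foo", LineNumber: 3}
	fmt.Println(tok.Inspect()) // ID "foo" (line 3)
}
```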

func (*Token) Inspect

func (t *Token) Inspect() string

type Tokenizer

type Tokenizer struct {
	Source     []byte
	LineNumber int32
	Lookahead  *Token
	// contains filtered or unexported fields
}

A tokenizer is represented by a struct containing the remaining source text and the current line number. This is easier than using a stateless tokenizing function that returns them as extra values and requires the parser to keep track of them.

func MakeTokenizer

func MakeTokenizer(src []byte) *Tokenizer

func (*Tokenizer) Peek

func (t *Tokenizer) Peek() *Token

func (*Tokenizer) Pop

func (t *Tokenizer) Pop() *Token
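Together, `Peek` and `Pop` give the parser one-token lookahead: `Peek` fills the `Lookahead` buffer without consuming, and `Pop` drains it. A minimal self-contained sketch of that protocol, assuming a toy scanner that only recognizes parentheses and newlines (the real scanner handles many more lexemes):

```go
package main

import "fmt"

type Lexeme int

const (
	LPAREN Lexeme = iota
	RPAREN
	EOF
	ERROR
)

type Token struct {
	Lexeme
	Value      string
	LineNumber int32
}

// Tokenizer mirrors the package's struct: remaining source, current
// line number, and a one-token Lookahead buffer.
type Tokenizer struct {
	Source     []byte
	LineNumber int32
	Lookahead  *Token
}

func MakeTokenizer(src []byte) *Tokenizer {
	return &Tokenizer{Source: src, LineNumber: 1}
}

// next scans one token, skipping whitespace and counting newlines.
func (t *Tokenizer) next() *Token {
	for len(t.Source) > 0 && (t.Source[0] == ' ' || t.Source[0] == '\n') {
		if t.Source[0] == '\n' {
			t.LineNumber++
		}
		t.Source = t.Source[1:]
	}
	if len(t.Source) == 0 {
		return &Token{Lexeme: EOF, LineNumber: t.LineNumber}
	}
	c := t.Source[0]
	t.Source = t.Source[1:]
	switch c {
	case '(':
		return &Token{Lexeme: LPAREN, Value: "(", LineNumber: t.LineNumber}
	case ')':
		return &Token{Lexeme: RPAREN, Value: ")", LineNumber: t.LineNumber}
	}
	return &Token{Lexeme: ERROR, Value: string(c), LineNumber: t.LineNumber}
}

// Peek returns the upcoming token without consuming it.
func (t *Tokenizer) Peek() *Token {
	if t.Lookahead == nil {
		t.Lookahead = t.next()
	}
	return t.Lookahead
}

// Pop consumes and returns the upcoming token.
func (t *Tokenizer) Pop() *Token {
	tok := t.Peek()
	t.Lookahead = nil
	return tok
}

func main() {
	tz := MakeTokenizer([]byte("()"))
	fmt.Println(tz.Peek().Value) // (
	fmt.Println(tz.Pop().Value)  // (
	fmt.Println(tz.Pop().Value)  // )
}
```

Note that repeated calls to `Peek` are idempotent: the token is scanned once into `Lookahead` and reused until `Pop` clears the buffer.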
