tokenizer

package
v0.0.0-...-b7fb57f
Published: Jun 19, 2025 License: MIT Imports: 7 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func IsLeadingIdentifierCharacter

func IsLeadingIdentifierCharacter(ch rune) bool
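
IsLeadingIdentifierCharacter has no doc comment on this page. As an illustration only, a predicate with this name commonly accepts letters and the underscore; the rules this package actually applies may differ.

// Sketch only: a common shape for a leading-identifier-character check.
// The real IsLeadingIdentifierCharacter may use different rules.
// Requires the "unicode" import.
func isLeadingIdentifierCharacterSketch(ch rune) bool {
	return unicode.IsLetter(ch) || ch == '_'
}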

Types

type Tokenizer

type Tokenizer struct {
	// contains filtered or unexported fields
}

Tokenizer represents a lexical scanner.

It implements its own internal rune buffer in order to support both read() and unread(). bufio.Reader's UnreadRune() can't be used here because it can unread only a single rune, and Peek() clears the rune buffer.
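
The Tokenizer's fields are unexported, so the actual buffer isn't shown here. The following is a minimal self-contained sketch of the technique the paragraph above describes (a multi-rune pushback buffer layered over bufio.Reader), not this package's implementation; the names and the eof sentinel are assumptions.

package sketch

import (
	"bufio"
	"strings"
)

// eof is a sentinel rune returned at end of input (an assumption made for
// this sketch; the real package may signal end of input differently).
const eof = rune(0)

// runeScanner layers a multi-rune pushback buffer on top of bufio.Reader,
// which is the technique the Tokenizer doc comment describes.
type runeScanner struct {
	r   *bufio.Reader
	buf []rune // runes pushed back by unread, consumed LIFO by read
	pos int    // number of runes consumed so far
}

func newRuneScanner(src string) *runeScanner {
	return &runeScanner{r: bufio.NewReader(strings.NewReader(src))}
}

// read returns the next rune, preferring any previously unread runes.
func (s *runeScanner) read() rune {
	if n := len(s.buf); n > 0 {
		ch := s.buf[n-1]
		s.buf = s.buf[:n-1]
		s.pos++
		return ch
	}
	ch, _, err := s.r.ReadRune()
	if err != nil {
		return eof
	}
	s.pos++
	return ch
}

// unread pushes ch back so the next read returns it again; unlike
// bufio.Reader.UnreadRune, it can be called more than once in a row.
func (s *runeScanner) unread(ch rune) {
	s.buf = append(s.buf, ch)
	s.pos--
}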

func New

func New(src string) *Tokenizer

New returns a new lexer instance.

func (*Tokenizer) Pos

func (t *Tokenizer) Pos() int

func (*Tokenizer) Scan

func (t *Tokenizer) Scan() (ast.Token, string, int)

Scan returns the next token and literal value.
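
A minimal usage sketch, written as if inside the package so no import path is needed (the module path is not shown on this page). It also needs the "fmt" import, and the stopping condition is an assumption: the module's ast package presumably exposes some end-of-input token, written ast.EOF below purely for illustration.

// scanAll is a usage sketch: it drains the tokenizer and prints each token,
// its literal, and the returned int (likely a position, given Pos above).
// ast.EOF is a hypothetical end-of-input token; check the ast package.
func scanAll(src string) {
	t := New(src)
	for {
		tok, lit, n := t.Scan()
		fmt.Printf("%v %q %d\n", tok, lit, n)
		if tok == ast.EOF { // assumption: some terminal token exists
			break
		}
	}
}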
