syntax

package module
v0.2.1
Published: Mar 31, 2026 License: MIT Imports: 8 Imported by: 0

README

syntax

General tokenizer and parser, written in Go

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

func NewLexer

func NewLexer(path string) (*Lexer, error)

NewLexer creates a new Lexer with token types and patterns loaded from the file at the given path.

func NewLexerFrom added in v0.2.1

func NewLexerFrom(text string) (*Lexer, error)

NewLexerFrom creates a new Lexer with token types and patterns read from the given text.

func (*Lexer) Tokenize

func (lexer *Lexer) Tokenize(text string, ignoreTypes []string) ([]Token, error)

Tokenize tokenizes the given string and returns the resulting list of Tokens. A list of TokenTypes to ignore may be passed (pass nil to ignore nothing).
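The grammar file that NewLexer reads is not documented here, so the following is only a standalone illustration of what a pattern-driven Tokenize with an ignore list does, with made-up token types — not this package's implementation:

```go
package main

import (
	"fmt"
	"regexp"
)

// token mirrors the shape of this package's Token type (illustrative only).
type token struct {
	Type string
	Text string
}

// tokenize scans text against an ordered list of (type, pattern) rules,
// dropping any matches whose type appears in ignore — analogous to
// Tokenize's ignoreTypes parameter.
func tokenize(text string, ignore []string) []token {
	specs := []struct{ typ, pat string }{
		{"NUMBER", `^[0-9]+`},
		{"IDENT", `^[A-Za-z_][A-Za-z0-9_]*`},
		{"SPACE", `^\s+`},
	}
	skip := map[string]bool{}
	for _, t := range ignore {
		skip[t] = true
	}
	var out []token
	for len(text) > 0 {
		matched := false
		for _, s := range specs {
			if m := regexp.MustCompile(s.pat).FindString(text); m != "" {
				if !skip[s.typ] {
					out = append(out, token{s.typ, m})
				}
				text = text[len(m):]
				matched = true
				break
			}
		}
		if !matched { // no rule matched: skip one byte
			text = text[1:]
		}
	}
	return out
}

func main() {
	for _, t := range tokenize("x1 42", []string{"SPACE"}) {
		fmt.Printf("%s:%s\n", t.Type, t.Text)
	}
}
```

Running this prints `IDENT:x1` and `NUMBER:42`; the SPACE token is consumed but omitted because its type is in the ignore list.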

type Parser

type Parser struct {
	// contains filtered or unexported fields
}

func NewParser

func NewParser(path string) (*Parser, error)

NewParser creates a new Parser from the file at the given path.

func NewParserFrom added in v0.2.1

func NewParserFrom(text string) (*Parser, error)

NewParserFrom creates a new Parser from the given text.

func (*Parser) CheckSyntax

func (parser *Parser) CheckSyntax(text string, ignoreTypes []string) error

CheckSyntax checks the syntax of the given text, returning a non-nil error if it is invalid. As with Tokenize, a list of TokenTypes to ignore may be passed (pass nil to ignore nothing).

type Sentence

type Sentence = []string

type Terminal

type Terminal = string

type Token

type Token struct {
	Type     string
	Text     string
	Row      int
	ColStart int
	ColEnd   int
}

func (Token) Coords

func (t Token) Coords() string

Coords returns the token's position, in the form (Row: ColStart-ColEnd), as a string.

func (Token) String

func (t Token) String() string

String returns the string representation of the Token.

type Variable

type Variable = string
