machines

package
Version: v0.2.2
Published: Feb 27, 2018 License: BSD-3-Clause Imports: 4 Imported by: 19

Documentation

Overview

Package machines implements the lexing algorithms.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type DFAAccepting

type DFAAccepting map[int]int

DFAAccepting maps accepting DFA states to match identifiers. It both identifies which states are accepting states and records which AST matches those states belong to.

type DFATrans

type DFATrans [][256]int

DFATrans represents a Deterministic Finite Automaton's state transition table. Each state is a row of 256 entries, one per possible input byte.

type EmptyMatchError

type EmptyMatchError struct {
	TC      int
	Line    int
	Column  int
	MatchID int
}

EmptyMatchError is returned when a pattern would have matched the empty string.

func (*EmptyMatchError) Error

func (e *EmptyMatchError) Error() string

type Match

type Match struct {
	PC          int
	TC          int
	StartLine   int
	StartColumn int
	EndLine     int
	EndColumn   int
	Bytes       []byte // the actual bytes matched during scanning.
}

A Match represents the positional and textual information from a match.

func (*Match) Equals

func (m *Match) Equals(other *Match) bool

Equals checks two matches for equality.

func (Match) String

func (m Match) String() string

String formats the match for humans.

type Scanner

type Scanner func(int) (int, *Match, error, Scanner)

Scanner is a functional iterator returned by the LexerEngine. See http://hackthology.com/functional-iteration-in-go.html

func DFALexerEngine

func DFALexerEngine(startState, errorState int, trans DFATrans, accepting DFAAccepting, text []byte) Scanner

DFALexerEngine does the actual tokenization of the byte slice text using the DFA state machine. If the lexing process fails the Scanner will return an UnconsumedInput error.

func LexerEngine

func LexerEngine(program inst.Slice, text []byte) Scanner

LexerEngine does the actual tokenization of the byte slice text using the NFA bytecode in program. If the lexing process fails the Scanner will return an UnconsumedInput error.

type UnconsumedInput

type UnconsumedInput struct {
	StartTC     int
	FailTC      int
	StartLine   int
	StartColumn int
	FailLine    int
	FailColumn  int
	Text        []byte
}

UnconsumedInput is the error returned when the lexing process fails to consume all of the input text.

func (*UnconsumedInput) Error

func (u *UnconsumedInput) Error() string

Error implements the error interface
