lexer

package v2.0.2+incompatible
Published: Mar 22, 2018 License: MIT Imports: 5 Imported by: 0

Documentation

Overview

Package lexer provides a handlebars tokenizer.

Example

source := "You know {{nothing}} John Snow"

output := ""

lex := Scan(source)
for {
	// consume the next token
	token := lex.NextToken()

	output += fmt.Sprintf(" %s", token)

	// stop when all tokens have been consumed, or on error
	if token.Kind == TokenEOF || token.Kind == TokenError {
		break
	}
}

fmt.Print(output)
Output:

Content{"You know "} Open{"{{"} ID{"nothing"} Close{"}}"} Content{" John Snow"} EOF

Index

Examples

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

Lexer is a lexical analyzer.

func Scan

func Scan(input string) *Lexer

Scan scans the given input.

Tokens can then be fetched sequentially using the NextToken() function on the returned lexer.

func (*Lexer) NextToken

func (l *Lexer) NextToken() Token

NextToken returns the next scanned token.

type Token

type Token struct {
	Kind TokenKind // Token kind
	Val  string    // Token value

	Pos  int // Byte position in input string
	Line int // Line number in input string
}

Token represents a scanned token.

func Collect

func Collect(input string) []Token

Collect scans and collects all tokens.

This should be used for debugging purposes only. Prefer the Scan() and Lexer.NextToken() functions instead.

func (Token) String

func (t Token) String() string

String returns the token string representation for debugging.

type TokenKind

type TokenKind int

TokenKind represents a Token type.

const (
	// TokenError represents an error
	TokenError TokenKind = iota

	// TokenEOF represents an End Of File
	TokenEOF

	// TokenOpen is the OPEN token
	TokenOpen

	// TokenClose is the CLOSE token
	TokenClose

	// TokenOpenRawBlock is the OPEN_RAW_BLOCK token
	TokenOpenRawBlock

	// TokenCloseRawBlock is the CLOSE_RAW_BLOCK token
	TokenCloseRawBlock

	// TokenOpenEndRawBlock is the END_RAW_BLOCK token
	TokenOpenEndRawBlock

	// TokenOpenUnescaped is the OPEN_UNESCAPED token
	TokenOpenUnescaped

	// TokenCloseUnescaped is the CLOSE_UNESCAPED token
	TokenCloseUnescaped

	// TokenOpenBlock is the OPEN_BLOCK token
	TokenOpenBlock

	// TokenOpenEndBlock is the OPEN_ENDBLOCK token
	TokenOpenEndBlock

	// TokenInverse is the INVERSE token
	TokenInverse

	// TokenOpenInverse is the OPEN_INVERSE token
	TokenOpenInverse

	// TokenOpenInverseChain is the OPEN_INVERSE_CHAIN token
	TokenOpenInverseChain

	// TokenOpenPartial is the OPEN_PARTIAL token
	TokenOpenPartial

	// TokenComment is the COMMENT token
	TokenComment

	// TokenOpenSexpr is the OPEN_SEXPR token
	TokenOpenSexpr

	// TokenCloseSexpr is the CLOSE_SEXPR token
	TokenCloseSexpr

	// TokenEquals is the EQUALS token
	TokenEquals

	// TokenData is the DATA token
	TokenData

	// TokenSep is the SEP token
	TokenSep

	// TokenOpenBlockParams is the OPEN_BLOCK_PARAMS token
	TokenOpenBlockParams

	// TokenCloseBlockParams is the CLOSE_BLOCK_PARAMS token
	TokenCloseBlockParams

	// TokenContent is the CONTENT token
	TokenContent

	// TokenID is the ID token
	TokenID

	// TokenString is the STRING token
	TokenString

	// TokenNumber is the NUMBER token
	TokenNumber

	// TokenBoolean is the BOOLEAN token
	TokenBoolean
)

func (TokenKind) String

func (k TokenKind) String() string

String returns the token kind string representation for debugging.
