Documentation ¶
Index ¶
type TokenStream
	func NewTokenStream(input []byte) *TokenStream
	func (s *TokenStream) ConcatUntil(anyOf ...TokenType) string
	func (s *TokenStream) Eat(typ TokenType) error
	func (s *TokenStream) Next() Token
	func (s *TokenStream) Peek() Token
	func (s *TokenStream) Raw() *TokenStream
type TokenType
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type TokenStream ¶
type TokenStream struct {
// contains filtered or unexported fields
}
TokenStream is a read-only queue of Token values. The next Token in the stream can be consumed by calling Next(). The next token can be inspected without being consumed by calling Peek(). By default, TokenStream discards whitespace characters as though they are not part of the stream. Use Raw to get a TokenStream that respects whitespace.
func NewTokenStream ¶
func NewTokenStream(input []byte) *TokenStream
NewTokenStream creates a stream of Token values read from input.
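The following sketch is not part of the package's documentation; it shows typical construction and draining of a stream. It assumes a hypothetical import path and package name (lexer), and that Token exposes Type and Value fields — check the Token documentation for the actual layout.

package main

import (
	"fmt"

	lexer "example.com/lexer" // hypothetical import path; substitute the real one
)

func main() {
	// Build a stream over some input and consume it token by token.
	s := lexer.NewTokenStream([]byte("# 1. Heading"))
	for {
		tok := s.Next()
		// Assumes Token has a Type field (TokenType) and a Value field (string).
		if tok.Type == lexer.TypeEOF {
			break
		}
		fmt.Printf("%s %q\n", tok.Type, tok.Value)
	}
}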
func (*TokenStream) ConcatUntil ¶
func (s *TokenStream) ConcatUntil(anyOf ...TokenType) string
ConcatUntil concatenates the values of successive tokens in the stream until it reaches a token whose type is among anyOf. TypeEOF is implied and need not be specified. It returns the concatenated output with leading and trailing whitespace trimmed.
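A sketch of how ConcatUntil might be used to pull the text of a '#'-prefixed heading line, under the same hypothetical import as above:

package main

import (
	"fmt"

	lexer "example.com/lexer" // hypothetical import path
)

func main() {
	s := lexer.NewTokenStream([]byte("# A short title\nbody text"))
	// Skip the leading '#', then collect everything up to the newline.
	if err := s.Eat(lexer.TypePound); err != nil {
		fmt.Println("expected '#':", err)
		return
	}
	title := s.ConcatUntil(lexer.TypeNewline)
	fmt.Println(title) // outer spaces are trimmed, so this prints "A short title"
}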
func (*TokenStream) Eat ¶
func (s *TokenStream) Eat(typ TokenType) error
Eat consumes the next token from the stream if and only if its type matches typ. If the types are different, an error is returned.
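Eat is convenient for asserting the shape of the input before consuming further. A sketch, again using the hypothetical lexer import, that expects a numbered item such as "1.":

package main

import (
	"fmt"

	lexer "example.com/lexer" // hypothetical import path
)

func main() {
	s := lexer.NewTokenStream([]byte("1. first item"))
	// Require the "number dot" prefix of an ordered-list item.
	if err := s.Eat(lexer.TypeNumber); err != nil {
		fmt.Println("not a numbered item:", err)
		return
	}
	if err := s.Eat(lexer.TypeDot); err != nil {
		fmt.Println("missing '.':", err)
		return
	}
	fmt.Println(s.ConcatUntil(lexer.TypeNewline)) // "first item"
}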
func (*TokenStream) Next ¶
func (s *TokenStream) Next() Token
Next consumes the next token in the stream.
func (*TokenStream) Peek ¶
func (s *TokenStream) Peek() Token
Peek returns a read-only copy of the next token in the stream, without consuming it.
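Peek pairs naturally with Next when a decision has to be made before anything is consumed. A sketch under the same assumptions (hypothetical import, Token exposing a Type field):

package main

import (
	"fmt"

	lexer "example.com/lexer" // hypothetical import path
)

func main() {
	s := lexer.NewTokenStream([]byte("# Heading\nplain text"))
	// Inspect the first token without consuming it, then branch.
	if s.Peek().Type == lexer.TypePound { // assumes Token has a Type field
		s.Next() // now actually consume the '#'
		fmt.Println("heading:", s.ConcatUntil(lexer.TypeNewline))
	} else {
		fmt.Println("paragraph:", s.ConcatUntil(lexer.TypeNewline))
	}
}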
func (*TokenStream) Raw ¶
func (s *TokenStream) Raw() *TokenStream
Raw returns a TokenStream that includes whitespace characters.
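Raw matters when spacing is significant, for example when indentation carries meaning. A sketch under the same assumptions; whether Raw returns a view of the same underlying stream or an independent one is not stated here, so treat this as illustrative only.

package main

import (
	"fmt"

	lexer "example.com/lexer" // hypothetical import path
)

func main() {
	raw := lexer.NewTokenStream([]byte("a b")).Raw()
	for {
		tok := raw.Next()
		if tok.Type == lexer.TypeEOF { // assumes Token has a Type field
			break
		}
		fmt.Println(tok.Type) // SPACE tokens now appear instead of being discarded
	}
}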
type TokenType ¶
type TokenType string
TokenType describes the category a Token belongs to.
const (
	TypePound   TokenType = "POUND"   // '#'
	TypeNumber  TokenType = "NUMBER"  // A number
	TypeText    TokenType = "TEXT"    // Catch-all type
	TypeDot     TokenType = "DOT"     // '.'
	TypeNewline TokenType = "NEWLINE" // '\n'
	TypeEOF     TokenType = "EOF"     // Pseudo token to signal the end of input.
	TypeSpace   TokenType = "SPACE"   // A whitespace character
	TypeDash    TokenType = "DASH"    // '-'
)
TokenType values recognized by the lexer.
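Because TokenType is a plain string type, these constants are easy to dispatch on. A brief sketch (same hypothetical import and Token field assumptions) of routing tokens by type:

package main

import (
	"fmt"

	lexer "example.com/lexer" // hypothetical import path
)

func main() {
	s := lexer.NewTokenStream([]byte("# 1. mixed input - end"))
	for {
		tok := s.Next()
		switch tok.Type { // assumes Token has a Type field
		case lexer.TypeEOF:
			return
		case lexer.TypePound, lexer.TypeDash, lexer.TypeDot:
			fmt.Println("punctuation:", tok.Type)
		case lexer.TypeNumber:
			fmt.Println("number:", tok.Value) // assumes Token has a Value field
		default:
			fmt.Println("other:", tok.Type)
		}
	}
}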