Documentation
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
Types
type StateStack
type StateStack []State
type TokenCollection
type TokenCollection []Token
Since we only care about text content, tokenized results are represented as a flat collection.
func (*TokenCollection) Stringify
func (t *TokenCollection) Stringify() (string, error)
Used for testing purposes.
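Since the package's Token and State types keep their fields unexported, the sketch below uses hypothetical stand-in shapes (a Token with a Text field) purely to illustrate how a flat collection's Stringify could concatenate text content into one string, matching the documented `(string, error)` signature.

```go
package main

import (
	"fmt"
	"strings"
)

// State and Token are illustrative stand-ins; the real package
// hides these details, so the Text field here is an assumption.
type State int

type Token struct {
	Text string
}

type TokenCollection []Token

// Stringify joins the text content of every token into a single
// string. The error return mirrors the documented signature; this
// sketch never fails, though a real implementation might.
func (t *TokenCollection) Stringify() (string, error) {
	var b strings.Builder
	for _, tok := range *t {
		b.WriteString(tok.Text)
	}
	return b.String(), nil
}

func main() {
	tc := TokenCollection{{Text: "hello"}, {Text: " "}, {Text: "world"}}
	s, _ := tc.Stringify()
	fmt.Println(s) // prints "hello world"
}
```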
type TokenGrammar
var TEXT_TOKEN TokenGrammar = TokenGrammar{Token: text_token, State: text}
type Tokenizer
type Tokenizer struct {
// contains filtered or unexported fields
}
func NewTokenizer
func (*Tokenizer) GetTokenType
func (t *Tokenizer) GetTokenType() TokenGrammar
func (*Tokenizer) Tokenize
func (t *Tokenizer) Tokenize(options TokenizerOptions) *Tokenizer
FIXME: this can be cleaned up; make it leaner and easier to extend. The improvements were started with the token-type func; keep going.
type TokenizerOptions
type TokenizerOptions struct {
// contains filtered or unexported fields
}