Documentation
Overview
This file is generated by gen_tokenizers.go. DO NOT EDIT.
Index

Constants
const (
	LanguagePlaintext = Language(iota)
	LanguageJson
	LanguageYaml
	LanguageGo
	LanguageGitCommit
	LanguageGitRebase
	LanguageDevlog
)
Variables
var AllLanguages = []Language{
	LanguagePlaintext,
	LanguageJson,
	LanguageYaml,
	LanguageGo,
	LanguageGitCommit,
	LanguageGitRebase,
	LanguageDevlog,
}
var DevlogTokenizer *parser.Tokenizer
var GitCommitTokenizer *parser.Tokenizer
var GitRebaseTokenizer *parser.Tokenizer
var GoTokenizer *parser.Tokenizer
var JsonTokenizer *parser.Tokenizer
var YamlTokenizer *parser.Tokenizer
Functions

func TokenizeString
TokenizeString tokenizes a string based on the specified language.
func TokenizerForLanguage
TokenizerForLanguage returns a tokenizer for the specified language. If no tokenizer is available (e.g. for LanguagePlaintext) this returns nil.
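Because the lookup returns nil for languages without a tokenizer, callers need a nil check before use. The sketch below illustrates that pattern with stand-in types; the `Tokenizer` struct and `tokenizerForLanguage` function here are illustrative stand-ins, not this package's actual implementation (which returns a `*parser.Tokenizer`).

```go
package main

import "fmt"

// Tokenizer is a minimal stand-in for parser.Tokenizer.
type Tokenizer struct{ name string }

// Language mirrors the package's enum of parseable languages.
type Language int

const (
	LanguagePlaintext Language = iota
	LanguageJson
	LanguageGo
)

var (
	jsonTokenizer = &Tokenizer{name: "json"}
	goTokenizer   = &Tokenizer{name: "go"}
)

// tokenizerForLanguage sketches the documented contract of
// TokenizerForLanguage: return a tokenizer for the language, or nil
// when none is available (e.g. plaintext).
func tokenizerForLanguage(language Language) *Tokenizer {
	switch language {
	case LanguageJson:
		return jsonTokenizer
	case LanguageGo:
		return goTokenizer
	default:
		return nil
	}
}

func main() {
	// Callers must handle the nil case before using the tokenizer.
	if t := tokenizerForLanguage(LanguagePlaintext); t == nil {
		fmt.Println("plaintext: no tokenizer, skip highlighting")
	}
	if t := tokenizerForLanguage(LanguageGo); t != nil {
		fmt.Println("using tokenizer:", t.name)
	}
}
```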
Types

type Language
type Language int
Language is an enum of available languages that we can parse.
func LanguageFromString
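A minimal sketch of how an enum of this shape pairs with a name lookup. The string names ("json", "go", ...) and the plaintext fallback are assumptions for illustration; the package's actual LanguageFromString signature and accepted names may differ.

```go
package main

import "fmt"

// Language is an enum of parseable languages, declared with iota as in
// the package's const block.
type Language int

const (
	LanguagePlaintext = Language(iota)
	LanguageJson
	LanguageYaml
	LanguageGo
)

// languageFromString maps a name to a Language, falling back to
// plaintext for unrecognized input. The names here are assumed, not
// taken from the package.
func languageFromString(s string) Language {
	switch s {
	case "json":
		return LanguageJson
	case "yaml":
		return LanguageYaml
	case "go":
		return LanguageGo
	default:
		return LanguagePlaintext
	}
}

func main() {
	fmt.Println(languageFromString("go") == LanguageGo)
	fmt.Println(languageFromString("unknown") == LanguagePlaintext)
}
```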
type TokenWithText
TokenWithText is a token that includes its text value.
func ParseTokensWithText
func ParseTokensWithText(language Language, s string) ([]TokenWithText, error)
ParseTokensWithText tokenizes the input string using the specified language. This is useful for testing tokenizer rules.
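To make the distinction concrete: a plain token typically records only positions, while TokenWithText also carries the matched substring, which is what makes it convenient for asserting on tokenizer rules in tests. The sketch below uses stand-in `Token` and `TokenWithText` types with assumed fields (positions plus a role); the real parser types may differ.

```go
package main

import "fmt"

// Token is an illustrative stand-in: byte offsets into the input plus a
// role identifying the kind of token.
type Token struct {
	StartPos, EndPos int
	Role             int
}

// TokenWithText pairs a token's role with the text it matched.
type TokenWithText struct {
	Role int
	Text string
}

// withText attaches each token's substring of the input. This is the
// kind of transformation ParseTokensWithText performs after tokenizing;
// the fields and helper here are assumptions, not the package's API.
func withText(s string, tokens []Token) []TokenWithText {
	result := make([]TokenWithText, 0, len(tokens))
	for _, t := range tokens {
		result = append(result, TokenWithText{
			Role: t.Role,
			Text: s[t.StartPos:t.EndPos],
		})
	}
	return result
}

func main() {
	s := `{"key": 1}`
	// Hand-written positions standing in for a tokenizer's output.
	tokens := []Token{
		{StartPos: 1, EndPos: 6, Role: 1}, // the quoted key
		{StartPos: 8, EndPos: 9, Role: 2}, // the number
	}
	for _, t := range withText(s, tokens) {
		fmt.Printf("role=%d text=%q\n", t.Role, t.Text)
	}
}
```

In a test, asserting on the `Text` field of each result makes failures readable, which is the stated purpose of ParseTokensWithText.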