css_lexer

package
v0.15.1 (not the latest version of its module)
Published: Aug 10, 2022 License: MIT Imports: 4 Imported by: 0

Documentation

Constants

This section is empty.

Variables

This section is empty.

Functions

func IsNameContinue added in v0.7.20

func IsNameContinue(c rune) bool

func IsNameStart added in v0.7.20

func IsNameStart(c rune) bool

func WouldStartIdentifierWithoutEscapes added in v0.7.20

func WouldStartIdentifierWithoutEscapes(text string) bool
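
These character-level helpers are handy when deciding whether text can be written as a CSS identifier without escaping anything. Below is a minimal sketch under two assumptions: css_lexer is an internal esbuild package, so the import only resolves inside the esbuild module tree, and the helper isValidUnescapedIdent is hypothetical, not part of this package.

package main

import (
	"fmt"

	"github.com/evanw/esbuild/internal/css_lexer"
)

// isValidUnescapedIdent reports whether name could be emitted as a CSS
// identifier without escapes: it must start like an identifier and every
// rune must be a valid name-continue character. This is a simplified check
// and does not cover custom-property names or other edge cases.
func isValidUnescapedIdent(name string) bool {
	if !css_lexer.WouldStartIdentifierWithoutEscapes(name) {
		return false
	}
	for _, c := range name {
		if !css_lexer.IsNameContinue(c) {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(isValidUnescapedIdent("background-color")) // true
	fmt.Println(isValidUnescapedIdent("2fast"))            // false: starts with a digit
}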

Types

type Comment added in v0.13.10

type Comment struct {
	Text            string
	Loc             logger.Loc
	TokenIndexAfter uint32 // index into TokenizeResult.Tokens of the token that follows this comment
}

type T

type T uint8
const (
	TEndOfFile T = iota

	TAtKeyword
	TBadString
	TBadURL
	TCDC // "-->"
	TCDO // "<!--"
	TCloseBrace
	TCloseBracket
	TCloseParen
	TColon
	TComma
	TDelim
	TDelimAmpersand
	TDelimAsterisk
	TDelimBar
	TDelimCaret
	TDelimDollar
	TDelimDot
	TDelimEquals
	TDelimExclamation
	TDelimGreaterThan
	TDelimMinus
	TDelimPlus
	TDelimSlash
	TDelimTilde
	TDimension
	TFunction
	THash
	TIdent
	TNumber
	TOpenBrace
	TOpenBracket
	TOpenParen
	TPercentage
	TSemicolon
	TString
	TURL
	TWhitespace
)

func (T) IsNumeric added in v0.12.15

func (t T) IsNumeric() bool

func (T) String

func (t T) String() string
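
IsNumeric and String are convenience methods on the token kind. A short sketch follows; it assumes the same css_lexer import as above, and treating TNumber, TPercentage, and TDimension as the numeric kinds is an inference from their names, not stated on this page.

kind := css_lexer.TDimension
if kind.IsNumeric() {
	// String() gives a human-readable name for the kind, useful in diagnostics.
	fmt.Printf("%s is a numeric token kind\n", kind.String())
}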

type Token

type Token struct {
	Range      logger.Range // 8 bytes
	UnitOffset uint16       // 2 bytes
	Kind       T            // 1 byte
	Flags      TokenFlags   // 1 byte
}

This token struct is designed to be memory-efficient. Instead of storing the substring of text directly, it references a range in the input file, since a range takes up less memory than a string.

func (Token) DecodedText added in v0.7.20

func (token Token) DecodedText(contents string) string
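
To get a token's text back out of the file, either slice the original contents with Range or call DecodedText when escape sequences (and, for string tokens, the surrounding quotes) should be resolved. A small sketch, assuming logger.Range exposes Loc.Start and Len as in esbuild's internal logger package; rawAndDecoded is a hypothetical helper:

// rawAndDecoded returns both views of a token's text: the exact source bytes
// and the decoded form produced by DecodedText.
func rawAndDecoded(contents string, tok css_lexer.Token) (raw string, decoded string) {
	start := tok.Range.Loc.Start
	raw = contents[start : start+tok.Range.Len]
	decoded = tok.DecodedText(contents)
	return raw, decoded
}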

type TokenFlags added in v0.14.15

type TokenFlags uint8
const (
	IsID TokenFlags = 1 << iota
	DidWarnAboutSingleLineComment
)
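
TokenFlags is a bit set packed into a single byte, so individual flags are tested with a bitwise AND. For example, a THash token such as "#main" can carry IsID; interpreting IsID as "the hash's name is a valid ID" is an assumption here, since the flag is otherwise undocumented on this page. isIDHash is a hypothetical helper:

func isIDHash(tok css_lexer.Token) bool {
	return tok.Kind == css_lexer.THash && tok.Flags&css_lexer.IsID != 0
}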

type TokenizeResult added in v0.12.20

type TokenizeResult struct {
	Tokens               []Token
	LegalComments        []Comment
	SourceMapComment     logger.Span
	ApproximateLineCount int32
}

func Tokenize

func Tokenize(log logger.Log, source logger.Source) TokenizeResult
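
Tokenize runs the lexer over an entire source file and returns the token stream along with extracted legal comments, the source map comment span, and an approximate line count. Below is a minimal sketch of driving it: printIdents is hypothetical, the logger.Log is left for the caller to construct because the internal logger API varies between esbuild versions, and logger.Source is assumed to work with just its Contents field set.

import (
	"fmt"

	"github.com/evanw/esbuild/internal/css_lexer"
	"github.com/evanw/esbuild/internal/logger"
)

// printIdents tokenizes a CSS string and prints every identifier token.
// Lexer warnings and errors are reported through the provided log.
func printIdents(log logger.Log, css string) {
	source := logger.Source{Contents: css}
	result := css_lexer.Tokenize(log, source)

	for _, tok := range result.Tokens {
		if tok.Kind == css_lexer.TIdent {
			fmt.Println(tok.DecodedText(source.Contents))
		}
	}

	fmt.Println("approximate line count:", result.ApproximateLineCount)
}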
