package language

v0.0.0-...-72fbe86
Published: Nov 17, 2020 License: MIT Imports: 9 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

var TokenKindNames = map[TokenKind]string{
	TokenKindIllegal:        "Illegal",
	TokenKindEOF:            "EOF",
	TokenKindPunctuator:     "Punctuator",
	TokenKindName:           "Name",
	TokenKindIntValue:       "IntValue",
	TokenKindFloatValue:     "FloatValue",
	TokenKindStringValue:    "StringValue",
	TokenKindUnicodeBOM:     "UnicodeBOM",
	TokenKindWhiteSpace:     "WhiteSpace",
	TokenKindLineTerminator: "LineTerminator",
	TokenKindComment:        "Comment",
	TokenKindComma:          "Comma",
}

TokenKindNames is a map of token types to their names as strings.
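
As an illustration, a lookup in this map yields the human-readable name of a kind. The sketch below uses a placeholder import path, since the real module path is truncated above.

package main

import (
	"fmt"

	"github.com/example/graphql/language" // placeholder import path
)

func main() {
	// Grounded in the map above: TokenKindIntValue maps to "IntValue".
	fmt.Println(language.TokenKindNames[language.TokenKindIntValue])
}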

Functions

This section is empty.

Types

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

Lexer holds the state of a state machine for lexically analysing GraphQL queries.

func NewLexer

func NewLexer(input []byte) *Lexer

NewLexer returns a new lexer for lexically analysing GraphQL queries from the given input bytes.

func (*Lexer) Scan

func (l *Lexer) Scan() Token

Scan attempts to read the next significant token from the input. Tokens that are not understood will yield an "illegal" token.
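
A minimal sketch of a scan loop, assuming a placeholder import path (the real module path is truncated above): it reads significant tokens until EOF or an illegal token is encountered.

package main

import (
	"fmt"

	"github.com/example/graphql/language" // placeholder import path
)

func main() {
	input := []byte(`query { user(id: 4) { name } }`)

	l := language.NewLexer(input)
	for {
		tok := l.Scan()
		if tok.Kind == language.TokenKindEOF || tok.Kind == language.TokenKindIllegal {
			break
		}
		fmt.Printf("%-11s %q (line %d, column %d)\n", tok.Kind, tok.Literal, tok.Line, tok.Column)
	}
}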

type Parser

type Parser struct {
	// contains filtered or unexported fields
}

Parser is a parser for GraphQL documents.

func NewParser

func NewParser(input []byte) *Parser

NewParser returns a new Parser instance.

func (*Parser) Parse

func (p *Parser) Parse() (ast.Document, error)

Parse iterates over the lexically analysed tokens produced by the lexer from the raw bytes of input and builds an AST of the GraphQL document, which it returns.
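
A minimal sketch of parsing a document, again with a placeholder import path; the concrete shape of ast.Document is defined in the sibling ast package and is not assumed here.

package main

import (
	"fmt"
	"log"

	"github.com/example/graphql/language" // placeholder import path
)

func main() {
	input := []byte(`query { user(id: 4) { name } }`)

	p := language.NewParser(input)
	doc, err := p.Parse()
	if err != nil {
		log.Fatalf("failed to parse document: %v", err)
	}

	// doc is an ast.Document; print it generically without assuming its fields.
	fmt.Printf("%+v\n", doc)
}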

type Token

type Token struct {
	Kind    TokenKind // The token type.
	Literal string    // The literal value consumed.
	Line    int       // The line number at the start of this token.
	Column  int       // The starting position, in runes, of this token on this line.
}

Token is a small, easily categorisable data structure that is fed to the parser to produce the abstract syntax tree (AST).
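
A sketch of inspecting a Token's exported fields after a single Scan, using a placeholder import path; for this input the first significant token should be a Name token per the GraphQL grammar.

package main

import (
	"fmt"

	"github.com/example/graphql/language" // placeholder import path
)

func main() {
	l := language.NewLexer([]byte("query"))

	tok := l.Scan()
	fmt.Printf("%s %q at line %d, column %d\n", tok.Kind, tok.Literal, tok.Line, tok.Column)
}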

type TokenKind

type TokenKind int

TokenKind represents a type of token. The types are predefined as constants.

const (
	TokenKindIllegal TokenKind = iota - 1
	TokenKindEOF

	// Lexical Tokens.
	TokenKindPunctuator
	TokenKindName
	TokenKindIntValue
	TokenKindFloatValue
	TokenKindStringValue

	// Ignored tokens.
	TokenKindUnicodeBOM
	TokenKindWhiteSpace
	TokenKindLineTerminator
	TokenKindComment
	TokenKindComma
)

All of the different lexical token kinds.

func (TokenKind) String

func (t TokenKind) String() string

String returns the name of this token kind.
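
An illustrative sketch, assuming String reports the same names listed in TokenKindNames above; the import path is a placeholder.

package main

import (
	"fmt"

	"github.com/example/graphql/language" // placeholder import path
)

func main() {
	fmt.Println(language.TokenKindPunctuator.String()) // "Punctuator" per TokenKindNames
	fmt.Println(language.TokenKindEOF)                 // fmt uses the Stringer, printing "EOF"
}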

