tokenizer

package
v1.0.42
Published: Oct 3, 2021 License: Apache-2.0 Imports: 9 Imported by: 0

README

tokenizer package

This package parses SQL statements to do two specific things:

  • Returns tokens for SQL statements, where those tokens identify keyword, type, name and value text within the SQL statement;
  • Determines whether a statement is "complete", that is, whether it ends with a trailing semicolon.

This package is part of a wider project, github.com/mutablelogic/go-sqlite. Please see the module documentation for more information.

Using the tokenizer

Here's an example of using the tokenizer:

import (
	"errors"
	"io"

	"github.com/mutablelogic/go-sqlite/pkg/tokenizer"
)

func Tokenize(q string) ([]interface{}, error) {
	t := tokenizer.NewTokenizer(q)
	tokens := []interface{}{}
	for {
		token, err := t.Next()
		if errors.Is(err, io.EOF) {
			// No more tokens to consume
			return tokens, nil
		}
		if err != nil {
			return nil, err
		}
		tokens = append(tokens, token)
	}
}

Tokens returned can be one of the following types:

  • KeywordToken: a keyword, such as SELECT, FROM, WHERE, etc.
  • TypeToken: a type, such as INTEGER, TEXT, etc.
  • NameToken: a table or column name
  • ValueToken: a numeric, boolean or text value
  • WhitespaceToken: spaces, tabs and newlines
  • PuncuationToken: any punctuation or other text not covered above
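Since Next returns tokens as interface{} values, a type switch is the natural way to classify them. The sketch below uses local stand-in types (KeywordToken, NameToken, WhitespaceToken defined in this snippet, not imported from the package) so that it is self-contained; against the real package you would switch on the tokenizer package's types instead:

```go
package main

import "fmt"

// Local stand-ins for the package's token types, defined here so the
// sketch is self-contained; the real types live in the tokenizer package.
type (
	KeywordToken    string
	NameToken       string
	WhitespaceToken string
)

// describe classifies a token with a type switch, the usual way to
// handle the interface{} values returned by Next.
func describe(token interface{}) string {
	switch tok := token.(type) {
	case KeywordToken:
		return fmt.Sprintf("keyword %q", string(tok))
	case NameToken:
		return fmt.Sprintf("name %q", string(tok))
	case WhitespaceToken:
		return "whitespace"
	default:
		return fmt.Sprintf("other %v", tok)
	}
}

func main() {
	tokens := []interface{}{KeywordToken("SELECT"), WhitespaceToken(" "), NameToken("name")}
	for _, token := range tokens {
		fmt.Println(describe(token))
	}
}
```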

Establishing if a statement is complete

Call IsComplete(v string) bool to determine whether a statement is complete. As the SQLite documentation puts it, this is "useful during command-line input to determine if the currently entered text seems to form a complete SQL statement or if additional input is needed before sending the text into SQLite for parsing".

However, note that per the same documentation, these routines "...do not parse the SQL statements thus will not detect syntactically incorrect SQL."
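To make the semantics concrete, here is a deliberately naive local approximation of the check. This is an illustration only, not the package's implementation (which follows the SQLite semantics quoted above); isCompleteNaive is a hypothetical helper defined in this snippet:

```go
package main

import (
	"fmt"
	"strings"
)

// isCompleteNaive is a simplified, local approximation of IsComplete:
// it only checks for a trailing semicolon after trimming whitespace.
// It is an illustration, not the package's implementation.
func isCompleteNaive(v string) bool {
	return strings.HasSuffix(strings.TrimSpace(v), ";")
}

func main() {
	fmt.Println(isCompleteNaive("SELECT * FROM users;")) // true
	fmt.Println(isCompleteNaive("SELECT * FROM"))        // false
}
```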

Documentation

Overview

Package tokenizer provides an SQL statement parser

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func IsComplete

func IsComplete(v string) bool

IsComplete returns true if the input string appears to be a complete SQL statement

Types

type KeywordToken

type KeywordToken string // An SQL reserved keyword

type NameToken

type NameToken string // A table or column identifier

type PuncuationToken

type PuncuationToken string // A punctuation character

type Tokenizer

type Tokenizer struct {
	*bufio.Scanner
}

A tokenizer that scans the input SQL statement
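Because Tokenizer embeds *bufio.Scanner, its scanning machinery works like any scanner with a custom split function. The sketch below shows the general pattern using the standard bufio.ScanWords as a stand-in splitter; the package installs its own SQL-aware split function, which this example does not reproduce:

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// scanWords illustrates building a tokenizer on bufio.Scanner, which
// Tokenizer embeds. bufio.ScanWords is a stand-in split function here;
// the real package installs an SQL-aware splitter instead.
func scanWords(q string) []string {
	scanner := bufio.NewScanner(strings.NewReader(q))
	scanner.Split(bufio.ScanWords)
	tokens := []string{}
	for scanner.Scan() {
		tokens = append(tokens, scanner.Text())
	}
	return tokens
}

func main() {
	fmt.Println(scanWords("SELECT name FROM users"))
}
```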

func NewTokenizer

func NewTokenizer(v string) *Tokenizer

NewTokenizer returns a new Tokenizer that scans the input SQL statement

func (*Tokenizer) Next

func (t *Tokenizer) Next() (interface{}, error)

Next returns the next token in the input stream, or nil and an io.EOF error when there are no more tokens to consume.

type TypeToken

type TypeToken string // An SQL data type

type ValueToken

type ValueToken string // A value literal

type WhitespaceToken

type WhitespaceToken string // Whitespace token
