tokenize

package
v3.0.7
Warning

This package is not in the latest version of its module.

Published: Aug 4, 2019 License: MIT Imports: 6 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type DefaultJoin

type DefaultJoin struct {
	// contains filtered or unexported fields
}

DefaultJoin is the default Joiner implementation

func NewDefaultJoin

func NewDefaultJoin(sep string) *DefaultJoin

NewDefaultJoin returns a new DefaultJoin that joins tokens with the separator sep

func (*DefaultJoin) Join

func (dj *DefaultJoin) Join(tok Tokenizer) error

Join joins the strings in each token slice produced by the given Tokenizer

func (*DefaultJoin) Tokens

func (dj *DefaultJoin) Tokens() []string

Tokens returns the joined tokens

type Joiner

type Joiner interface {
	Join(Tokenizer) error
	Tokens() []string
}

Joiner joins the tokens produced by a Tokenizer
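
The Joiner contract can be exercised without the package's unexported internals. The sketch below reproduces the two exported interfaces from this documentation; sliceTokenizer and sepJoin are hypothetical stand-ins (sepJoin mimics what DefaultJoin presumably does, but its internals are an assumption, not the package's actual implementation).

```go
package main

import (
	"fmt"
	"strings"
)

// Tokenizer and Joiner are reproduced from the package's exported interfaces.
type Tokenizer interface {
	TokenizeText() ([]string, error)
	TokenizeEntities() ([][]string, error)
}

type Joiner interface {
	Join(Tokenizer) error
	Tokens() []string
}

// sliceTokenizer is a hypothetical Tokenizer backed by a fixed token slice,
// used here only to exercise the Joiner contract.
type sliceTokenizer struct{ toks []string }

func (s sliceTokenizer) TokenizeText() ([]string, error)       { return s.toks, nil }
func (s sliceTokenizer) TokenizeEntities() ([][]string, error) { return [][]string{s.toks}, nil }

// sepJoin joins each entity's tokens with a separator and accumulates
// the results, matching the Joiner interface above.
type sepJoin struct {
	sep    string
	tokens []string
}

func (j *sepJoin) Join(tok Tokenizer) error {
	entities, err := tok.TokenizeEntities()
	if err != nil {
		return err
	}
	for _, entity := range entities {
		j.tokens = append(j.tokens, strings.Join(entity, j.sep))
	}
	return nil
}

func (j *sepJoin) Tokens() []string { return j.tokens }

func main() {
	j := &sepJoin{sep: " "}
	if err := j.Join(sliceTokenizer{toks: []string{"New", "York"}}); err != nil {
		panic(err)
	}
	fmt.Println(j.Tokens()) // [New York]
}
```

Because Joiner takes the Tokenizer interface rather than a concrete type, any tokenizer source (including the package's NLP type) can feed the same joiner.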

type NLP

type NLP struct {
	// contains filtered or unexported fields
}

NLP tokenizes a text using a natural language processing API

func NewNLP

func NewNLP(credentialsFile, text string, entities []string) (*NLP, error)

NewNLP returns a new NLP instance

func (*NLP) TokenizeEntities

func (nlp *NLP) TokenizeEntities() ([][]string, error)

TokenizeEntities returns nested tokenized entities

func (*NLP) TokenizeText

func (nlp *NLP) TokenizeText() ([]string, error)

TokenizeText tokenizes a text

type Tokenizer

type Tokenizer interface {
	TokenizeText() ([]string, error)
	TokenizeEntities() ([][]string, error)
}

Tokenizer tokenizes a text into a flat token slice and nested entity tokens
