Documentation

Overview

    Package tokiponatokens is a wrapper around a Toki Pona tokenizer. I have an instance set up here: https://us-central1-golden-cove-408.cloudfunctions.net/function-1

    Index

    Constants

    const (
    	// Who/what the sentence is addressed to in Parts.
    	PartAddress      = `address`
    	PartSubject      = `subject`
    	PartObjectMarker = `objectMarker`
    	PartVerbMarker   = `verbMarker`
    	PartPrepPhrase   = `prepPhrase`
    	PartInterjection = `interjection`
    	// A foreign name.
    	PartCartouche = `cartouche`
    	// Most sentences will end in this.
    	PartPunctuation = `punctuation`
    )

    Individual part type values.

    const (
    	PunctPeriod      = `period`
    	PunctQuestion    = `question`
    	PunctExclamation = `exclamation`
    	PunctComma       = `comma`
    )

    Punctuation constants.
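
    As a sketch (not taken from the package itself), the Part* values can be matched against the Type field of a Part, which is defined below. The import path and the hand-built sample part are assumptions for illustration:

    package main

    import (
    	"fmt"

    	"example.com/tokiponatokens" // placeholder import path; substitute the package's real one
    )

    // describe prints a short label for one tokenized part, keyed off the
    // Part* constants above.
    func describe(p tokiponatokens.Part) {
    	switch p.Type {
    	case tokiponatokens.PartSubject:
    		fmt.Println("subject:", p.Tokens)
    	case tokiponatokens.PartVerbMarker:
    		fmt.Println("verb phrase:", p.Tokens)
    	case tokiponatokens.PartPunctuation:
    		fmt.Println("punctuation:", p.Tokens)
    	default:
    		fmt.Println(p.Type+":", p.Tokens)
    	}
    }

    func main() {
    	// A hand-built Part for illustration only.
    	describe(tokiponatokens.Part{Type: tokiponatokens.PartSubject, Tokens: []string{"mi"}})
    }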

    Variables

    This section is empty.

    Functions

    This section is empty.

    Types

    type Part

    type Part struct {
    	Type   string   `json:"part"`
    	Sep    *string  `json:"sep"`
    	Tokens []string `json:"tokens"`
    	Parts  []*Part  `json:"parts"`
    }

    Part is an individual part of a sentence.
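
    The JSON tags above suggest the wire format the tokenizer returns. The following is a minimal, self-contained sketch of decoding one part; the sample document is illustrative, not captured from the service:

    package main

    import (
    	"encoding/json"
    	"fmt"
    	"log"
    )

    // Part mirrors the struct above so the sketch compiles on its own.
    type Part struct {
    	Type   string   `json:"part"`
    	Sep    *string  `json:"sep"`
    	Tokens []string `json:"tokens"`
    	Parts  []*Part  `json:"parts"`
    }

    func main() {
    	// Illustrative sample only; the real service may return more fields.
    	raw := []byte(`{"part":"subject","tokens":["mi"]}`)

    	var p Part
    	if err := json.Unmarshal(raw, &p); err != nil {
    		log.Fatal(err)
    	}
    	fmt.Println(p.Type, p.Tokens) // subject [mi]
    }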

    func (Part) Braces

    func (p Part) Braces() string

    type Sentence

    type Sentence []Part

    Sentence is a series of sentence parts. This correlates to one Toki Pona sentence.

    func Tokenize

    func Tokenize(aurl, text string) ([]Sentence, error)

    Tokenize returns a series of Toki Pona tokens.
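
    A minimal sketch of calling Tokenize against the instance listed in the overview. The import path, the sample sentence, and the printing are assumptions for illustration, not part of the package:

    package main

    import (
    	"fmt"
    	"log"

    	"example.com/tokiponatokens" // placeholder import path; substitute the package's real one
    )

    func main() {
    	const instance = "https://us-central1-golden-cove-408.cloudfunctions.net/function-1"

    	sentences, err := tokiponatokens.Tokenize(instance, "mi olin e sina.")
    	if err != nil {
    		log.Fatal(err)
    	}

    	for i, s := range sentences {
    		fmt.Printf("sentence %d:\n", i)
    		for _, p := range s {
    			// Each Part's Type is one of the Part* constants above.
    			fmt.Printf("  %s %v\n", p.Type, p.Tokens)
    		}
    	}
    }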