Documentation ¶
Overview ¶
Package tokenizer provides local token counting for Gemini models. This tokenizer downloads its model from the web, but otherwise doesn't require an API call for every [CountTokens] invocation.
Index ¶
Examples ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type CountTokensResponse ¶
type CountTokensResponse struct {
	TotalTokens int32
}
CountTokensResponse is the response of Tokenizer.CountTokens.
type Tokenizer ¶
type Tokenizer struct {
	// contains filtered or unexported fields
}
Tokenizer is a local tokenizer for text.
func New ¶
func New(nm string) (*Tokenizer, error)
New creates a new Tokenizer from a model name; the model name is the same one you would pass to genai.Client.GenerativeModel.
func (*Tokenizer) CountTokens ¶
func (tok *Tokenizer) CountTokens(parts ...genai.Part) (*CountTokensResponse, error)
CountTokens counts the tokens in all the given parts and returns their sum. Only genai.Text parts are supported; an error is returned if non-text parts are provided.
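A minimal sketch of the error path described above. It assumes genai.Blob (a binary part type from the same genai package) is a valid non-text part to pass; the exact error message is not specified by the package.

```go
package main

import (
	"fmt"
	"log"

	"cloud.google.com/go/vertexai/genai"
	"cloud.google.com/go/vertexai/genai/tokenizer"
)

func main() {
	tok, err := tokenizer.New("gemini-1.5-flash")
	if err != nil {
		log.Fatal(err)
	}
	// Passing a non-text part (here a genai.Blob) should yield an
	// error rather than a token count.
	_, err = tok.CountTokens(genai.Blob{MIMEType: "image/png", Data: []byte{}})
	if err != nil {
		fmt.Println("non-text parts are rejected:", err)
	}
}
```

Checking the error from CountTokens matters even for text-only input, since the tokenizer's model is fetched from the web and counting can fail for reasons unrelated to the parts themselves.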
Example ¶
package main

import (
	"fmt"
	"log"

	"cloud.google.com/go/vertexai/genai"
	"cloud.google.com/go/vertexai/genai/tokenizer"
)

func main() {
	tok, err := tokenizer.New("gemini-1.5-flash")
	if err != nil {
		log.Fatal(err)
	}
	ntoks, err := tok.CountTokens(genai.Text("a prompt"), genai.Text("another prompt"))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("total token count:", ntoks.TotalTokens)
}
Output: total token count: 4