tokenizer

package
v0.15.0
Published: Jun 25, 2025 License: Apache-2.0 Imports: 10 Imported by: 1

Documentation

Overview

Package tokenizer provides local token counting for Gemini models. This tokenizer downloads its model from the web, but otherwise doesn't require an API call for every [CountTokens] invocation.

Index

Examples

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type CountTokensResponse

type CountTokensResponse struct {
	TotalTokens int32
}

CountTokensResponse is the response of Tokenizer.CountTokens.

type Tokenizer

type Tokenizer struct {
	// contains filtered or unexported fields
}

Tokenizer is a local tokenizer for text.

func New

func New(modelName string) (*Tokenizer, error)

New creates a new Tokenizer from a model name; this is the same model name you would pass to genai.Client.GenerativeModel.

func (*Tokenizer) CountTokens

func (tok *Tokenizer) CountTokens(parts ...genai.Part) (*CountTokensResponse, error)

CountTokens counts the tokens in all the given parts and returns their sum. Only genai.Text parts are supported; an error is returned if non-text parts are provided.

Example
package main

import (
	"fmt"
	"log"

	"cloud.google.com/go/vertexai/genai"
	"cloud.google.com/go/vertexai/genai/tokenizer"
)

func main() {
	tok, err := tokenizer.New("gemini-1.5-flash")
	if err != nil {
		log.Fatal(err)
	}

	ntoks, err := tok.CountTokens(genai.Text("a prompt"), genai.Text("another prompt"))
	if err != nil {
		log.Fatal(err)
	}

	fmt.Println("total token count:", ntoks.TotalTokens)
}
Output:

total token count: 4
