airouter

package module
v0.0.65
Published: Apr 1, 2026 License: MIT Imports: 21 Imported by: 0

README

AI Router


AI Router is a lightweight Go client for the Subiz AI Proxy, providing a unified interface to multiple LLM providers including OpenAI, Google Gemini, and more.

Features

  • Unified API: Single interface for Chat Completion, Text Embedding, and Reranking across different providers.
  • Provider Agnostic: Switch between GPT-4, GPT-5, Gemini, and other models without changing your core logic.
  • Simplified Configuration: Easy initialization with a Subiz API key.

Installation

go get github.com/subiz/airouter

Getting Started

Initialization

Initialize the library with your Subiz API key:

import "github.com/subiz/airouter"

func init() {
    airouter.Init("YOUR_SUBIZ_API_KEY")
}
Usage Examples
1. Chat Completion
package main

import (
    "context"
    "fmt"
    "github.com/subiz/airouter"
)

func main() {
    airouter.Init("YOUR_SUBIZ_API_KEY")

    ctx := context.Background()
    output, _, err := airouter.Complete(ctx, airouter.CompletionInput{
        Model: airouter.Gpt_5_nano,
        ReasoningEffort: "low",
        Instruct: "Tell a short story about a brave dragon.",
    })

    if err != nil {
        panic(err)
    }

    fmt.Println(output) // The dragon’s name was Ember, and his light was not a blaze but a memory of dawn...
}
2. Text Embedding
package main

import (
    "context"
    "fmt"
    "github.com/subiz/airouter"
)

func main() {
    airouter.Init("YOUR_SUBIZ_API_KEY")

    vector, _, err := airouter.GetEmbedding(context.Background(), airouter.Text_embedding_3_small, "Hello world")
    if err != nil {
        panic(err)
    }

    fmt.Printf("Vector length: %d\n", len(vector))
    fmt.Println(vector[:5]) // Print first 5 dimensions
}
3. Rerank
package main

import (
    "context"
    "fmt"
    "github.com/subiz/airouter"
)

func main() {
    airouter.Init("YOUR_SUBIZ_API_KEY")

    records := []*airouter.RerankRecord{
        {Id: "1", Title: "Greeting", Content: "Xin chào thế giới"},
        {Id: "2", Title: "Farewell", Content: "Goodbye world"},
    }

    output, err := airouter.Rerank(context.Background(), "google", "hello", records)
    if err != nil {
        panic(err)
    }

    fmt.Println("Rerank results:", output)
}

Supported Models

Chat Completion
  • OpenAI: gpt-4o, gpt-4o-mini, gpt-4.1, gpt-4.1-mini, gpt-4.1-nano, gpt-5, gpt-5-mini, gpt-5-nano
  • Google: gemini-2.5-pro, gemini-2.5-flash, gemini-2.5-flash-lite, gemini-3.1-flash-lite-preview
Text Embedding
  • gemini-embedding-001
  • text-embedding-3-small
  • text-embedding-3-large
  • text-embedding-ada-002
Reranking
  • google (Powered by Google Vertex AI)

License

This project is licensed under the MIT License - see the LICENSE file for details.

Documentation

Index

Constants

const Gemini_2_5_flash = "gemini-2.5-flash"
const Gemini_2_5_flash_lite = "gemini-2.5-flash-lite"
const Gemini_2_5_pro = "gemini-2.5-pro"
const Gemini_3_1_flash_lite = "gemini-3.1-flash-lite-preview"
const Gemini_embedding_001 = "gemini-embedding-001"
const Gpt_4_1 = "gpt-4.1"
const Gpt_4_1_mini = "gpt-4.1-mini"
const Gpt_4_1_nano = "gpt-4.1-nano"
const Gpt_4o = "gpt-4o"
const Gpt_4o_mini = "gpt-4o-mini"
const Gpt_5 = "gpt-5"
const Gpt_5_4_mini = "gpt-5.4-mini"
const Gpt_5_4_nano = "gpt-5.4-nano"
const Gpt_5_mini = "gpt-5-mini"
const Gpt_5_nano = "gpt-5-nano"
const O3 = "o3"
const O4_mini = "o4-mini"
const Text_embedding_3_large = "text-embedding-3-large"
const Text_embedding_3_small = "text-embedding-3-small"
const Text_embedding_ada_002 = "text-embedding-ada-002"

Variables

var BACKEND = "https://api.subiz.com.vn/4.1/ai"

Functions

func CalculateCost

func CalculateCost(model string, usage *Usage, service_tier string) int64

Returns the cost in kfpv-USD units (1 USD = 1,000,000,000 kfpv; see Usage.KFpvCostUSD).

func CleanString

func CleanString(str string) string


func FakeBackend

func FakeBackend(rawURL string, input []byte) (int, []byte)

for testing only

func GetFallbackChatModel

func GetFallbackChatModel(model string) string

GetFallbackChatModel returns a fallback model for a given model. If the model is a GPT model, it returns the equivalent Gemini model. If the model is a Gemini model, it returns the equivalent OpenAI model. Otherwise, it returns a default Gemini model.

func GetMD5Hash

func GetMD5Hash(text string) string

func Init

func Init(apikey string)

Init sets up the client with a Subiz API key. Only clients should call this.

func InitAPI

func InitAPI(geminiKey, openaiKey string)

InitAPI sets up provider API keys for the server. Only servers should call this.

func ToGeminiModel

func ToGeminiModel(model string) string

ToGeminiModel converts a model name to its equivalent Gemini model.

func ToGeminiRequestJSON

func ToGeminiRequestJSON(req CompletionInput) ([]byte, error)

ToGeminiRequestJSON converts a CompletionInput to a Gemini-compatible JSON request.

func ToModel

func ToModel(model string) string

ToModel converts a model name to the closest standardized model.

func ToOpenAICompletionJSON

func ToOpenAICompletionJSON(req CompletionInput) ([]byte, error)

func ToOpenAIModel

func ToOpenAIModel(model string) string

ToOpenAIModel converts a model name to its equivalent OpenAI model.

Types

type CompletionInput

type CompletionInput struct {
	Seed                 int                           `json:"seed,omitempty"`
	PromptCacheKey       string                        `json:"prompt_cache_key,omitempty"`
	PromptCacheRetention string                        `json:"prompt_cache_retention,omitempty"`
	Verbosity            string                        `json:"verbosity,omitempty"`
	Stop                 []string                      `json:"stop,omitempty"`
	Model                string                        `json:"model,omitempty"`
	NoLog                bool                          `json:"-"` // disable log
	Instruct             string                        `json:"instruct,omitempty"`
	Messages             []*header.LLMChatHistoryEntry `json:"messages,omitempty"`
	MaxCompletionTokens  int                           `json:"max_completion_tokens,omitempty"`
	ResponseFormat       *ResponseFormat               `json:"response_format,omitempty"`
	ToolChoice           string                        `json:"tool_choice,omitempty"`
	Reasoning            *CompletionReasoning          `json:"reasoning,omitempty"`
	ReasoningEffort      string                        `json:"reasoning_effort,omitempty"`
	Temperature          float32                       `json:"temperature,omitempty"`
	TopP                 float32                       `json:"top_p,omitempty"`
	Tools                []OpenAITool                  `json:"tools,omitempty"`
	ServiceTier          string                        `json:"service_tier,omitempty"` // [auto, default], flex, priority, scale
	StopAfterToolCalled  bool                          `json:"stop_after_tool_called"`
}

type CompletionOutput

type CompletionOutput struct {
	Content           string `json:"content"`
	Refusal           string `json:"refusal"`
	Request           []byte `json:"request"`
	InputTokens       int64  `json:"input_tokens"`
	OutputTokens      int64  `json:"output_tokens"`
	InputCachedTokens int64  `json:"input_cached_tokens"`
	OuputCachedTokens int64  `json:"output_cached_tokens"`
	Created           int64  `json:"created"`
	DurationMs        int64  `json:"duration_ms"`
	KfpvCostUSD       int64  `json:"kfpv_cost_usd"` // 1 usd -> 1000_000_000 kfpvusd
}
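The KfpvCostUSD fields use a fixed-point unit: per the field comment, 1 USD = 1,000,000,000 kfpv. A small helper (not part of airouter) to convert a cost back to dollars:

```go
package main

import "fmt"

// kfpvToUSD converts airouter's kfpv-USD fixed-point cost unit to dollars.
// Per the CompletionOutput field comment, 1 USD == 1_000_000_000 kfpv.
func kfpvToUSD(kfpv int64) float64 {
	return float64(kfpv) / 1_000_000_000
}

func main() {
	// e.g. a CompletionOutput.KfpvCostUSD of 250_000 is a quarter of a millicent
	fmt.Printf("$%.6f\n", kfpvToUSD(250_000)) // prints $0.000250
}
```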

func Complete

type CompletionReasoning

type CompletionReasoning struct {
	Effort    string `json:"effort,omitempty"` // high, medium, low
	MaxTokens int    `json:"max_tokens,omitempty"`
}

type Embedding

type Embedding struct {
	Values []float32 `json:"values"`
}

Embedding represents the embedding values.

type EmbeddingOutput

type EmbeddingOutput struct {
	Text        string    `json:"text"`
	Vector      []float32 `json:"vector"`
	TotalTokens int64     `json:"total_tokens"`
	Created     int64     `json:"created"`
	DurationMs  int64     `json:"duration_ms"`
	KfpvCostUSD int64     `json:"kfpv_cost_usd"`
}

func GetEmbedding

func GetEmbedding(ctx context.Context, model string, text string) ([]float32, EmbeddingOutput, error)
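Embedding vectors are typically compared with cosine similarity. A minimal helper over the []float32 slices GetEmbedding returns; the helper and the toy vectors below are illustrative, not part of the library:

```go
package main

import (
	"fmt"
	"math"
)

// cosine returns the cosine similarity of two equal-length vectors,
// such as the []float32 slices returned by airouter.GetEmbedding.
func cosine(a, b []float32) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += float64(a[i]) * float64(b[i])
		na += float64(a[i]) * float64(a[i])
		nb += float64(b[i]) * float64(b[i])
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

func main() {
	// Toy vectors standing in for real embedding output.
	u := []float32{0.1, 0.2, 0.3}
	v := []float32{0.1, 0.2, 0.3}
	fmt.Printf("%.3f\n", cosine(u, v)) // identical vectors, prints 1.000
}
```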

type Function

type Function struct {
	Name        string             `json:"name,omitempty"`
	Description string             `json:"description,omitempty"`
	Parameters  *header.JSONSchema `json:"parameters,omitempty"`

	Handler func(ctx context.Context, arg, callid string, ctxm map[string]any) string `json:"-"` // "", true -> abandon completion
}

Function mimics the structure of a function in an OpenAI Chat Completion request.

type GeminiAPIResponse

type GeminiAPIResponse struct {
	Candidates    []*LocalCandidate   `json:"candidates"`
	UsageMetadata *LocalUsageMetadata `json:"usageMetadata"`
	ModelVersion  string              `json:"modelVersion"`
	ResponseId    string              `json:"responseId"`
	Error         *GeminiError        `json:"error"`
}

GeminiAPIResponse is a wrapper for the Gemini API response.

type GeminiContent

type GeminiContent struct {
	Parts []*GeminiPart `json:"parts"`
	Role  string        `json:"role,omitempty"`
}

GeminiContent represents a content block in a Gemini request.

type GeminiEmbeddingContent

type GeminiEmbeddingContent struct {
	Parts []GeminiEmbeddingPart `json:"parts"`
}

GeminiEmbeddingContent represents the content for an embedding request.

type GeminiEmbeddingPart

type GeminiEmbeddingPart struct {
	Text string `json:"text"`
}

GeminiEmbeddingPart represents a part of the content for an embedding request.

type GeminiEmbeddingRequest

type GeminiEmbeddingRequest struct {
	Model   string                 `json:"model"`
	Content GeminiEmbeddingContent `json:"content"`
}

GeminiEmbeddingRequest represents the request to the embedding model.

type GeminiEmbeddingResponse

type GeminiEmbeddingResponse struct {
	Embedding *Embedding `json:"embedding"`
}

GeminiEmbeddingResponse represents the response from the embedding model.

type GeminiError

type GeminiError struct {
	Code    int                 `json:"code"`
	Message string              `json:"message"`
	Status  string              `json:"status"`
	Details []GeminiErrorDetail `json:"details"`
}

GeminiError represents a Gemini API error.

type GeminiErrorDetail

type GeminiErrorDetail struct {
	Type            string `json:"@type"`
	FieldViolations []struct {
		Field       string `json:"field"`
		Description string `json:"description"`
	} `json:"fieldViolations"`
}

GeminiErrorDetail represents the details of a Gemini API error.

type GeminiFunctionCall

type GeminiFunctionCall struct {
	Name string         `json:"name"`
	Args map[string]any `json:"args"`
}

GeminiFunctionCall is a local struct to avoid genai dependency.

type GeminiFunctionDeclaration

type GeminiFunctionDeclaration struct {
	Name        string             `json:"name"`
	Description string             `json:"description"`
	Parameters  *header.JSONSchema `json:"parameters"`
}

GeminiFunctionDeclaration is a local struct to avoid genai dependency.

type GeminiFunctionResponse

type GeminiFunctionResponse struct {
	Name     string         `json:"name"`
	Response map[string]any `json:"response"`
}

GeminiFunctionResponse is a local struct to avoid genai dependency.

type GeminiGenerationConfig

type GeminiGenerationConfig struct {
	StopSequences []string `json:"stopSequences,omitempty"`
	// CandidateCount   int                   `json:"candidateCount,omitempty"`
	MaxOutputTokens  int                   `json:"maxOutputTokens,omitempty"`
	Temperature      float32               `json:"temperature,omitempty"`
	TopP             float32               `json:"topP,omitempty"`
	Seed             int                   `json:"seed,omitempty"`
	ResponseMIMEType string                `json:"responseMimeType,omitempty"`
	ResponseSchema   *header.JSONSchema    `json:"responseSchema,omitempty"`
	ThinkingConfig   *GeminiThinkingConfig `json:"thinkingConfig,omitempty"`
}

GeminiGenerationConfig represents the generation configuration.

type GeminiPart

type GeminiPart struct {
	Text             *string                 `json:"text,omitempty"`
	FunctionCall     *GeminiFunctionCall     `json:"functionCall,omitempty"`
	FunctionResponse *GeminiFunctionResponse `json:"functionResponse,omitempty"`
	InlineData       *GeminiPartInlineData   `json:"inline_data,omitempty"`
}

GeminiPart represents a part of a content block.

type GeminiPartInlineData

type GeminiPartInlineData struct {
	MimeType string `json:"mime_type,omitempty"`
	Data     string `json:"data,omitempty"`
}

type GeminiRankingResponse

type GeminiRankingResponse struct {
	Records []*GeminiRerankRecord `json:"records,omitempty"`
	Error   *GeminiError          `json:"error"`
}

type GeminiRequest

type GeminiRequest struct {
	Model             string                  `json:"model"`
	SystemInstruction *GeminiContent          `json:"systemInstruction,omitempty"`
	Contents          []*GeminiContent        `json:"contents"`
	Tools             []*GeminiTool           `json:"tools,omitempty"`
	GenerationConfig  *GeminiGenerationConfig `json:"generationConfig,omitempty"`
}

GeminiRequest represents the structure of a request to the Gemini API. https://ai.google.dev/api/generate-content#v1beta.GenerationConfig

type GeminiRerankRecord

type GeminiRerankRecord struct {
	Id      string  `json:"id,omitempty"`
	Title   string  `json:"title,omitempty"`
	Content string  `json:"content,omitempty"`
	Score   float32 `json:"score,omitempty"`
}

type GeminiThinkingConfig

type GeminiThinkingConfig struct {
	// The number of thoughts tokens that the model should generate.
	ThinkingBudget int `json:"thinkingBudget,omitempty"`

	// Indicates whether to include thoughts in the response. If true, thoughts are returned only when available.
	IncludeThoughts bool `json:"includeThoughts"`
}

https://ai.google.dev/api/generate-content#ThinkingConfig

type GeminiTool

type GeminiTool struct {
	FunctionDeclarations []*GeminiFunctionDeclaration `json:"functionDeclarations"`
}

GeminiTool is a local struct to avoid genai dependency.

type LocalCandidate

type LocalCandidate struct {
	Content      *LocalContent `json:"content"`
	FinishReason string        `json:"finishReason"` // Using string instead of enum
}

LocalCandidate mirrors genai.Candidate

type LocalContent

type LocalContent struct {
	Parts []*LocalPart `json:"parts"`
	Role  string       `json:"role"`
}

LocalContent mirrors genai.Content

type LocalPart

type LocalPart struct {
	Text         *string             `json:"text,omitempty"`
	FunctionCall *GeminiFunctionCall `json:"functionCall,omitempty"`
}

LocalPart is a struct to unmarshal different part types from JSON.

type LocalUsageMetadata

type LocalUsageMetadata struct {
	PromptTokenCount        int32 `json:"promptTokenCount,omitempty"`
	CandidatesTokenCount    int32 `json:"candidatesTokenCount,omitempty"`
	TotalTokenCount         int32 `json:"totalTokenCount,omitempty"`
	CachedContentTokenCount int32 `json:"cachedContentTokenCount,omitempty"`
	ThoughtsTokenCount      int32 `json:"thoughtsTokenCount,omitempty"`
}

LocalUsageMetadata mirrors genai.UsageMetadata

type OpenAIChatMessage

type OpenAIChatMessage struct {
	Role       string                 `json:"role"`
	Content    *string                `json:"content"` // Use pointer to allow for null content
	Name       string                 `json:"name,omitempty"`
	Contents   []OpenAIMessageContent `json:"contents,omitempty"` // multi-part content (text, image_url)
	ToolCalls  []ToolCall             `json:"tool_calls,omitempty"`
	ToolCallId string                 `json:"tool_call_id,omitempty"`
	Refusal    string                 `json:"refusal,omitempty"`
}

OpenAIChatMessage mimics the structure of a message in an OpenAI Chat Completion request.

func (*OpenAIChatMessage) GetContent

func (m *OpenAIChatMessage) GetContent() string

type OpenAIChatResponse

type OpenAIChatResponse struct {
	ID          string         `json:"id,omitempty"`
	Created     int64          `json:"created,omitempty"`
	Object      string         `json:"object,omitempty"`
	Model       string         `json:"model,omitempty"`
	Choices     []OpenAIChoice `json:"choices,omitempty"`
	Usage       *Usage         `json:"usage,omitempty"`
	Error       *OpenAIError   `json:"error,omitempty"`
	ServiceTier string         `json:"service_tier,omitempty"`
}

OpenAIChatResponse mimics the structure of an OpenAI Chat Completion response.

func ChatCompleteAPI

func ChatCompleteAPI(ctx context.Context, payload []byte) (OpenAIChatResponse, error)

type OpenAIChoice

type OpenAIChoice struct {
	Index        int               `json:"index"`
	Message      OpenAIChatMessage `json:"message"`
	FinishReason string            `json:"finish_reason"`
}

OpenAIChoice mimics the structure of a choice in an OpenAI Chat Completion response.

type OpenAIEmbeddingData

type OpenAIEmbeddingData struct {
	Object    string    `json:"object"`
	Embedding []float32 `json:"embedding"`
	Index     int       `json:"index"`
}

type OpenAIEmbeddingRequest

type OpenAIEmbeddingRequest struct {
	Input string `json:"input"`
	Model string `json:"model"`
}

type OpenAIEmbeddingResponse

type OpenAIEmbeddingResponse struct {
	Object string                `json:"object"`
	Data   []OpenAIEmbeddingData `json:"data"`
	Model  string                `json:"model"`
	Usage  *Usage                `json:"usage"`
	Error  *OpenAIError          `json:"error,omitempty"`
}

func GetEmbeddingAPI

func GetEmbeddingAPI(ctx context.Context, model, text string) (OpenAIEmbeddingResponse, error)

type OpenAIError

type OpenAIError struct {
	Message string `json:"message"`
	Type    string `json:"type"`
	Param   string `json:"param"`
	Code    string `json:"code,omitempty"`
}

OpenAIError represents the structure of an error in an OpenAI Chat Completion response.

type OpenAIJSONSchema

type OpenAIJSONSchema struct {
	Title                string                       `json:"title,omitempty"`
	Type                 string                       `json:"type,omitempty"` // string, number, object, array, boolean, null
	Description          string                       `protobuf:"bytes,5,opt,name=description,proto3" json:"description,omitempty"`
	Properties           map[string]*OpenAIJSONSchema ``                                                                /* 146-byte string literal not displayed */
	Items                *OpenAIJSONSchema            `protobuf:"bytes,7,opt,name=items,proto3" json:"items,omitempty"` // used for type array
	MinItems             int64                        `protobuf:"varint,8,opt,name=minItems,proto3" json:"minItems,omitempty"`
	UniqueItems          bool                         `protobuf:"varint,9,opt,name=uniqueItems,proto3" json:"uniqueItems,omitempty"`
	ExclusiveMinimum     int64                        `protobuf:"varint,10,opt,name=exclusiveMinimum,proto3" json:"exclusiveMinimum,omitempty"`
	Required             []string                     `protobuf:"bytes,11,rep,name=required,proto3" json:"required,omitempty"`
	AdditionalProperties bool                         `json:"additionalProperties"` // chatgpt required this
	Enum                 []string                     `json:"enum,omitempty"`
}
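OpenAIJSONSchema marshals to standard JSON Schema. The sketch below builds the wire shape for a simple object schema using plain maps (illustrative only, not a library helper); note that AdditionalProperties has no omitempty tag because, as the field comment says, OpenAI's structured-output mode requires the key to be present (and false):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// buildSentimentSchema returns the wire shape an OpenAIJSONSchema produces
// for a one-field object. Plain maps are used here for illustration; the
// keys mirror standard JSON Schema.
func buildSentimentSchema() map[string]any {
	return map[string]any{
		"type": "object",
		"properties": map[string]any{
			"sentiment": map[string]any{
				"type": "string",
				"enum": []string{"positive", "neutral", "negative"},
			},
		},
		"required": []string{"sentiment"},
		// Always emitted (no omitempty): required by OpenAI structured output.
		"additionalProperties": false,
	}
}

func main() {
	b, _ := json.Marshal(buildSentimentSchema())
	fmt.Println(string(b))
}
```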

type OpenAIMessageContent

type OpenAIMessageContent struct {
	Text     string                        `json:"text,omitempty"`
	Type     string                        `json:"type,omitempty"` // text, image_url
	ImageUrl *OpenAIMessageContentImageUrl `json:"image_url,omitempty"`
}

type OpenAIMessageContentImageUrl

type OpenAIMessageContentImageUrl struct {
	Url string `json:"url,omitempty"`
}

type OpenAITool

type OpenAITool struct {
	Type     string    `json:"type"`
	Function *Function `json:"function"`
	Output   string    `json:"output,omitempty"` // our fake
}

OpenAITool mimics the structure of a tool in an OpenAI Chat Completion request.

type PromptTokensDetails

type PromptTokensDetails struct {
	CachedTokens int64 `json:"cached_tokens"`
}

PromptTokensDetails mimics the structure of the prompt_tokens_details field.

type RerankInput

type RerankInput struct {
	Seed    int             `json:"seed,omitempty"`
	TopN    int             `json:"top_n,omitempty"`
	Model   string          `json:"model,omitempty"`
	Query   string          `json:"query,omitempty"`
	Records []*RerankRecord `json:"records,omitempty"`
}

type RerankOutput

type RerankOutput struct {
	Records     []*RerankRecord `json:"records,omitempty"`
	Created     int64           `json:"created,omitempty"`
	DurationMs  int64           `json:"duration_ms"`
	KfpvCostUSD int64           `json:"kfpv_cost_usd"` // 1 usd -> 1000_000_000 kfpvusd
}

func Rerank

func Rerank(ctx context.Context, model, query string, inrecords []*RerankRecord) (RerankOutput, error)
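Each returned RerankRecord carries a relevance Score. If you need records ordered client-side, a sort.Slice sketch (using a local stand-in struct rather than airouter.RerankRecord, so it runs on its own):

```go
package main

import (
	"fmt"
	"sort"
)

// record is a local stand-in for airouter.RerankRecord.
type record struct {
	Id    string
	Score float32
}

// byScoreDesc orders reranked records from most to least relevant.
func byScoreDesc(recs []record) {
	sort.Slice(recs, func(i, j int) bool { return recs[i].Score > recs[j].Score })
}

func main() {
	recs := []record{{"1", 0.12}, {"2", 0.87}, {"3", 0.45}}
	byScoreDesc(recs)
	for _, r := range recs {
		fmt.Println(r.Id, r.Score) // highest score first
	}
}
```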

type RerankRecord

type RerankRecord struct {
	Id      string  `json:"id,omitempty"`
	Title   string  `json:"title,omitempty"`
	Content string  `json:"content,omitempty"`
	Score   float32 `json:"score,omitempty"`
}

type RerankingResponse

type RerankingResponse struct {
	Records []*RerankRecord `json:"records,omitempty"`
	Created int64           `json:"created,omitempty"` // sec
	Object  string          `json:"object,omitempty"`
	Model   string          `json:"model,omitempty"`
	Usage   *Usage          `json:"usage,omitempty"`
	Error   *OpenAIError    `json:"error,omitempty"`
}

func RerankAPI

func RerankAPI(ctx context.Context, token string, payload []byte) (RerankingResponse, error)

type ResponseFormat

type ResponseFormat struct {
	Type       string                              `json:"type,omitempty"`
	JSONSchema *header.LLMResponseJSONSchemaFormat `json:"json_schema,omitempty"`
}

ResponseFormat specifies the format of the response.

type ToolCall

type ToolCall struct {
	ID       string       `json:"id"`
	Type     string       `json:"type"`
	Function ToolFunction `json:"function"`
}

ToolCall represents a tool call in an OpenAI response.

type ToolFunction

type ToolFunction struct {
	Name      string `json:"name"`
	Arguments string `json:"arguments"`
}

ToolFunction represents the function details in a tool call.

type TotalCost

type TotalCost struct {
	USD int64 `json:"usd"` // kfpvusd = usd*1_000_000_000 = fpvusd * 1000
}

type Usage

type Usage struct {
	PromptTokens        int64                `json:"prompt_tokens"`
	CompletionTokens    int64                `json:"completion_tokens"`
	TotalTokens         int64                `json:"total_tokens"`
	PromptTokensDetails *PromptTokensDetails `json:"prompt_tokens_details,omitempty"`

	KFpvCostUSD int64 `json:"kfpv_cost_usd"` // our fields not openai
	CostVND     int64 `json:"cost_vnd"`      // our fields not openai
}

Usage mimics the structure of the usage field in an OpenAI Chat Completion response.
