Documentation ¶
Index ¶
Constants ¶
const (
	GPT_35_TURBO_0125     = LLM("gpt-3.5-turbo-0125")
	GPT_35_TURBO_INSTRUCT = LLM("gpt-3.5-turbo-instruct")
	GPT_4                 = LLM("gpt-4")
	GPT_4_32K             = LLM("gpt-4-32k")
	GPT_4O                = LLM("gpt-4o")
	GPT_4O_MINI           = LLM("gpt-4o-mini")
	GPT_O1                = LLM("o1")
	GPT_O3_MINI           = LLM("o3-mini")
)
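These constants are plain values of the string-based LLM type, as the LLM("...") conversions show, so model identifiers that are not predeclared can be written the same way. A minimal sketch; the helper and the gpt-4.1 identifier are only illustrations, not part of the package:

	// chooseModel is a hypothetical helper showing that any model identifier
	// can be expressed as an LLM value, not only the predeclared constants.
	func chooseModel(latest bool) LLM {
		if latest {
			// Illustrative identifier, not one of the constants above.
			return LLM("gpt-4.1")
		}
		return GPT_4O_MINI
	}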
const Version = "llm/openai/v0.5.0"
Variables ¶
var (
	// Set the OpenAI LLM.
	//
	// This option is required.
	WithLLM = opts.ForType[Client, LLM]()

	// Configure the HTTP stack.
	WithHTTP = opts.Use[Client](http.NewStack)

	// Configure the host; api.openai.com is the default.
	WithHost = opts.ForType[Client, ø.Authority]()

	// Configure the API secret key.
	WithSecret = opts.ForName[Client, string]("secret")

	// Set the API secret from the ~/.netrc file.
	WithNetRC = opts.FMap(withNetRC)

	// Configure the system role; the default is `system`.
	// Newer OpenAI LLMs have switched the role to `developer`.
	WithRoleSystem = opts.ForName[Client, string]("roleSystem")
)
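A minimal sketch of composing these options ahead of a New call, assuming each With... variable yields an opts.Option[Client] value and that ø.Authority is convertible from a string (both are assumptions based on the declarations above):

	// Options prepared up front and reused across clients; the option type
	// opts.Option[Client] is an assumption based on the opts constructors above.
	var production = []opts.Option[Client]{
		WithLLM(GPT_4O),                         // required
		WithHost(ø.Authority("api.openai.com")), // default host, shown for completeness
		WithNetRC("api.openai.com"),             // read the secret from ~/.netrc
		WithRoleSystem("developer"),             // newer models expect the developer role
	}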
Functions ¶
This section is empty.
Types ¶
type Client ¶
OpenAI client
func New ¶
Creates an OpenAI Chat (completion) client.
By default the client reads the access token from `~/.netrc`; supply a custom secret with `WithSecret(secret string)` if needed.
The client is configurable using:

	WithSecret(secret string)
	WithNetRC(host string)
	WithLLM(...)
	WithHTTP(opts ...http.Config)
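A minimal usage sketch, assuming New takes the options as variadic arguments and returns (*Client, error); the secret value is a placeholder:

	func ExampleNew() {
		assistant, err := New(
			WithLLM(GPT_4O_MINI), // required
			WithSecret("sk-..."), // placeholder; alternatively WithNetRC("api.openai.com")
		)
		if err != nil {
			// handle configuration error
			return
		}
		_ = assistant
	}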