Documentation
Constants
const (
	GPT_35_TURBO_0125     = LLM("gpt-3.5-turbo-0125")
	GPT_35_TURBO_INSTRUCT = LLM("gpt-3.5-turbo-instruct")
	GPT_4                 = LLM("gpt-4")
	GPT_4_32K             = LLM("gpt-4-32k")
	GPT_4O                = LLM("gpt-4o")
	GPT_4O_MINI           = LLM("gpt-4o-mini")
	GPT_O1                = LLM("o1")
	GPT_O3_MINI           = LLM("o3-mini")
)
const Version = "openai/v0.2.1"
Variables
var (
	// Set the OpenAI LLM.
	//
	// This option is required.
	WithLLM = opts.ForType[Client, LLM]()

	// Configure the HTTP stack.
	WithHTTP = opts.Use[Client](http.NewStack)

	// Configure the host; api.openai.com is the default.
	WithHost = opts.ForType[Client, ø.Authority]()

	// Configure the API secret key.
	WithSecret = opts.ForName[Client, string]("secret")

	// Set the API secret from the ~/.netrc file.
	WithNetRC = opts.FMap(withNetRC)

	// Configure the system role; the default is `system`.
	// The role has switched to `developer` on newer OpenAI LLMs.
	WithRoleSystem = opts.ForName[Client, string]("roleSystem")
)
Functions
This section is empty.
Types

type Client
OpenAI client
func New
Creates an OpenAI Chat (completion) client.

By default the client reads the access token from `~/.netrc`; supply a custom secret via `WithSecret(secret string)` if needed.

The client is configurable using:

WithSecret(secret string)
WithNetRC(host string)
WithModel(...)
WithHTTP(opts ...http.Config)
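For example, a minimal construction might look like the following sketch. The import path `github.com/kshard/chatter/llm/openai` is an assumption, and the option names are taken from the Variables section above:

```go
package main

import (
	"github.com/kshard/chatter/llm/openai" // assumed import path
)

func main() {
	// WithLLM is required; the secret falls back to ~/.netrc
	// unless WithSecret is supplied explicitly.
	client, err := openai.New(
		openai.WithLLM(openai.GPT_4O),
		openai.WithSecret("sk-..."), // hypothetical key
	)
	if err != nil {
		panic(err)
	}
	_ = client
}
```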
func (*Client) Prompt added in v0.0.4
func (c *Client) Prompt(ctx context.Context, prompt []fmt.Stringer, opts ...chatter.Opt) (reply chatter.Reply, err error)
Sends the prompt to the configured LLM and returns the model's reply.
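A hedged usage sketch, following the signature above; the `text` helper type is hypothetical (any `fmt.Stringer` works as a prompt element), and `chatter.Reply` / `chatter.Opt` come from the parent `chatter` package:

```go
// text adapts a plain string to fmt.Stringer for use as a prompt.
type text string

func (t text) String() string { return string(t) }

func ask(ctx context.Context, c *openai.Client) (chatter.Reply, error) {
	// Send a single-element prompt; chatter.Opt values may tune the call.
	return c.Prompt(ctx, []fmt.Stringer{text("Explain monads in one sentence.")})
}
```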
func (*Client) UsedInputTokens added in v0.0.5

func (*Client) UsedReplyTokens added in v0.0.5