openai

package module
v0.0.0-...-7dd1616
Published: Apr 26, 2025 License: MPL-2.0 Imports: 0 Imported by: 4

README


An unofficial community-maintained CLI application for OpenAI.

Installation

$ go install github.com/picatz/openai/cmd/openai@latest

[!IMPORTANT] To use the CLI you must have a valid OPENAI_API_KEY environment variable set. You can create one from your OpenAI account's API keys page.

[!TIP] You can customize which model is used by setting the OPENAI_MODEL environment variable. The default is gpt-4o today, but it may change in the future.
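The CLI is configured entirely through these environment variables. The snippet below is a minimal Go sketch of how a program might resolve them, with the gpt-4o fallback mirroring the documented default; it is an illustration, not the CLI's actual source.

package main

import (
	"fmt"
	"log"
	"os"
)

func main() {
	// OPENAI_API_KEY is required; fail early if it is missing.
	apiKey := os.Getenv("OPENAI_API_KEY")
	if apiKey == "" {
		log.Fatal("OPENAI_API_KEY is not set")
	}

	// OPENAI_MODEL is optional; gpt-4o is the documented default today.
	model := os.Getenv("OPENAI_MODEL")
	if model == "" {
		model = "gpt-4o"
	}

	fmt.Printf("using model %q with an API key of length %d\n", model, len(apiKey))
}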

Usage
$ openai --help
OpenAI CLI

Usage:
  openai [flags]
  openai [command]

Available Commands:
  assistant   Start an interactive assistant chat session
  chat        Chat with the OpenAI API
  completion  Generate the autocompletion script for the specified shell
  help        Help about any command
  image       Generate an image with DALL·E
  responses   Manage the OpenAI Responses API

Flags:
  -h, --help   help for openai

Use "openai [command] --help" for more information about a command.
$ openai assistant --help
Interact with the OpenAI API using the assistant API.

This can be used to create a temporary assistant, or interact with an existing assistant.

Usage:
  openai assistant [flags]
  openai assistant [command]

Examples:
  $ openai assistant      # create a temporary assistant and start chatting
  $ openai assistant chat # same as above
  $ openai assistant create --name "Example" --model "gpt-4-turbo-preview" --description "..." --instructions "..." --code-interpreter --retrieval
  $ openai assistant list
  $ openai assistant info <assistant-id>
  $ openai assistant chat <assistant-id>
  $ openai assistant delete <assistant-id>

Available Commands:
  chat        Start an interactive assistant chat session
  create      Create an assistant
  delete      Delete an assistant
  file        Manage assistant files
  info        Get information about an assistant
  list        List assistants
  update      Update an assistant

Flags:
  -h, --help   help for assistant

Use "openai assistant [command] --help" for more information about a command.

[!TIP]

If run with no arguments, the CLI defaults to the assistant command with an ephemeral session, meaning messages and files are deleted when the session exits.

With Ollama

You can point the CLI at Ollama to run models locally, such as IBM Granite.

$ brew install ollama
...
$ ollama serve &
...
$ ollama run granite3.1-dense:2b
...
$ OPENAI_MODEL="granite3.1-dense:2b" OPENAI_API_URL="http://localhost:11434/v1/" openai chat
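OPENAI_API_URL points the CLI at any OpenAI-compatible endpoint, and Ollama serves one under /v1/. The Go sketch below sends a plain chat completions request to such an endpoint with net/http to show what that override targets; the model name and prompt are placeholders, and this is not the CLI's own client code.

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"os"
	"strings"
)

func main() {
	// Default to the local Ollama endpoint when OPENAI_API_URL is unset.
	baseURL := os.Getenv("OPENAI_API_URL")
	if baseURL == "" {
		baseURL = "http://localhost:11434/v1/"
	}

	// A standard OpenAI chat completions request body.
	body, err := json.Marshal(map[string]any{
		"model": "granite3.1-dense:2b",
		"messages": []map[string]string{
			{"role": "user", "content": "Say hello in one short sentence."},
		},
	})
	if err != nil {
		panic(err)
	}

	url := strings.TrimSuffix(baseURL, "/") + "/chat/completions"
	req, err := http.NewRequest(http.MethodPost, url, bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	// Ollama ignores the key, but the header matches what the hosted API expects.
	req.Header.Set("Authorization", "Bearer "+os.Getenv("OPENAI_API_KEY"))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out))
}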

Documentation

Overview

Package openai provides a client for the OpenAI API.

https://beta.openai.com/docs/api-reference

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Model

type Model = string

Model is a known OpenAI model identifier.

const (
	// ModelAda is the Ada model.
	//
	// Ada is usually the fastest model and can perform tasks like parsing text, address correction and certain kinds of classification
	// tasks that don’t require too much nuance. Ada’s performance can often be improved by providing more context.
	//
	// Good at: Parsing text, simple classification, address correction, keywords
	//
	// Note: Any task performed by a faster model like Ada can be performed by a more powerful model like Curie or Davinci.
	//
	// https://beta.openai.com/docs/models/ada
	ModelAda Model = "ada"

	// ModelBabbage is the Babbage model.
	//
	// Babbage can perform straightforward tasks like simple classification. It’s also quite capable when it comes to Semantic Search
	// ranking how well documents match up with search queries.
	//
	// Good at: Moderate classification, semantic search classification
	//
	// https://beta.openai.com/docs/models/babbage
	ModelBabbage Model = "babbage"

	// ModelCurie is the Curie model.
	//
	// Curie is extremely powerful, yet very fast. While Davinci is stronger when it comes to analyzing complicated text, Curie is
	// quite capable for many nuanced tasks like sentiment classification and summarization. Curie is also quite good at answering
	// questions and performing Q&A and as a general service chatbot.
	//
	// Good at: Language translation, complex classification, text sentiment, summarization
	//
	// https://beta.openai.com/docs/models/curie
	ModelCurie Model = "curie"

	// ModelDavinci is the Davinci model.
	//
	// Davinci is the most capable model family and can perform any task the other models can perform and often with less instruction.
	// For applications requiring a lot of understanding of the content, like summarization for a specific audience and creative content
	// generation, Davinci is going to produce the best results. These increased capabilities require more compute resources, so Davinci
	// costs more per API call and is not as fast as the other models.
	//
	// Another area where Davinci shines is in understanding the intent of text. Davinci is quite good at solving many kinds of logic problems
	// and explaining the motives of characters. Davinci has been able to solve some of the most challenging AI problems involving cause and effect.
	//
	// Good at: Complex intent, cause and effect, summarization for audience
	//
	// https://beta.openai.com/docs/models/davinci
	ModelDavinci Model = "davinci"

	// Most capable GPT-3 model. Can do any task the other models can do, often with higher quality, longer output and better instruction-following.
	// Also supports inserting completions within text.
	ModelTextDavinciEdit003 Model = "text-davinci-003"

	// Very capable, but faster and lower cost than Davinci.
	ModelTextCurie001 Model = "text-curie-001"

	// Capable of straightforward tasks, very fast, and lower cost.
	ModelBabbage001 Model = "text-babbage-001"

	// Capable of very simple tasks, usually the fastest model in the GPT-3 series, and lowest cost.
	ModelAda001 Model = "text-ada-001"

	// Most capable Codex model. Particularly good at translating natural language to code. In addition to completing code, also supports inserting completions within code.
	ModelCodeDavinci002 Model = "code-davinci-002"

	// Almost as capable as Davinci Codex, but slightly faster. This speed advantage may make it preferable for real-time applications.
	ModelCodeCushman001 Model = "code-cushman-001"

	// Used for the CreateEdit API endpoint.
	ModelTextDavinciEdit001 Model = "text-davinci-edit-001"
	ModelCodeDavinciEdit001 Model = "code-davinci-edit-001"

	// https://platform.openai.com/docs/guides/embeddings/embedding-models
	ModelTextEmbeddingAda001 Model = "text-embedding-ada-001"

	// This is the previously recommended model for nearly all embedding use cases.
	//
	// https://openai.com/blog/new-and-improved-embedding-model
	ModelTextEmbeddingAda002 Model = "text-embedding-ada-002"

	// These models are the latest and greatest for embedding use cases.
	//
	// https://openai.com/blog/new-embedding-models-and-api-updates
	ModelTextEmbedding3Small Model = "text-embedding-3-small"
	ModelTextEmbedding3Large Model = "text-embedding-3-large"

	// https://platform.openai.com/docs/api-reference/chat/create#chat/create-model
	ModelGPT35Turbo             Model = "gpt-3.5-turbo"
	ModelGPT35Turbo0301         Model = "gpt-3.5-turbo-0301"
	ModelGPT35Turbo0613         Model = "gpt-3.5-turbo-0613"
	ModelGPT35Turbo1106         Model = "gpt-3.5-turbo-1106"
	ModelGPT35Turbo16k          Model = "gpt-3.5-turbo-16k"
	ModelGPT35Turbo16k0613      Model = "gpt-3.5-turbo-16k-0613"
	ModelGPT35TurboInstruct     Model = "gpt-3.5-turbo-instruct"
	ModelGPT35TurboInstruct0914 Model = "gpt-3.5-turbo-instruct-0914"
	ModelGPT35Turbo0125         Model = "gpt-3.5-turbo-0125"

	ModelGPT4              Model = "gpt-4"
	ModelGPT40314          Model = "gpt-4-0314"
	ModelGPT40613          Model = "gpt-4-0613"
	ModelGPT432K           Model = "gpt-4-32k"
	ModelGPT432K0314       Model = "gpt-4-32k-0314"
	ModelGPT41106Previw    Model = "gpt-4-1106-preview"
	ModelGPT4VisionPreview Model = "gpt-4-vision-preview"
	ModelGPT40125Preview   Model = "gpt-4-0125-preview"
	ModelGPT4TurboPreview  Model = "gpt-4-turbo-preview"

	ModelWhisper1 Model = "whisper-1"

	ModelTTS1       Model = "tts-1"
	ModelTTS11106   Model = "tts-1-1106"
	ModelTTS1HD     Model = "tts-1-hd"
	ModelTTS1HD1106 Model = "tts-1-hd-1106"

	ModelTextModeration007    Model = "text-moderation-007"
	ModelTextModerationLatest Model = "text-moderation-latest"
	ModelTextModerationStable Model = "text-moderation-stable"

	ModelDallE2 Model = "dall-e-2"
	ModelDallE3 Model = "dall-e-3"
)

https://beta.openai.com/docs/models/finding-the-right-model
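Because Model is a type alias for string, the constants above can be passed anywhere a plain model identifier is expected. A small illustrative example, using only constants documented in this package:

package main

import (
	"fmt"

	"github.com/picatz/openai"
)

func main() {
	// Model is an alias for string, so these values interoperate with any
	// API that takes a model identifier as a plain string.
	var chatModel openai.Model = openai.ModelGPT4TurboPreview
	var embeddingModel openai.Model = openai.ModelTextEmbedding3Small

	fmt.Println("chat model:      ", chatModel)      // gpt-4-turbo-preview
	fmt.Println("embedding model: ", embeddingModel) // text-embedding-3-small
}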

Directories

Path Synopsis
cmd
internal
chat/storage
Package storage provides a pluggable storage layer for the chat application.
responses
Package responses implements a minimal client for the OpenAI responses API.
