libaiac

package
v4.2.0
Published: Feb 6, 2024 License: Apache-2.0 Imports: 9 Imported by: 0

Documentation

Index

Constants

const (
	DefaultAWSRegion  = "us-east-1"
	DefaultAWSProfile = "default"
)

Variables

var Version = "development"

Version contains aiac's version string.

Functions

This section is empty.

Types

type BackendName

type BackendName string

BackendName is a string type whose constant values identify the supported backends, a.k.a. LLM providers.

const (
	// BackendOpenAI represents the OpenAI LLM provider.
	BackendOpenAI BackendName = "openai"

	// BackendBedrock represents the Amazon Bedrock LLM provider.
	BackendBedrock BackendName = "bedrock"

	// BackendOllama represents the Ollama LLM provider.
	BackendOllama BackendName = "ollama"
)

func (*BackendName) Decode

func (b *BackendName) Decode(ctx *kong.DecodeContext) error

Decode is used by the kong library to map CLI-provided values to the BackendName type.
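
As a sketch of how this hook is typically wired up, the snippet below embeds BackendName as a flag in a kong grammar. The flag name, help text, default value and module import path (github.com/gofireflyio/aiac/v4) are assumptions for illustration, not part of this documentation.

package main

import (
	"fmt"

	"github.com/alecthomas/kong"
	"github.com/gofireflyio/aiac/v4/libaiac"
)

// cli is a hypothetical kong grammar; the Backend field is parsed via
// BackendName's Decode method.
var cli struct {
	Backend libaiac.BackendName `help:"LLM provider to use." default:"openai"`
}

func main() {
	kong.Parse(&cli)
	fmt.Println("selected backend:", cli.Backend)
}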

type Client

type Client struct {
	// Backend is the backend implementation in charge of communicating with
	// the relevant APIs.
	Backend types.Backend
}

Client provides the main interface for using libaiac. It exposes all the capabilities of the chosen backend.

func NewClient

func NewClient(opts *NewClientOptions) *Client

NewClient constructs a new Client object.
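
A minimal construction sketch, assuming the module import path github.com/gofireflyio/aiac/v4 and an API key taken from the environment:

package main

import (
	"os"

	"github.com/gofireflyio/aiac/v4/libaiac"
)

func main() {
	// Build a client backed by OpenAI; only the fields relevant to the
	// chosen backend need to be set.
	client := libaiac.NewClient(&libaiac.NewClientOptions{
		Backend: libaiac.BackendOpenAI,
		ApiKey:  os.Getenv("OPENAI_API_KEY"),
	})

	_ = client // use Chat, Complete or GenerateCode on the client
}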

func (*Client) Chat

func (client *Client) Chat(model types.Model) types.Conversation

Chat initiates a chat conversation with the provided chat model. Returns a Conversation object with which messages can be sent and received.
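
A sketch of a chat exchange, given a *libaiac.Client constructed as in the NewClient example above (imports of context, fmt and the libaiac package are elided). The Send method on the returned Conversation is defined in the types package rather than on this page, so the (ctx, prompt) form used here is an assumption based on the GenerateCode description.

// chatExample starts a conversation with the backend's default model and
// sends a single prompt. Send's (ctx, prompt) form is assumed.
func chatExample(ctx context.Context, client *libaiac.Client) error {
	conv := client.Chat(client.DefaultModel())

	res, err := conv.Send(ctx, "Generate Terraform for an S3 bucket")
	if err != nil {
		return err
	}

	// Print the whole Response rather than assuming specific field names.
	fmt.Printf("%+v\n", res)
	return nil
}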

func (*Client) Complete

func (client *Client) Complete(
	ctx context.Context,
	model types.Model,
	prompt string,
) (types.Response, error)

Complete issues a request to a code completion model in the backend with the provided string prompt.
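
A sketch of calling Complete, given a client constructed as in the NewClient example; the model is passed in because whether a backend's default model supports text completion depends on the backend. The types package is assumed to be importable from github.com/gofireflyio/aiac/v4/libaiac/types.

// completeExample assumes model refers to a text completion model
// supported by the chosen backend.
func completeExample(ctx context.Context, client *libaiac.Client, model types.Model) error {
	res, err := client.Complete(ctx, model, "Write a Dockerfile for a Go application")
	if err != nil {
		return err
	}

	// Print the whole Response rather than assuming specific field names.
	fmt.Printf("%+v\n", res)
	return nil
}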

func (*Client) DefaultModel

func (client *Client) DefaultModel() types.Model

DefaultModel returns the default model used by the chosen backend implementation.

func (*Client) GenerateCode

func (client *Client) GenerateCode(
	ctx context.Context,
	model types.Model,
	prompt string,
	msgs ...types.Message,
) (res types.Response, err error)

GenerateCode sends the provided prompt to the backend and returns a Response object. It is a convenience wrapper around client.Complete (for text completion models) and client.Chat.Send (for chat models).
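
A sketch, again given a client constructed as in the NewClient example (imports elided); GenerateCode dispatches to Complete or Chat.Send depending on the kind of model supplied.

// generateExample sends a single prompt with no prior conversation
// messages; the prompt text is illustrative.
func generateExample(ctx context.Context, client *libaiac.Client) error {
	res, err := client.GenerateCode(
		ctx,
		client.DefaultModel(),
		"Generate a Kubernetes Deployment for an nginx container",
	)
	if err != nil {
		return err
	}

	fmt.Printf("%+v\n", res)
	return nil
}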

func (*Client) ListModels

func (client *Client) ListModels() []types.Model

ListModels returns a list of all the models supported by the chosen backend implementation.
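
For example, to inspect the supported models and the backend's default (a sketch; imports elided):

// listModelsExample prints every model supported by the chosen backend,
// followed by the backend's default model.
func listModelsExample(client *libaiac.Client) {
	for _, model := range client.ListModels() {
		fmt.Println(model)
	}

	fmt.Println("default:", client.DefaultModel())
}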

type NewClientOptions

type NewClientOptions struct {
	// Backend is the name of the backend to use. Use the available constants,
	// e.g. BackendOpenAI, BackendBedrock or BackendOllama. Defaults to openai.
	Backend BackendName

	// ApiKey is the OpenAI API key. Required if using OpenAI.
	ApiKey string

	// URL can be used to change the OpenAI API endpoint, for example in order
	// to use Azure OpenAI services. Defaults to OpenAI's standard API endpoint.
	URL string

	// APIVersion is the version of the OpenAI API to use. Unset by default.
	APIVersion string

	// AWSRegion is the name of the region to use. Defaults to "us-east-1".
	AWSRegion string

	// AWSProfile is the name of the AWS profile to use. Defaults to "default".
	AWSProfile string

	// OllamaURL is the URL to the Ollama API server, including the /api path
	// prefix. Defaults to http://localhost:11434/api.
	OllamaURL string
}

NewClientOptions contains all the parameters accepted by the NewClient constructor.
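
Which option fields matter depends on the chosen backend. The following sketch shows plausible option sets for the Bedrock and Ollama backends; all values are illustrative.

// newBackendClients is a sketch showing option sets for the non-OpenAI
// backends.
func newBackendClients() (bedrock, ollama *libaiac.Client) {
	// Amazon Bedrock: AWSRegion and AWSProfile fall back to
	// DefaultAWSRegion and DefaultAWSProfile when left empty.
	bedrock = libaiac.NewClient(&libaiac.NewClientOptions{
		Backend:    libaiac.BackendBedrock,
		AWSRegion:  libaiac.DefaultAWSRegion,
		AWSProfile: libaiac.DefaultAWSProfile,
	})

	// Ollama: OllamaURL defaults to http://localhost:11434/api when empty.
	ollama = libaiac.NewClient(&libaiac.NewClientOptions{
		Backend:   libaiac.BackendOllama,
		OllamaURL: "http://localhost:11434/api",
	})

	return bedrock, ollama
}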

