Documentation ¶
Overview ¶
Package provider provides the building blocks to create LLM API clients.
Index ¶
- Constants
- type ClientFunc
- type ClientOptions
- type Func
- func WithMaxToken(maxToken int) Func
- func WithModel(model string) Func
- func WithResponseBody(requestBody any) Func
- func WithSeed(seed int) Func
- func WithStream(stream bool) Func
- func WithSystemMessages(systemMessage ...string) Func
- func WithTemperature(temperature float64) Func
- func WithTopK(topK int) Func
- func WithTopP(topP float64) Func
- func WithUserMessages(userMessage ...string) Func
- type IMeta
- type IMetrics
- type IProvider
- type Map
- type Options
- type Provider
Constants ¶
const (
Type = "provider"
)
Type is the type of the entity regarding the framework. It is used, for example, to identify the entity in logs, metrics, and traces.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type ClientFunc ¶
type ClientFunc func(o *ClientOptions) error
ClientFunc allows setting provider options.
func WithDefaulModel ¶ added in v0.0.2
func WithDefaulModel(model string) ClientFunc
WithDefaulModel sets the default model.
func WithEndpoint ¶
func WithEndpoint(endpoint string) ClientFunc
WithEndpoint sets the provider endpoint.
type ClientOptions ¶
type ClientOptions struct {
// Endpoint to reach the provider.
Endpoint string `json:"endpoint" validate:"required"`
// Model default model to be used.
Model string `json:"model,omitempty"`
// Token to authenticate against the provider.
Token string `json:"-"`
}
ClientOptions for the provider.
type Func ¶
Func allows setting options.
func WithResponseBody ¶ added in v0.0.2
WithResponseBody sets the responseBody option.
func WithSystemMessages ¶
WithSystemMessages sets the systemMessages option.
func WithTemperature ¶
WithTemperature sets the temperature option.
func WithUserMessages ¶
WithUserMessages sets the userMessages option.
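Because every With* setter returns a Func, a set of options can be bundled once and reused across Completion calls. A minimal sketch, assuming an import alias of provider for this package; the import path, helper names, and all values are illustrative:

package providerexample

import (
	"context"

	provider "example.com/your/module/provider" // placeholder import path
)

// defaults is a reusable bundle of Func options; the values are illustrative.
var defaults = []provider.Func{
	provider.WithTemperature(0.2),
	provider.WithMaxToken(512),
	provider.WithSystemMessages("Answer briefly."),
}

// ask applies the shared defaults plus a per-call user message.
func ask(ctx context.Context, p provider.IProvider, q string) (string, error) {
	return p.Completion(ctx, append(defaults, provider.WithUserMessages(q))...)
}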
type IMeta ¶
type IMeta interface {
GetClient() any
// GetLogger returns the logger.
GetLogger() sypl.ISypl
// GetName returns the provider name.
GetName() string
// GetType returns its type.
GetType() string
}
IMeta definition.
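A minimal sketch of consuming IMeta. The helper name, the import path, and the *http.Client type assertion are assumptions for illustration; GetClient returns any, and its concrete type is provider-specific.

package providerexample

import (
	"fmt"
	"net/http"

	provider "example.com/your/module/provider" // placeholder import path
)

// describe prints the identifying metadata of any IMeta implementation.
func describe(m provider.IMeta) {
	fmt.Printf("entity %s/%s\n", m.GetType(), m.GetName())

	// Assumption: the underlying client may be an *http.Client; adjust to the
	// concrete type the provider actually uses.
	if c, ok := m.GetClient().(*http.Client); ok {
		fmt.Println("underlying client timeout:", c.Timeout)
	}
}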
type IMetrics ¶
type IMetrics interface {
// GetCounterCompletion returns the completion metric.
GetCounterCompletion() *expvar.Int
// GetCounterCompletionFailed returns the failed completion metric.
GetCounterCompletionFailed() *expvar.Int
}
IMetrics definition.
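A short sketch of reading the counters; the helper name and the import path are assumptions. Value is the standard library expvar.Int accessor.

package providerexample

import (
	"fmt"

	provider "example.com/your/module/provider" // placeholder import path
)

// reportMetrics prints the current completion counters of any IMetrics
// implementation.
func reportMetrics(m provider.IMetrics) {
	fmt.Println("completions:       ", m.GetCounterCompletion().Value())
	fmt.Println("failed completions:", m.GetCounterCompletionFailed().Value())
}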
type IProvider ¶
type IProvider interface {
IMeta
IMetrics
// Completion generates a completion using the provider API.
// Optionally pass WithResponseBody to unmarshal the response body.
// It will always return the original, unparsed response body, if no error.
//
// NOTE: Not all options are available for all providers.
Completion(ctx context.Context, options ...Func) (string, error)
}
IProvider defines what a provider does.
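A hedged end-to-end sketch of calling Completion with the option setters listed in the index. The provider name, endpoint, model, messages, and the map used with WithResponseBody are illustrative assumptions, and the import path is a placeholder.

package main

import (
	"context"
	"fmt"
	"log"

	provider "example.com/your/module/provider" // placeholder import path
)

func main() {
	// Construction is covered under New below; all values are illustrative.
	p, err := provider.New(
		"openai", // hypothetical provider name
		provider.WithEndpoint("https://api.example.com/v1/chat/completions"), // assumed endpoint
	)
	if err != nil {
		log.Fatal(err)
	}

	// The provider-specific response shape is not known here, so unmarshal
	// into a generic map.
	var parsed map[string]any

	raw, err := p.Completion(
		context.Background(),
		provider.WithModel("example-model"), // hypothetical model name
		provider.WithSystemMessages("You are concise."),
		provider.WithUserMessages("Explain Go contexts in one sentence."),
		provider.WithTemperature(0.2),
		provider.WithMaxToken(256),
		provider.WithResponseBody(&parsed), // optional: unmarshal the response body
	)
	if err != nil {
		log.Fatal(err)
	}

	// raw always carries the original, unparsed response body on success.
	fmt.Println(raw)
	fmt.Println(parsed)
}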
type Map ¶ added in v0.0.5
Map is a map of providers keyed by name.
func (Map) Completion ¶ added in v0.0.6
Completion calls the Completion of each provider in the map concurrently. The responses are keyed by provider name.
type Options ¶
type Options struct {
// Model is the model to be used.
Model string `json:"model" validate:"required"`
// UserMessages is the user role messages.
UserMessages []string `json:"userMessages" validate:"required"`
// MaxTokens defines the maximum number of tokens in the response. Defaults
// to 0, which means no limit.
MaxTokens int `json:"maxToken,omitempty" validate:"gte=0"`
// ResponseBody is the target the response body is unmarshaled into.
ResponseBody any `json:"requestBody"`
// Seed: if set, the LLM will make a best effort to sample deterministically,
// such that repeated requests with the same seed and parameters should
// return the same result. Defaults to unset (0), which means no determinism.
//
// NOTE: Determinism is not guaranteed.
Seed int `json:"seed,omitempty" validate:"gte=0"`
// Stream determines whether the response should be streamed. Defaults to
// false, which means no streaming.
Stream bool `json:"stream"`
// SystemMessages is the system role messages.
SystemMessages []string `json:"systemMessages,omitempty"`
// Temperature to use, between 0 and 2. Higher values like 0.8 make the
// output more random, while lower values like 0.2 make it more focused
// and deterministic. Defaults to 1.0.
//
// NOTE: It's generally recommended to alter this OR TopP, but not both!
Temperature float64 `json:"temperature,omitempty" validate:"gte=0"`
// TopK is the number of highest-probability vocabulary tokens to keep for
// sampling. Defaults to 0 (unset), which means no restriction.
TopK int `json:"topK,omitempty" validate:"gte=0"`
// TopP is an alternative to sampling with temperature, called nucleus
// sampling, where the model considers the results of the tokens with top_p
// probability mass. So 0.1 means only the tokens comprising the top 10%
// probability mass are considered. Defaults to 1.0, which means all tokens
// are considered.
//
// NOTE: It's generally recommended to alter this OR Temperature, but not
// both!
TopP float64 `json:"topP,omitempty" validate:"gte=0"`
}
Options for operations.
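A small sketch of the minimum required fields per the validate tags above; the model name and message are illustrative, and the import path is a placeholder. All other fields fall back to the documented defaults (no token limit, no streaming, Temperature 1.0, TopP 1.0, Seed and TopK unset).

package providerexample

import provider "example.com/your/module/provider" // placeholder import path

// baseline sets only the two required fields, Model and UserMessages.
var baseline = provider.Options{
	Model:        "example-model", // hypothetical model name
	UserMessages: []string{"Hello!"},
}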
func NewOptionsFrom ¶
NewOptionsFrom processes the given options and validates them against the default options.
type Provider ¶
type Provider struct {
// Logger.
Logger sypl.ISypl `json:"-" validate:"required"`
// Name of the provider type.
Name string `json:"name" validate:"required,lowercase,gte=1"`
// A provider may have the following...
// Endpoint to reach the provider.
Endpoint string `json:"endpoint,omitempty"`
// DefaultModel default model to be used.
DefaultModel string `json:"model,omitempty"`
// Token to authenticate against the provider.
Token string `json:"-"`
// contains filtered or unexported fields
}
Provider definition.
func New ¶
func New(name string, options ...ClientFunc) (*Provider, error)
New returns a new provider.
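A hedged construction sketch: the provider name, endpoint, and default model are illustrative, token/authentication handling is not shown, and the import path is a placeholder.

package main

import (
	"log"

	provider "example.com/your/module/provider" // placeholder import path
)

func main() {
	// Name must be lowercase per the Provider.Name validation above.
	p, err := provider.New(
		"anthropic", // hypothetical provider name
		provider.WithEndpoint("https://api.example.com/v1/messages"), // assumed endpoint
		provider.WithDefaulModel("example-model"),                    // sets the default model
	)
	if err != nil {
		log.Fatal(err)
	}

	log.Printf("created provider %q (endpoint %s)", p.Name, p.Endpoint)
}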
func (*Provider) GetCounterCompletion ¶
GetCounterCompletion returns the completion metric.
func (*Provider) GetCounterCompletionFailed ¶
GetCounterCompletionFailed returns the failed completion metric.
func (*Provider) GetCounterCounted ¶
GetCounterCounted returns the metric.