ldai

package module
v0.6.0
Published: Feb 7, 2025 License: Apache-2.0 Imports: 10 Imported by: 0

README

LaunchDarkly Server-side AI SDK for Go

⛔️⛔️⛔️⛔️

[!CAUTION] This library is an alpha version and should not be considered ready for production use while this message is visible.

☝️☝️☝️☝️☝️☝️


LaunchDarkly overview

LaunchDarkly is a feature management platform that serves trillions of feature flags daily to help teams build better software, faster. Get started using LaunchDarkly today!


Getting started

Import the module:

import (
	"time"

	"github.com/launchdarkly/go-sdk-common/v3/ldcontext"
	ld "github.com/launchdarkly/go-server-sdk/v7"
	"github.com/launchdarkly/go-server-sdk/ldai"
)

Configure the base LaunchDarkly Server SDK:

sdkClient, _ := ld.MakeClient("your-sdk-key", 5*time.Second)

Instantiate the AI client, passing in the base Server SDK:

aiClient, err := ldai.NewClient(sdkClient)

Fetch a model configuration for a specific LaunchDarkly context:

// The default value 'ldai.Disabled()' will be returned if LaunchDarkly is unavailable or the config
// cannot be fetched. To customize the default value, use ldai.NewConfig().
config, tracker := aiClient.Config("your-model-key", ldcontext.New("user-key"), ldai.Disabled(), nil)

// Access the methods on config, and optionally use the returned tracker to generate analytic events
// related to usage of the model config. 
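
As a sketch of what comes next (callProvider is a hypothetical stand-in for whatever client you use to reach your model provider), usage might look like:

if config.Enabled() {
	// Forward the config's messages and model name to your provider of choice.
	output, err := callProvider(config.ModelName(), config.Messages())
	if err != nil {
		_ = tracker.TrackError()   // record an unsuccessful evaluation
	} else {
		_ = tracker.TrackSuccess() // record a successful evaluation
		_ = output
	}
}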

Learn more

Read our documentation for in-depth instructions on configuring and using LaunchDarkly. You can also head straight to the complete reference guide for this SDK.

Contributing

We encourage pull requests and other contributions from the community. Check out our contributing guidelines for instructions on how to contribute to this library.

About LaunchDarkly

  • LaunchDarkly is a continuous delivery platform that provides feature flags as a service and allows developers to iterate quickly and safely. We allow you to easily flag your features and manage them from the LaunchDarkly dashboard. With LaunchDarkly, you can:
    • Roll out a new feature to a subset of your users (like a group of users who opt-in to a beta tester group), gathering feedback and bug reports from real-world use cases.
    • Gradually roll out a feature to an increasing percentage of users, and track the effect that the feature has on key metrics (for instance, how likely is a user to complete a purchase if they have feature A versus feature B?).
    • Turn off a feature that you realize is causing performance problems in production, without needing to re-deploy, or even restart the application with a changed configuration file.
    • Grant access to certain features based on user attributes, like payment plan (e.g., users on the ‘gold’ plan get access to more features than users on the ‘silver’ plan). Disable parts of your application to facilitate maintenance, without taking everything offline.
  • LaunchDarkly provides feature flag SDKs for a wide variety of languages and technologies. Read our documentation for a complete list.
  • Explore LaunchDarkly

Documentation

Overview

Package ldai contains an AI SDK suitable for usage with generative AI applications.

Index

Constants

const Version = "0.6.0" // {{ x-release-please-version }}

Version is the current version string of the ldai package. This is updated by our release scripts.

Variables

This section is empty.

Functions

This section is empty.

Types

type Client

type Client struct {
	// contains filtered or unexported fields
}

Client is the main entrypoint for the AI SDK. A client can be used to obtain an AI config from LaunchDarkly. Unless otherwise noted, the Client's methods are not safe for concurrent use.

func NewClient

func NewClient(sdk ServerSDK) (*Client, error)

NewClient creates a new AI Client. The provided SDK interface must not be nil. The client will use the provided SDK's loggers to log warnings and errors.

func (*Client) Config

func (c *Client) Config(
	key string,
	context ldcontext.Context,
	defaultValue Config,
	variables map[string]interface{},
) (Config, *Tracker)

Config evaluates an AI config named by a given key for the given context.

The config's messages will undergo Mustache template interpolation using the provided variables, which may be nil. If the config cannot be evaluated or LaunchDarkly is unreachable, the default value is returned. Note that the messages in the default will not undergo template interpolation.

To send analytic events to LaunchDarkly related to the AI config, call methods on the returned Tracker.
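
For example, a minimal sketch of passing interpolation variables (the variable name "assistantName" and its value are illustrative):

// "assistantName" can then be referenced from the config's message templates, e.g. {{assistantName}}.
vars := map[string]interface{}{"assistantName": "Ava"}
config, tracker := aiClient.Config(
	"your-model-key",
	ldcontext.New("user-key"),
	ldai.Disabled(),
	vars,
)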

type Config

type Config struct {
	// contains filtered or unexported fields
}

Config represents an AI config.

func Disabled

func Disabled() Config

Disabled is a helper that returns a built Config that is disabled and contains no messages.

func (*Config) AsLdValue

func (c *Config) AsLdValue() ldvalue.Value

AsLdValue is used internally.

func (*Config) CustomModelParam

func (c *Config) CustomModelParam(key string) (ldvalue.Value, bool)

CustomModelParam returns the custom model parameter named by key. The second parameter is true if the key exists.
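
For example ("maxTokens" is an illustrative key; use whatever custom parameters your config actually defines):

if v, ok := config.CustomModelParam("maxTokens"); ok {
	// v is an ldvalue.Value; convert it as appropriate for the parameter.
	maxTokens := v.IntValue()
	_ = maxTokens
}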

func (*Config) Enabled

func (c *Config) Enabled() bool

Enabled returns whether the config is enabled.

func (*Config) Messages

func (c *Config) Messages() []datamodel.Message

Messages returns the messages defined by the config. The series of messages may be passed to an AI model provider.

func (*Config) ModelName added in v0.2.0

func (c *Config) ModelName() string

ModelName returns the model name associated with the config.

func (*Config) ModelParam

func (c *Config) ModelParam(key string) (ldvalue.Value, bool)

ModelParam returns the model parameter named by key. The second parameter is true if the key exists.

func (*Config) ProviderName added in v0.2.0

func (c *Config) ProviderName() string

ProviderName returns the provider name associated with the config.

func (*Config) VariationKey added in v0.3.0

func (c *Config) VariationKey() string

VariationKey is used internally by LaunchDarkly.

func (*Config) Version added in v0.6.0

func (c *Config) Version() int

Version is used internally by LaunchDarkly.

type ConfigBuilder

type ConfigBuilder struct {
	// contains filtered or unexported fields
}

ConfigBuilder is used to define a default AI config, returned when LaunchDarkly is unreachable or there is an error evaluating the config.

func NewConfig

func NewConfig() *ConfigBuilder

NewConfig returns a new ConfigBuilder. By default, the config is disabled.
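
A sketch of building a default config to pass to Client.Config (the model name and parameter values are illustrative):

defaultConfig := ldai.NewConfig().
	Enable().
	WithModelName("my-fallback-model").
	WithModelParam("temperature", ldvalue.Float64(0.2)).
	Build()

config, tracker := aiClient.Config("your-model-key", ldcontext.New("user-key"), defaultConfig, nil)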

func (*ConfigBuilder) Build

func (cb *ConfigBuilder) Build() Config

Build creates a Config from the current builder state.

func (*ConfigBuilder) Disable

func (cb *ConfigBuilder) Disable() *ConfigBuilder

Disable disables the config.

func (*ConfigBuilder) Enable

func (cb *ConfigBuilder) Enable() *ConfigBuilder

Enable enables the config.

func (*ConfigBuilder) WithCustomModelParam

func (cb *ConfigBuilder) WithCustomModelParam(key string, value ldvalue.Value) *ConfigBuilder

WithCustomModelParam sets a custom model parameter named by key to the given value. If the key already exists, it will be overwritten.

func (*ConfigBuilder) WithEnabled

func (cb *ConfigBuilder) WithEnabled(enabled bool) *ConfigBuilder

WithEnabled sets whether the config is enabled. See also Enable and Disable.

func (*ConfigBuilder) WithMessage

func (cb *ConfigBuilder) WithMessage(content string, role datamodel.Role) *ConfigBuilder

WithMessage appends a message to the config with the given role.

func (*ConfigBuilder) WithModelName added in v0.2.0

func (cb *ConfigBuilder) WithModelName(modelName string) *ConfigBuilder

WithModelName sets the model name associated with the config.

func (*ConfigBuilder) WithModelParam

func (cb *ConfigBuilder) WithModelParam(key string, value ldvalue.Value) *ConfigBuilder

WithModelParam sets a model parameter named by key to the given value. If the key already exists, it will be overwritten. Model parameters are generally set by LaunchDarkly; for custom parameters not recognized by LaunchDarkly, use WithCustomModelParam.

func (*ConfigBuilder) WithProviderName added in v0.2.0

func (cb *ConfigBuilder) WithProviderName(providerName string) *ConfigBuilder

WithProviderName sets the provider name associated with the config.

type EventSink

type EventSink interface {
	// TrackMetric sends a named analytic event to LaunchDarkly relevant to a particular context, and containing a
	// metric value and additional data.
	TrackMetric(
		eventName string,
		context ldcontext.Context,
		metricValue float64,
		data ldvalue.Value,
	) error
}

EventSink represents the Tracker's requirements for delivering analytic events. This is generally satisfied by the LaunchDarkly SDK's TrackMetric method.

type Feedback

type Feedback string

Feedback represents the feedback provided by a user for a model evaluation.

const (
	// FeedbackPositive is positive feedback.
	FeedbackPositive Feedback = "positive"
	// FeedbackNegative is negative feedback.
	FeedbackNegative Feedback = "negative"
)

type Metrics

type Metrics struct {
	// Latency is the latency of the request.
	Latency time.Duration
	// TimeToFirstToken is the time to the first token of the streamed response.
	TimeToFirstToken time.Duration
}

Metrics represents the metrics returned by a model provider for a specific request.

type ProviderResponse

type ProviderResponse struct {
	// Usage is the token usage.
	Usage TokenUsage
	// Metrics is the request metrics.
	Metrics Metrics
}

ProviderResponse represents the response from a model provider for a specific request.

type ServerSDK

type ServerSDK interface {
	JSONVariation(
		key string,
		context ldcontext.Context,
		defaultVal ldvalue.Value,
	) (ldvalue.Value, error)
	Loggers() interfaces.LDLoggers
	TrackMetric(
		eventName string,
		context ldcontext.Context,
		metricValue float64,
		data ldvalue.Value,
	) error
}

ServerSDK defines the required methods for the AI SDK to interact with LaunchDarkly. These methods are satisfied by the LaunchDarkly Go Server SDK.

type Stopwatch

type Stopwatch interface {
	// Start starts the stopwatch.
	Start()
	// Stop stops the stopwatch and returns the duration since Start was called.
	Stop() time.Duration
}

Stopwatch is used to measure the duration of a task. Start will always be called before Stop. If an implementation is not provided, the Tracker uses a default implementation that delegates to time.Now and time.Since.

type TokenUsage

type TokenUsage struct {
	// Total is the total number of tokens used.
	Total int
	// Input is the number of input tokens used.
	Input int
	// Output is the number of output tokens used.
	Output int
}

TokenUsage represents the token usage returned by a model provider for a specific request.

func (TokenUsage) Set

func (t TokenUsage) Set() bool

Set returns true if any of the fields are non-zero.

type Tracker

type Tracker struct {
	// contains filtered or unexported fields
}

Tracker is used to track metrics for AI config evaluation. Unless otherwise noted, the Tracker's methods are not safe for concurrent use.

func (*Tracker) TrackDuration

func (t *Tracker) TrackDuration(dur time.Duration) error

TrackDuration tracks the duration of a task. For example, the duration of a model evaluation request may be tracked here. See also TrackRequest. The duration in milliseconds must fit within a float64.
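
For example, timing a provider call by hand (the surrounding request code is omitted):

start := time.Now()
// ... perform the model request ...
_ = tracker.TrackDuration(time.Since(start))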

func (*Tracker) TrackError added in v0.4.0

func (t *Tracker) TrackError() error

TrackError tracks an unsuccessful model evaluation.

func (*Tracker) TrackFeedback

func (t *Tracker) TrackFeedback(feedback Feedback) error

TrackFeedback tracks the feedback provided by a user for a model evaluation. If the feedback is not FeedbackPositive or FeedbackNegative, returns an error and does not track anything.
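
For example:

if err := tracker.TrackFeedback(ldai.FeedbackPositive); err != nil {
	// Handle the error; an unrecognized feedback value is one possible cause.
}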

func (*Tracker) TrackRequest

func (t *Tracker) TrackRequest(task func(c *Config) (ProviderResponse, error)) (ProviderResponse, error)

TrackRequest tracks metrics for a model evaluation request. The task function should return a ProviderResponse which can be used to specify request metrics and token usage. All fields of the returned ProviderResponse are optional.

The task function will be passed the current AI config, which can be used to obtain any parameters or messages relevant to the request.

If the task returns an error, then the request is not considered successful and no metrics are tracked. Otherwise, the following metrics are tracked:

  1. Successful model evaluation.
  2. Any metrics that were set in the ProviderResponse. If Latency was not set in the ProviderResponse's Metrics field, an automatically measured duration is tracked instead.
  3. Any token usage that was set in the ProviderResponse.
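
A minimal sketch of wrapping a provider call (the provider call and the usage and latency values are illustrative placeholders for whatever your model client returns):

response, err := tracker.TrackRequest(func(c *ldai.Config) (ldai.ProviderResponse, error) {
	// Call your model provider here, using c.Messages(), c.ModelName(), and any model parameters.
	return ldai.ProviderResponse{
		Usage:   ldai.TokenUsage{Input: 120, Output: 48, Total: 168},
		Metrics: ldai.Metrics{Latency: 250 * time.Millisecond},
	}, nil
})
if err != nil {
	// The task returned an error; no metrics were tracked.
}
_ = response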

func (*Tracker) TrackSuccess

func (t *Tracker) TrackSuccess() error

TrackSuccess tracks a successful model evaluation.

func (*Tracker) TrackTimeToFirstToken added in v0.5.0

func (t *Tracker) TrackTimeToFirstToken(dur time.Duration) error

TrackTimeToFirstToken tracks the time to the first token of the streamed response.

func (*Tracker) TrackUsage

func (t *Tracker) TrackUsage(usage TokenUsage) error

TrackUsage tracks the token usage for a model evaluation.
