Documentation
Index
- func AppendCustomAttributesToEvent(cw *ClientWrapper, data map[string]interface{}) map[string]interface{}
- func GetInput(any interface{}) any
- func NRCreateChatCompletionMessage(txn *newrelic.Transaction, app *newrelic.Application, ...)
- func NRCreateChatCompletionMessageInput(txn *newrelic.Transaction, app *newrelic.Application, ...) int
- func NRCreateChatCompletionMessageStream(app *newrelic.Application, uuid uuid.UUID, sw *ChatCompletionStreamWrapper, ...)
- func NRCreateEmbedding(cw *ClientWrapper, req openai.EmbeddingRequest, app *newrelic.Application) (openai.EmbeddingResponse, error)
- func TokenCountingHelper(app *newrelic.Application, message openai.ChatCompletionMessage, model string) (numTokens int, tokensCounted bool)
- type ChatCompletionResponseWrapper
- type ChatCompletionStreamWrapper
- type ClientWrapper
- type ConfigWrapper
- type OpenAIClient
Constants
This section is empty.
Variables
This section is empty.
Functions
func AppendCustomAttributesToEvent
func AppendCustomAttributesToEvent(cw *ClientWrapper, data map[string]interface{}) map[string]interface{}
func GetInput
func GetInput(any interface{}) any
If multiple messages are sent, only the first message is used for the "content" field.
func NRCreateChatCompletionMessage
func NRCreateChatCompletionMessage(txn *newrelic.Transaction, app *newrelic.Application, resp openai.ChatCompletionResponse, uuid uuid.UUID, cw *ClientWrapper, sequence int, req openai.ChatCompletionRequest)
NRCreateChatCompletionMessage captures the completion response messages and records a custom event in New Relic for each message. The completion response messages are the model's replies to the request messages that were logged in NRCreateChatCompletionMessageInput. The sequence of the messages is calculated by logging each of the request messages first, then incrementing the sequence for each response message. If the token count callback is set, the token count is calculated for each message and added to the custom event; if not, no token count is added.
func NRCreateChatCompletionMessageInput
func NRCreateChatCompletionMessageInput(txn *newrelic.Transaction, app *newrelic.Application, req openai.ChatCompletionRequest, inputuuid uuid.UUID, cw *ClientWrapper) int
NRCreateChatCompletionMessageInput captures the initial request messages and records a custom event in New Relic for each one, similarly to NRCreateChatCompletionMessage, but only for the request messages. It returns the sequence number of the messages sent in the request, which is used to calculate the sequence of the response messages.
func NRCreateChatCompletionMessageStream
func NRCreateChatCompletionMessageStream(app *newrelic.Application, uuid uuid.UUID, sw *ChatCompletionStreamWrapper, cw *ClientWrapper, sequence int)
NRCreateChatCompletionMessageStream is identical to NRCreateChatCompletionMessage, but for streaming responses. It is invoked only when the stream is closed.
func NRCreateEmbedding
func NRCreateEmbedding(cw *ClientWrapper, req openai.EmbeddingRequest, app *newrelic.Application) (openai.EmbeddingResponse, error)
NRCreateEmbedding is a wrapper for the OpenAI CreateEmbedding method. If AI Monitoring is disabled, the wrapped function still calls the OpenAI CreateEmbedding method and returns the response with no New Relic instrumentation.
func TokenCountingHelper
func TokenCountingHelper(app *newrelic.Application, message openai.ChatCompletionMessage, model string) (numTokens int, tokensCounted bool)
TokenCountingHelper calculates tokens using the LLMTokenCountCallback. To calculate the total tokens of a message, it factors in the Content, Role, and Name (if it exists).
Types
type ChatCompletionResponseWrapper
type ChatCompletionResponseWrapper struct {
	ChatCompletionResponse openai.ChatCompletionResponse
	TraceID                string
}
ChatCompletionResponseWrapper wraps the ChatCompletionResponse returned from NRCreateChatCompletion. It also includes the TraceID of the transaction, which is used to link a chat response with its feedback.
func NRCreateChatCompletion
func NRCreateChatCompletion(cw *ClientWrapper, req openai.ChatCompletionRequest, app *newrelic.Application) (ChatCompletionResponseWrapper, error)
NRCreateChatCompletion is a wrapper for the OpenAI CreateChatCompletion method. If AI Monitoring is disabled, the wrapped function still calls the OpenAI CreateChatCompletion method and returns the response with no New Relic instrumentation. Otherwise, it calls NRCreateChatCompletionSummary to capture the request and response data, and returns a ChatCompletionResponseWrapper containing the response and the TraceID of the transaction. The trace ID is used to link the chat response with its feedback via a call to SendFeedback(). The response itself is the same as that of the OpenAI CreateChatCompletion method and can be accessed by calling resp.ChatCompletionResponse.
func NRCreateChatCompletionSummary
func NRCreateChatCompletionSummary(txn *newrelic.Transaction, app *newrelic.Application, cw *ClientWrapper, req openai.ChatCompletionRequest) ChatCompletionResponseWrapper
NRCreateChatCompletionSummary captures the request data for a chat completion request. A new segment is created for the request, and the response data is timed and captured. Custom attributes are added to the event if they were set via client.AddCustomAttributes(). After closing out the custom event for the chat completion summary, the function calls NRCreateChatCompletionMessageInput and NRCreateChatCompletionMessage to capture the request messages.
type ChatCompletionStreamWrapper
type ChatCompletionStreamWrapper struct {
	StreamingData map[string]interface{}
	TraceID       string
	// contains filtered or unexported fields
}
ChatCompletionStreamWrapper wraps the ChatCompletionStream returned from NRCreateChatCompletionStream. It contains attributes that get populated during the streaming process.
func NRCreateChatCompletionStream
func NRCreateChatCompletionStream(cw *ClientWrapper, ctx context.Context, req openai.ChatCompletionRequest, app *newrelic.Application) (*ChatCompletionStreamWrapper, error)
NRCreateChatCompletionStream is similar to NRCreateChatCompletionSummary, but for streaming responses. It returns a custom wrapper with a stream that can be used to receive messages.
Example usage:
	ctx := context.Background()
	stream, err := nropenai.NRCreateChatCompletionStream(client, ctx, req, app)
	if err != nil {
		panic(err)
	}
	for {
		var response openai.ChatCompletionStreamResponse
		response, err = stream.Recv()
		if errors.Is(err, io.EOF) {
			fmt.Println("\nStream finished")
			break
		}
		if err != nil {
			fmt.Printf("\nStream error: %v\n", err)
			return
		}
		fmt.Print(response.Choices[0].Delta.Content)
	}
	stream.Close()
It is important to call stream.Close() after the stream has been used, as it closes the stream and sends the event to New Relic. Additionally, custom attributes can be added to the client using client.AddCustomAttributes(map[string]interface{}), just like in NRCreateChatCompletionSummary.
func (*ChatCompletionStreamWrapper) Close
func (w *ChatCompletionStreamWrapper) Close()
Close closes the stream and sends the captured event to New Relic.
func (*ChatCompletionStreamWrapper) Recv
func (w *ChatCompletionStreamWrapper) Recv() (openai.ChatCompletionStreamResponse, error)
Recv wraps the OpenAI streaming Recv() method and captures the response messages in the wrapper as they are received. Once the stream is done, calling Close() sends the captured data to New Relic.
type ClientWrapper
type ClientWrapper struct {
	Client OpenAIClient
	// Set of custom attributes that get tied to all LLM events
	CustomAttributes map[string]interface{}
}
ClientWrapper wraps the OpenAI client with custom attributes that can be set for all LLM events.
func NRNewClientWithConfig
func NRNewClientWithConfig(config *ConfigWrapper) *ClientWrapper
NRNewClientWithConfig creates a new OpenAI API client for the specified config.
func (*ClientWrapper) AddCustomAttributes
func (cw *ClientWrapper) AddCustomAttributes(attributes map[string]interface{})
AddCustomAttributes adds custom attributes to the ClientWrapper.
type ConfigWrapper
type ConfigWrapper struct {
	Config *openai.ClientConfig
}
ConfigWrapper wraps the OpenAI configuration.
func NRDefaultAzureConfig
func NRDefaultAzureConfig(apiKey, baseURL string) *ConfigWrapper
NRDefaultAzureConfig creates a default Azure OpenAI configuration for the given API key and base URL.
type OpenAIClient
type OpenAIClient interface {
	CreateChatCompletion(ctx context.Context, request openai.ChatCompletionRequest) (response openai.ChatCompletionResponse, err error)
	CreateChatCompletionStream(ctx context.Context, request openai.ChatCompletionRequest) (stream *openai.ChatCompletionStream, err error)
	CreateEmbeddings(ctx context.Context, conv openai.EmbeddingRequestConverter) (res openai.EmbeddingResponse, err error)
}
OpenAIClient is any type that can invoke an OpenAI model with a request.