Documentation ¶
Overview ¶
Package nrawsbedrock instruments AI model invocation requests made by the https://github.com/aws/aws-sdk-go-v2/service/bedrockruntime library.
Specifically, this provides instrumentation for the InvokeModel and InvokeModelWithResponseStream bedrock client API library functions.
To use this integration, enable the New Relic AIMonitoring configuration options in your application, import this integration, and use the model invocation calls from this library in place of the corresponding ones from the AWS Bedrock runtime library, as documented below.
The relevant configuration options are passed to the NewApplication function and include
    ConfigAIMonitoringEnabled(true),              // enable (or disable if false) this integration
    ConfigAIMonitoringStreamingEnabled(true),     // enable instrumentation of streaming invocations
    ConfigAIMonitoringRecordContentEnabled(true), // include input/output data in instrumentation
Currently, the following must also be set for AIM reporting to function correctly:
    ConfigCustomInsightsEventsEnabled(true) // (the default)
    ConfigHighSecurityEnabled(false)        // (the default)
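Putting these options together, a minimal application setup might look like the following sketch (the application name is a placeholder, and the license key is assumed to be in the usual environment variable):

```go
package main

import (
	"log"
	"os"

	"github.com/newrelic/go-agent/v3/newrelic"
)

func main() {
	app, err := newrelic.NewApplication(
		newrelic.ConfigAppName("Bedrock Demo"), // placeholder name
		newrelic.ConfigLicense(os.Getenv("NEW_RELIC_LICENSE_KEY")),
		newrelic.ConfigAIMonitoringEnabled(true),
		newrelic.ConfigAIMonitoringStreamingEnabled(true),
		newrelic.ConfigAIMonitoringRecordContentEnabled(true),
		newrelic.ConfigCustomInsightsEventsEnabled(true), // the default
	)
	if err != nil {
		log.Fatal(err)
	}
	_ = app // pass app to the nrawsbedrock invocation functions
}
```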
Or, if ConfigFromEnvironment() is included in your configuration options, the above configuration options may be specified using these environment variables, respectively:
    NEW_RELIC_AI_MONITORING_ENABLED=true
    NEW_RELIC_AI_MONITORING_STREAMING_ENABLED=true
    NEW_RELIC_AI_MONITORING_RECORD_CONTENT_ENABLED=true
    NEW_RELIC_HIGH_SECURITY=false
The values for these variables may be any form accepted by strconv.ParseBool (e.g., 1, t, T, true, TRUE, True, 0, f, F, false, FALSE, or False).
See example/main.go for a working sample.
Index ¶
- Variables
- func InvokeModel(app *newrelic.Application, brc Modeler, ctx context.Context, ...) (*bedrockruntime.InvokeModelOutput, error)
- func InvokeModelWithAttributes(app *newrelic.Application, brc Modeler, ctx context.Context, ...) (*bedrockruntime.InvokeModelOutput, error)
- func ProcessModelWithResponseStream(app *newrelic.Application, brc Modeler, ctx context.Context, ...) error
- func ProcessModelWithResponseStreamAttributes(app *newrelic.Application, brc Modeler, ctx context.Context, ...) error
- type Modeler
- type ResponseStream
Constants ¶
This section is empty.
Variables ¶
var (
ErrMissingResponseData = errors.New("missing response data")
)
Functions ¶
func InvokeModel ¶
func InvokeModel(app *newrelic.Application, brc Modeler, ctx context.Context, params *bedrockruntime.InvokeModelInput, optFns ...func(*bedrockruntime.Options)) (*bedrockruntime.InvokeModelOutput, error)
InvokeModel provides an instrumented interface through which to call the AWS Bedrock InvokeModel function. Where you would normally invoke the InvokeModel method on a bedrockruntime.Client value b from AWS as:
b.InvokeModel(c, p, f...)
You instead invoke the New Relic InvokeModel function as:
nrbedrock.InvokeModel(app, b, c, p, f...)
where app is the New Relic Application value returned from NewApplication when you started your application. If you start a transaction and add it to the passed context value c in the above invocation, the instrumentation will be recorded on that transaction, including a segment for the Bedrock call itself. If you don't, a new transaction will be started for you, which will be terminated when the InvokeModel function exits.
If the transaction cannot be created or used, the Bedrock call is made anyway, without instrumentation.
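As a sketch, a full instrumented call might look like the following (the model ID and request body are placeholders, and the AWS and New Relic setup follows the usual patterns for those libraries):

```go
package main

import (
	"context"
	"log"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/bedrockruntime"
	nrbedrock "github.com/newrelic/go-agent/v3/integrations/nrawsbedrock"
	"github.com/newrelic/go-agent/v3/newrelic"
)

func main() {
	app, err := newrelic.NewApplication(
		newrelic.ConfigFromEnvironment(),
		newrelic.ConfigAIMonitoringEnabled(true),
	)
	if err != nil {
		log.Fatal(err)
	}

	awscfg, err := config.LoadDefaultConfig(context.Background())
	if err != nil {
		log.Fatal(err)
	}
	brc := bedrockruntime.NewFromConfig(awscfg)

	// Optionally start a transaction so the invocation is recorded on it;
	// otherwise one is started for you for the duration of the call.
	txn := app.StartTransaction("bedrock-demo")
	ctx := newrelic.NewContext(context.Background(), txn)

	out, err := nrbedrock.InvokeModel(app, brc, ctx, &bedrockruntime.InvokeModelInput{
		ModelId: aws.String("example.model-id"),         // placeholder
		Body:    []byte(`{"prompt": "Hello, Bedrock"}`), // placeholder
	})
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("response body: %s", out.Body)
	txn.End()
}
```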
func InvokeModelWithAttributes ¶
func InvokeModelWithAttributes(app *newrelic.Application, brc Modeler, ctx context.Context, params *bedrockruntime.InvokeModelInput, attrs map[string]any, optFns ...func(*bedrockruntime.Options)) (*bedrockruntime.InvokeModelOutput, error)
InvokeModelWithAttributes is identical to InvokeModel except for the addition of the attrs parameter, which is a map of strings to values of any type. This map holds any custom attributes you wish to add to the reported metrics relating to this model invocation.
Each key in the attrs map must begin with "llm."; if any of them do not, "llm." is automatically prepended to the attribute key before the metrics are sent out.
We recommend including at least "llm.conversation_id" in your attributes.
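The documented key-prefixing behavior can be illustrated with a small sketch (normalizeLLMKeys is a hypothetical helper written for this example, not part of the package):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeLLMKeys sketches the documented behavior: any attribute key
// that does not begin with "llm." has that prefix prepended.
// (Hypothetical helper for illustration; not part of nrawsbedrock.)
func normalizeLLMKeys(attrs map[string]any) map[string]any {
	out := make(map[string]any, len(attrs))
	for k, v := range attrs {
		if !strings.HasPrefix(k, "llm.") {
			k = "llm." + k
		}
		out[k] = v
	}
	return out
}

func main() {
	attrs := map[string]any{
		"llm.conversation_id": "conv-42",
		"user_tier":           "pro",
	}
	fmt.Println(normalizeLLMKeys(attrs))
	// prints: map[llm.conversation_id:conv-42 llm.user_tier:pro]
}
```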
func ProcessModelWithResponseStream ¶
func ProcessModelWithResponseStream(app *newrelic.Application, brc Modeler, ctx context.Context, callback func([]byte) error, params *bedrockruntime.InvokeModelWithResponseStreamInput, optFns ...func(*bedrockruntime.Options)) error
ProcessModelWithResponseStream works just like InvokeModelWithResponseStream, except that it handles all the stream processing automatically for you. For each event received from the response stream, it invokes the callback function you supply so that your application can act on the response data. When the stream is complete, ProcessModelWithResponseStream returns.
If your callback function returns an error, the processing of the response stream will terminate at that point.
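A sketch of a callback-driven invocation (app and brc come from the setup described in the overview; the model ID and body are placeholders):

```go
package main

import (
	"context"
	"fmt"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/service/bedrockruntime"
	nrbedrock "github.com/newrelic/go-agent/v3/integrations/nrawsbedrock"
	"github.com/newrelic/go-agent/v3/newrelic"
)

// streamModel sketches ProcessModelWithResponseStream usage.
func streamModel(app *newrelic.Application, brc *bedrockruntime.Client, ctx context.Context) error {
	return nrbedrock.ProcessModelWithResponseStream(app, brc, ctx,
		func(data []byte) error {
			// data holds the raw bytes of one stream event.
			fmt.Printf("chunk: %s\n", data)
			return nil // a non-nil error here stops stream processing
		},
		&bedrockruntime.InvokeModelWithResponseStreamInput{
			ModelId: aws.String("example.model-id"), // placeholder
			Body:    []byte(`{"prompt": "Hello"}`),  // placeholder
		})
}
```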
func ProcessModelWithResponseStreamAttributes ¶
func ProcessModelWithResponseStreamAttributes(app *newrelic.Application, brc Modeler, ctx context.Context, callback func([]byte) error, params *bedrockruntime.InvokeModelWithResponseStreamInput, attrs map[string]any, optFns ...func(*bedrockruntime.Options)) error
ProcessModelWithResponseStreamAttributes is identical to ProcessModelWithResponseStream except that it adds the attrs parameter, which is a map of strings to values of any type. This map holds any custom attributes you wish to add to the reported metrics relating to this model invocation.
Each key in the attrs map must begin with "llm."; if any of them do not, "llm." is automatically prepended to the attribute key before the metrics are sent out.
We recommend including at least "llm.conversation_id" in your attributes.
Types ¶
type Modeler ¶ added in v1.1.0
    type Modeler interface {
        InvokeModel(context.Context, *bedrockruntime.InvokeModelInput, ...func(*bedrockruntime.Options)) (*bedrockruntime.InvokeModelOutput, error)
        InvokeModelWithResponseStream(context.Context, *bedrockruntime.InvokeModelWithResponseStreamInput, ...func(*bedrockruntime.Options)) (*bedrockruntime.InvokeModelWithResponseStreamOutput, error)
    }
Modeler is any type that can invoke Bedrock models (e.g., bedrockruntime.Client).
type ResponseStream ¶
    type ResponseStream struct {
        // The model output
        Response *bedrockruntime.InvokeModelWithResponseStreamOutput
        // contains filtered or unexported fields
    }
ResponseStream tracks the model invocation throughout its lifetime until all stream events are processed.
func InvokeModelWithResponseStream ¶
func InvokeModelWithResponseStream(app *newrelic.Application, brc Modeler, ctx context.Context, params *bedrockruntime.InvokeModelWithResponseStreamInput, optFns ...func(*bedrockruntime.Options)) (ResponseStream, error)
InvokeModelWithResponseStream invokes a model, but unlike InvokeModel, the data returned is a stream of multiple events rather than a single response value. This function is the analogue of the bedrockruntime library's InvokeModelWithResponseStream function: given a bedrockruntime.Client b, where you would normally call the AWS method
response, err := b.InvokeModelWithResponseStream(c, p, f...)
You instead invoke the New Relic InvokeModelWithResponseStream function as:
rstream, err := nrbedrock.InvokeModelWithResponseStream(app, b, c, p, f...)
where app is your New Relic Application value.
If using the bedrockruntime library directly, you would then process the response stream value (the response variable in the above example), iterating over the provided channel where the stream data appears until it is exhausted, and then calling Close() on the stream (see the bedrock API documentation for details).
When using the New Relic nrawsbedrock integration, this response value is available as rstream.Response. You would perform the same operations as you would directly with the bedrock API once you have that value. Since this means control has passed back to your code for processing of the stream data, you need to add instrumentation calls to your processing code:
    rstream.RecordEvent(content) // for each event received from the stream
    rstream.Close()              // when you are finished and are going to close the stream
However, see ProcessModelWithResponseStream for an easier alternative.
Either start a transaction yourself and add it to the context c passed into this function, or one will be started for you that lasts only for the duration of the model invocation.
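The manual processing loop can be sketched as follows, assuming the AWS SDK's GetStream and Events accessors on the response value (app, brc, and params come from the setup described above):

```go
package main

import (
	"context"

	"github.com/aws/aws-sdk-go-v2/service/bedrockruntime"
	"github.com/aws/aws-sdk-go-v2/service/bedrockruntime/types"
	nrbedrock "github.com/newrelic/go-agent/v3/integrations/nrawsbedrock"
	"github.com/newrelic/go-agent/v3/newrelic"
)

// processStream sketches manual instrumentation of a response stream.
func processStream(app *newrelic.Application, brc *bedrockruntime.Client, ctx context.Context,
	params *bedrockruntime.InvokeModelWithResponseStreamInput) error {
	rstream, err := nrbedrock.InvokeModelWithResponseStream(app, brc, ctx, params)
	if err != nil {
		return err
	}
	stream := rstream.Response.GetStream()
	for event := range stream.Events() {
		if chunk, ok := event.(*types.ResponseStreamMemberChunk); ok {
			rstream.RecordEvent(chunk.Value.Bytes) // instrument each event
			// ... act on chunk.Value.Bytes here ...
		}
	}
	rstream.Close() // finish the New Relic instrumentation
	return stream.Close()
}
```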
func InvokeModelWithResponseStreamAttributes ¶
func InvokeModelWithResponseStreamAttributes(app *newrelic.Application, brc Modeler, ctx context.Context, params *bedrockruntime.InvokeModelWithResponseStreamInput, attrs map[string]any, optFns ...func(*bedrockruntime.Options)) (ResponseStream, error)
InvokeModelWithResponseStreamAttributes is identical to InvokeModelWithResponseStream except that it adds the attrs parameter, which is a map of strings to values of any type. This map holds any custom attributes you wish to add to the reported metrics relating to this model invocation.
Each key in the attrs map must begin with "llm."; if any of them do not, "llm." is automatically prepended to the attribute key before the metrics are sent out.
We recommend including at least "llm.conversation_id" in your attributes.
func (*ResponseStream) Close ¶
func (s *ResponseStream) Close() error
Close finishes up the instrumentation for a response stream.
func (*ResponseStream) RecordEvent ¶
func (s *ResponseStream) RecordEvent(data []byte) error
RecordEvent records a single stream event as read from the data stream started by InvokeModelWithResponseStream.