observability

package
v0.0.0-...-5fe9b4e Latest
This package is not in the latest version of its module.
Published: Apr 28, 2026 License: MIT Imports: 6 Imported by: 0

Documentation

Overview

Package observability provides instrumentation for metrics, tracing, and logging.

Index

Constants

This section is empty.

Variables

var (
	// RequestsTotal counts total LLM requests by provider, model, endpoint, and status
	RequestsTotal = promauto.NewCounterVec(
		prometheus.CounterOpts{
			Name: "gomodel_requests_total",
			Help: "Total number of LLM requests",
		},
		[]string{"provider", "model", "endpoint", "status_code", "status_type", "stream"},
	)

	// RequestDuration measures request latency distribution
	// For streaming requests, this measures time to stream establishment, not total stream duration
	RequestDuration = promauto.NewHistogramVec(
		prometheus.HistogramOpts{
			Name:    "gomodel_request_duration_seconds",
			Help:    "LLM request duration in seconds",
			Buckets: []float64{0.1, 0.25, 0.5, 1, 2, 5, 10, 30, 60},
		},
		[]string{"provider", "model", "endpoint", "stream"},
	)

	// InFlightRequests tracks concurrent requests per provider
	InFlightRequests = promauto.NewGaugeVec(
		prometheus.GaugeOpts{
			Name: "gomodel_requests_in_flight",
			Help: "Number of LLM requests currently in flight",
		},
		[]string{"provider", "endpoint", "stream"},
	)

	// ResponseSnapshotStoreFailures counts failures while storing response snapshots.
	ResponseSnapshotStoreFailures = promauto.NewCounterVec(
		prometheus.CounterOpts{
			Name: "gomodel_response_snapshot_store_failures_total",
			Help: "Total number of response snapshot store failures",
		},
		[]string{"provider", "provider_name", "operation"},
	)
)

Prometheus metrics for LLM gateway observability.

Functions

func HealthCheck

func HealthCheck() error

HealthCheck verifies that metrics are being collected.

func NewPrometheusHooks

func NewPrometheusHooks() llmclient.Hooks

NewPrometheusHooks returns hooks that instrument LLM requests with Prometheus metrics. These hooks can be injected into llmclient.Config to enable observability without polluting business logic.

func ResetMetrics

func ResetMetrics()

ResetMetrics resets all metrics to zero (useful for testing).

Types

type PrometheusMetrics

type PrometheusMetrics struct {
	RequestsTotal                 *prometheus.CounterVec
	RequestDuration               *prometheus.HistogramVec
	InFlightRequests              *prometheus.GaugeVec
	ResponseSnapshotStoreFailures *prometheus.CounterVec
}

PrometheusMetrics provides access to all registered metrics for testing.

func GetMetrics

func GetMetrics() *PrometheusMetrics

GetMetrics returns the Prometheus metrics for testing and introspection.
