wire

package
v1.0.0-beta.66
Published: May 12, 2026 License: MIT Imports: 11 Imported by: 0

Documentation

Overview

Package wire is the self-hosted JSON wire-format layer for LLM ChatCompletion calls. It owns request/response/streaming/error types and JSON round-tripping; transport, retry, registry, and provider-specific normalization remain in their existing homes.

The defining feature is the Extras map[string]json.RawMessage carrier on Message, ToolCall, Choice, and Function: any JSON field the typed structs don't know about is preserved on unmarshal and re-emitted on marshal. This is what lets us thread provider-specific fields like Gemini's extra_content.google.thought_signature through a multi-turn round-trip without losing them — the SDK's fixed types drop unknown fields silently.

The package is structurally OpenAI-shaped (the lingua franca for ChatCompletion); adapters in processor/agentic-model/ translate agentic.ChatMessage to wire.Message and handle provider quirks.

See ADR-037 for the full motivation and migration plan.

Index

Constants

View Source
const DefaultMaxFrameSize = 16 * 1024 * 1024

DefaultMaxFrameSize is the default per-SSE-frame buffer ceiling. A single tool_call argument blob can exceed 1 MiB on long-arg flows; 16 MiB gives substantial headroom while still protecting against runaway provider output. Configurable via ClientConfig.MaxFrameSize.

Variables

This section is empty.

Functions

This section is empty.

Types

type APIError

type APIError struct {
	StatusCode int    `json:"-"`
	Type       string `json:"type,omitempty"`
	Code       string `json:"code,omitempty"`
	Message    string `json:"message,omitempty"`
	Param      string `json:"param,omitempty"`
	Status     string `json:"status,omitempty"`
}

APIError is the decoded error returned when a non-2xx HTTP status is received from a ChatCompletion call. The shape matches OpenAI's documented `error` object; fields a given provider doesn't set are left empty.

Code is a string for OpenAI-style error codes ("rate_limit_exceeded", "invalid_api_key") and tolerates numeric values from providers like Gemini that overload the field with the HTTP status (e.g. 400). Numeric codes are stringified on decode.

func DecodeError

func DecodeError(statusCode int, body []byte) *APIError

DecodeError parses an HTTP error body into an *APIError, tolerating two known shapes:

  • The standard object shape: {"error": {...}}
  • Gemini 3.x preview's array-wrapped variant: [{"error": {...}}] (the leading [ is sniffed and the first element peeled before decode)

statusCode is recorded on the returned error for callers that branch on it. body may be empty; in that case a synthetic error is returned describing the status code.
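
The two tolerated shapes can be sketched in a self-contained decoder. The apiError type and decodeError function below are hypothetical miniatures illustrating the array sniff and numeric-code stringification, not the package's code:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
)

// apiError is a hypothetical miniature of wire.APIError.
type apiError struct {
	Code    string
	Message string
}

// decodeError tolerates {"error":{...}} and the array-wrapped
// [{"error":{...}}] variant, and stringifies numeric codes.
func decodeError(body []byte) (*apiError, error) {
	body = bytes.TrimSpace(body)
	// Sniff the leading '[' and peel the first element before decode.
	if len(body) > 0 && body[0] == '[' {
		var arr []json.RawMessage
		if err := json.Unmarshal(body, &arr); err != nil {
			return nil, err
		}
		if len(arr) == 0 {
			return nil, fmt.Errorf("empty error array")
		}
		body = arr[0]
	}
	var envelope struct {
		Error struct {
			Code    json.RawMessage `json:"code"` // 400 or "rate_limit_exceeded"
			Message string          `json:"message"`
		} `json:"error"`
	}
	if err := json.Unmarshal(body, &envelope); err != nil {
		return nil, err
	}
	code := string(envelope.Error.Code) // numeric codes keep their digits
	if len(code) > 0 && code[0] == '"' {
		if err := json.Unmarshal(envelope.Error.Code, &code); err != nil {
			return nil, err
		}
	}
	return &apiError{Code: code, Message: envelope.Error.Message}, nil
}

func main() {
	e, _ := decodeError([]byte(`[{"error":{"code":400,"message":"bad request"}}]`))
	fmt.Println(e.Code, e.Message) // 400 bad request
}
```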

func (*APIError) Error

func (e *APIError) Error() string

Error implements error.

func (*APIError) UnmarshalJSON

func (e *APIError) UnmarshalJSON(data []byte) error

UnmarshalJSON tolerates numeric or string `code` values. Numeric codes are stringified.

type Accumulator

type Accumulator struct {

	// OnConflict is called for each Extras key overwrite during
	// streaming. Useful for surfacing silent-overwrite bugs without
	// failing the stream. nil disables.
	OnConflict func(scope, key string)
	// contains filtered or unexported fields
}

Accumulator merges streaming deltas into a final Message. The merge rules implement ADR-037 §2b:

  • Last-writer-wins on Extras key collisions, with a debug log line emitted via OnConflict if set.
  • No deep-merge on Extras: a later chunk replaces the prior value wholesale.
  • Per-tool_call merging keyed by Index; each ToolCall accumulates Function.Name, Function.Arguments (string concat), and Extras (last-writer-wins).

Use NewAccumulator + Add per chunk + Final at end-of-stream. The zero value is invalid; always go through NewAccumulator.
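
The per-tool_call merge rules can be shown in miniature. The toolDelta type and merge function below are hypothetical stand-ins for the streaming fragments the Accumulator consumes, not the package's implementation:

```go
package main

import "fmt"

// toolDelta is a hypothetical streaming fragment of one tool call:
// an index plus partial name/arguments.
type toolDelta struct {
	Index int
	Name  string
	Args  string
}

// merge applies the §2b rules in miniature: accumulation keyed by
// Index, string concat on arguments, last-writer-wins on name.
func merge(deltas []toolDelta) map[int]toolDelta {
	acc := map[int]toolDelta{}
	for _, d := range deltas {
		cur := acc[d.Index]
		cur.Index = d.Index
		if d.Name != "" {
			cur.Name = d.Name // last writer wins
		}
		cur.Args += d.Args // arguments concatenate across chunks
		acc[d.Index] = cur
	}
	return acc
}

func main() {
	final := merge([]toolDelta{
		{Index: 0, Name: "get_weather", Args: `{"city":`},
		{Index: 0, Args: `"Oslo"}`},
	})
	fmt.Println(final[0].Name, final[0].Args)
}
```

The same last-writer-wins rule applies to Extras keys, except that a collision there additionally fires OnConflict when set.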

func NewAccumulator

func NewAccumulator() *Accumulator

NewAccumulator constructs a fresh Accumulator.

func (*Accumulator) Add

func (a *Accumulator) Add(chunk *StreamChunk) error

Add merges one StreamChunk into the accumulator. choices[0] is the only choice consumed; multi-choice streaming is rare and not supported in the v1 wire layer (callers wanting n>1 should use non-streaming completions).

func (*Accumulator) Final

func (a *Accumulator) Final() (msg Message, finishReason string, usage *Usage)

Final returns the accumulated Message, finish reason, and usage. finishReason is empty if no chunk supplied one. usage is nil if no chunk reported it.

type ChatCompletionRequest

type ChatCompletionRequest struct {
	Model            string                     `json:"model"`
	Messages         []Message                  `json:"messages"`
	Tools            []Tool                     `json:"tools,omitempty"`
	ToolChoice       json.RawMessage            `json:"tool_choice,omitempty"`
	ResponseFormat   *ResponseFormat            `json:"response_format,omitempty"`
	Stream           bool                       `json:"stream,omitempty"`
	StreamOptions    *StreamOptions             `json:"stream_options,omitempty"`
	MaxTokens        int                        `json:"max_tokens,omitempty"`
	Temperature      *float64                   `json:"temperature,omitempty"`
	TopP             *float64                   `json:"top_p,omitempty"`
	N                int                        `json:"n,omitempty"`
	Stop             []string                   `json:"stop,omitempty"`
	PresencePenalty  *float64                   `json:"presence_penalty,omitempty"`
	FrequencyPenalty *float64                   `json:"frequency_penalty,omitempty"`
	User             string                     `json:"user,omitempty"`
	Seed             *int64                     `json:"seed,omitempty"`
	ReasoningEffort  string                     `json:"reasoning_effort,omitempty"`
	Extras           map[string]json.RawMessage `json:"-"`
}

ChatCompletionRequest is the wire-shape ChatCompletion request body. Field names match OpenAI's documented chat-completions API. Any provider-specific fields not modeled here are carried in Extras.

func (ChatCompletionRequest) MarshalJSON

func (r ChatCompletionRequest) MarshalJSON() ([]byte, error)

MarshalJSON implements json.Marshaler with Extras merge.

func (*ChatCompletionRequest) UnmarshalJSON

func (r *ChatCompletionRequest) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler with Extras split.

type ChatCompletionResponse

type ChatCompletionResponse struct {
	ID                string                     `json:"id"`
	Object            string                     `json:"object,omitempty"`
	Created           int64                      `json:"created,omitempty"`
	Model             string                     `json:"model"`
	Choices           []Choice                   `json:"choices"`
	Usage             *Usage                     `json:"usage,omitempty"`
	SystemFingerprint string                     `json:"system_fingerprint,omitempty"`
	Extras            map[string]json.RawMessage `json:"-"`
}

ChatCompletionResponse is the non-streaming response shape.

func (ChatCompletionResponse) MarshalJSON

func (r ChatCompletionResponse) MarshalJSON() ([]byte, error)

MarshalJSON implements json.Marshaler with Extras merge.

func (*ChatCompletionResponse) UnmarshalJSON

func (r *ChatCompletionResponse) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler with Extras split.

type Choice

type Choice struct {
	Index        int                        `json:"index"`
	Message      *Message                   `json:"message,omitempty"`
	Delta        *Message                   `json:"delta,omitempty"`
	FinishReason string                     `json:"finish_reason,omitempty"`
	Logprobs     json.RawMessage            `json:"logprobs,omitempty"`
	Extras       map[string]json.RawMessage `json:"-"`
}

Choice is a single response candidate.

func (Choice) MarshalJSON

func (c Choice) MarshalJSON() ([]byte, error)

MarshalJSON implements json.Marshaler with Extras merge.

func (*Choice) UnmarshalJSON

func (c *Choice) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler with Extras split.

type Client

type Client struct {
	// contains filtered or unexported fields
}

Client is the OpenAI-compat ChatCompletion / Embeddings client. It is safe for concurrent use; callers typically construct one Client per endpoint and reuse it across requests.

func NewClient

func NewClient(cfg ClientConfig) (*Client, error)

NewClient validates cfg and returns a Client. Returns an error if HTTPClient or BaseURL is missing.

func (*Client) ChatCompletion

func (c *Client) ChatCompletion(ctx context.Context, req *ChatCompletionRequest) (*ChatCompletionResponse, error)

ChatCompletion executes a non-streaming ChatCompletion call. Returns a decoded response on HTTP 2xx, or *APIError with the status code recorded on any other status. Network and decode errors return as regular errors.

func (*Client) ChatCompletionStream

func (c *Client) ChatCompletionStream(ctx context.Context, req *ChatCompletionRequest) (*Stream, error)

ChatCompletionStream executes a streaming ChatCompletion call. Returns a *Stream the caller drives via Recv until io.EOF; callers MUST call Close when done. On non-2xx the response body is read fully and returned as *APIError; the *Stream is nil in that case.

func (*Client) Embeddings

func (c *Client) Embeddings(ctx context.Context, req *EmbeddingsRequest) (*EmbeddingsResponse, error)

Embeddings executes a non-streaming Embeddings call.

type ClientConfig

type ClientConfig struct {
	// BaseURL is the OpenAI-compat root, e.g. "https://api.openai.com/v1".
	// If ChatCompletionsURL or EmbeddingsURL are set, they take
	// precedence for their respective endpoints.
	BaseURL string

	// HTTPClient is the transport. Callers should use
	// model.NewHTTPClient to construct one with the framework's
	// connection-hygiene defaults.
	HTTPClient *http.Client

	// AuthHeader, if non-empty, is set as the Authorization header on
	// every request. Typically "Bearer <key>".
	AuthHeader string

	// ExtraHeaders are added to every request before send. Useful for
	// provider-specific headers like "x-api-key" or "anthropic-version".
	ExtraHeaders http.Header

	// ChatCompletionsURL overrides BaseURL+/chat/completions. Empty
	// means derive from BaseURL.
	ChatCompletionsURL string

	// EmbeddingsURL overrides BaseURL+/embeddings.
	EmbeddingsURL string

	// MaxFrameSize caps the per-SSE-frame buffer. 0 uses
	// DefaultMaxFrameSize (16 MiB). Bump for providers that emit
	// outsized tool_call argument blobs in a single frame.
	MaxFrameSize int
}

ClientConfig configures a Client. BaseURL and HTTPClient are required; everything else is optional.

type ContentImageURL

type ContentImageURL struct {
	URL    string `json:"url"`
	Detail string `json:"detail,omitempty"`
}

ContentImageURL is the image-url subshape of a content part.

type ContentPart

type ContentPart struct {
	Type     string                     `json:"type"`
	Text     string                     `json:"text,omitempty"`
	ImageURL *ContentImageURL           `json:"image_url,omitempty"`
	Extras   map[string]json.RawMessage `json:"-"`
}

ContentPart is one element of a content-array message (e.g. text, image_url). The shape is provider-defined; Extras preserves any fields not in the canonical OpenAI subset.

func (ContentPart) MarshalJSON

func (p ContentPart) MarshalJSON() ([]byte, error)

MarshalJSON implements json.Marshaler with Extras merge.

func (*ContentPart) UnmarshalJSON

func (p *ContentPart) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler with Extras split.

type EmbeddingDatum

type EmbeddingDatum struct {
	Object    string    `json:"object,omitempty"`
	Index     int       `json:"index"`
	Embedding []float32 `json:"embedding"`
}

EmbeddingDatum is one vector returned by the embeddings endpoint.

type EmbeddingsRequest

type EmbeddingsRequest struct {
	Model          string                     `json:"model"`
	Input          json.RawMessage            `json:"input"`
	EncodingFormat string                     `json:"encoding_format,omitempty"`
	Dimensions     int                        `json:"dimensions,omitempty"`
	User           string                     `json:"user,omitempty"`
	Extras         map[string]json.RawMessage `json:"-"`
}

EmbeddingsRequest is the wire-shape Embeddings request body. Used by graph/embedding once the embeddings migration chunk lands.

func (EmbeddingsRequest) MarshalJSON

func (r EmbeddingsRequest) MarshalJSON() ([]byte, error)

MarshalJSON implements json.Marshaler with Extras merge.

func (*EmbeddingsRequest) UnmarshalJSON

func (r *EmbeddingsRequest) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler with Extras split.

type EmbeddingsResponse

type EmbeddingsResponse struct {
	Object string                     `json:"object,omitempty"`
	Model  string                     `json:"model"`
	Data   []EmbeddingDatum           `json:"data"`
	Usage  *Usage                     `json:"usage,omitempty"`
	Extras map[string]json.RawMessage `json:"-"`
}

EmbeddingsResponse is the embeddings response.

func (EmbeddingsResponse) MarshalJSON

func (r EmbeddingsResponse) MarshalJSON() ([]byte, error)

MarshalJSON implements json.Marshaler with Extras merge.

func (*EmbeddingsResponse) UnmarshalJSON

func (r *EmbeddingsResponse) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler with Extras split.

type Function

type Function struct {
	Name      string                     `json:"name,omitempty"`
	Arguments string                     `json:"arguments,omitempty"`
	Extras    map[string]json.RawMessage `json:"-"`
}

Function describes a function call's name and arguments. Arguments is a JSON-encoded string per OpenAI's wire shape.

func (Function) MarshalJSON

func (f Function) MarshalJSON() ([]byte, error)

MarshalJSON implements json.Marshaler with Extras merge.

func (*Function) UnmarshalJSON

func (f *Function) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler with Extras split.

type FunctionDefinition

type FunctionDefinition struct {
	Name        string          `json:"name"`
	Description string          `json:"description,omitempty"`
	Parameters  json.RawMessage `json:"parameters,omitempty"`
	Strict      bool            `json:"strict,omitempty"`
}

FunctionDefinition is the schema for a tool function.

type JSONSchema

type JSONSchema struct {
	Name        string          `json:"name"`
	Description string          `json:"description,omitempty"`
	Schema      json.RawMessage `json:"schema,omitempty"`
	Strict      bool            `json:"strict,omitempty"`
}

JSONSchema is the schema container for a response_format with type=json_schema.

type Message

type Message struct {
	Role             string                     `json:"role"`
	Content          json.RawMessage            `json:"content,omitempty"`
	Name             string                     `json:"name,omitempty"`
	ReasoningContent string                     `json:"reasoning_content,omitempty"`
	ToolCalls        []ToolCall                 `json:"tool_calls,omitempty"`
	ToolCallID       string                     `json:"tool_call_id,omitempty"`
	Refusal          string                     `json:"refusal,omitempty"`
	Extras           map[string]json.RawMessage `json:"-"`
}

Message is a single message in a chat conversation.

Content is json.RawMessage to honor OpenAI's polymorphism: it may be a JSON string ("hello") or a JSON array of content parts ([{...}]). Use the ContentString and ContentParts helpers to consume it.

func (Message) ContentParts

func (m Message) ContentParts() (parts []ContentPart, ok bool, err error)

ContentParts returns the message's content as a list of typed parts when it is a JSON array. ok is false if the content is empty, a string, or otherwise non-array. The error is non-nil only when the content is array-shaped but malformed.

func (Message) ContentString

func (m Message) ContentString() (s string, ok bool)

ContentString returns the message's content as a plain string when it is a JSON string. ok is false if the content is empty, an array, or otherwise non-string.

func (Message) MarshalJSON

func (m Message) MarshalJSON() ([]byte, error)

MarshalJSON implements json.Marshaler with Extras merge.

func (*Message) SetContentString

func (m *Message) SetContentString(s string) error

SetContentString writes a plain JSON string into Content.

func (*Message) UnmarshalJSON

func (m *Message) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler with Extras split.

type ResponseFormat

type ResponseFormat struct {
	Type       string      `json:"type"`
	JSONSchema *JSONSchema `json:"json_schema,omitempty"`
}

ResponseFormat constrains the output shape (ADR-034).

type Stream

type Stream struct {
	// contains filtered or unexported fields
}

Stream consumes a Server-Sent-Events response body, decoding each `data:` frame into a StreamChunk. The caller iterates by calling Recv until io.EOF.

Stream is NOT safe for concurrent use; one goroutine drives a stream from start to finish, then calls Close.

func (*Stream) Close

func (s *Stream) Close() error

Close releases the underlying response body. Safe to call multiple times.

func (*Stream) Recv

func (s *Stream) Recv() (*StreamChunk, error)

Recv returns the next decoded chunk. Returns io.EOF when the stream has emitted the [DONE] sentinel or the underlying body has ended. Subsequent calls after EOF continue to return io.EOF.

type StreamChunk

type StreamChunk struct {
	ID                string                     `json:"id,omitempty"`
	Object            string                     `json:"object,omitempty"`
	Created           int64                      `json:"created,omitempty"`
	Model             string                     `json:"model,omitempty"`
	Choices           []Choice                   `json:"choices"`
	Usage             *Usage                     `json:"usage,omitempty"`
	SystemFingerprint string                     `json:"system_fingerprint,omitempty"`
	Extras            map[string]json.RawMessage `json:"-"`
}

StreamChunk is a single Server-Sent-Events frame from a streaming ChatCompletion. Choices carry deltas; the final frame typically has a finish_reason set on each choice.

func (StreamChunk) MarshalJSON

func (s StreamChunk) MarshalJSON() ([]byte, error)

MarshalJSON implements json.Marshaler with Extras merge.

func (*StreamChunk) UnmarshalJSON

func (s *StreamChunk) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler with Extras split.

type StreamOptions

type StreamOptions struct {
	IncludeUsage bool `json:"include_usage,omitempty"`
}

StreamOptions controls server-side streaming behavior.

type Tool

type Tool struct {
	Type     string             `json:"type"`
	Function FunctionDefinition `json:"function"`
}

Tool is a tool definition available to the model.

type ToolCall

type ToolCall struct {
	Index    *int                       `json:"index,omitempty"`
	ID       string                     `json:"id,omitempty"`
	Type     string                     `json:"type,omitempty"`
	Function Function                   `json:"function"`
	Extras   map[string]json.RawMessage `json:"-"`
}

ToolCall represents a single function/tool invocation.

func (ToolCall) MarshalJSON

func (t ToolCall) MarshalJSON() ([]byte, error)

MarshalJSON implements json.Marshaler with Extras merge.

func (*ToolCall) UnmarshalJSON

func (t *ToolCall) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json.Unmarshaler with Extras split.

type Usage

type Usage struct {
	PromptTokens     int `json:"prompt_tokens"`
	CompletionTokens int `json:"completion_tokens"`
	TotalTokens      int `json:"total_tokens"`
}

Usage reports prompt + completion token counts.
