graft

package module
v0.2.6
Published: Apr 3, 2026 License: MIT Imports: 6 Imported by: 0

README

Graft


A Go framework for building AI agents and LLM-powered applications. Multi-provider support (OpenAI, Anthropic, Google Gemini, AWS Bedrock), type-safe tools via generics, agent handoffs, guardrails, MCP integration, and graph orchestration — all with zero vendor SDK dependencies.

Why Graft?

Feature              Graft                               LangChainGo          Raw API calls
Vendor SDKs          None — raw net/http                 Multiple SDKs        You manage HTTP
Type safety          Generic tools from Go structs       Runtime casting      Manual parsing
Agent handoffs       Built-in, automatic                 Manual wiring        DIY
Providers            OpenAI, Anthropic, Gemini, Bedrock  Varies by wrapper    One at a time
Guardrails           Input, output, and tool validation  Limited              None
MCP                  Client + server built-in            Not available        Not available
Graph orchestration  LangGraph-style DAG execution       Chain-based          None
Durable execution    Temporal, Hatchet, Trigger.dev      Not available        DIY
Dependencies         Only OpenTelemetry                  50+ transitive deps  Depends

Graft is inspired by OpenAI Swarm — lightweight, composable, and designed for Go developers who want full control without framework lock-in.

Install

go get github.com/delavalom/graft

Quick Start

package main

import (
    "context"
    "fmt"
    "os"

    "github.com/delavalom/graft"
    "github.com/delavalom/graft/provider/openai"
)

func main() {
    model := openai.New(
        openai.WithAPIKey(os.Getenv("OPENROUTER_API_KEY")),
        openai.WithBaseURL("https://openrouter.ai/api/v1"),
        openai.WithModel("anthropic/claude-sonnet-4.6"),
    )

    greetTool := graft.NewTool("greet", "Greet someone by name",
        func(ctx context.Context, p struct {
            Name string `json:"name" description:"The person's name"`
        }) (string, error) {
            return fmt.Sprintf("Hello, %s!", p.Name), nil
        },
    )

    agent := graft.NewAgent("assistant",
        graft.WithInstructions("You are a helpful assistant. Use the greet tool when asked to greet someone."),
        graft.WithTools(greetTool),
    )

    runner := graft.NewDefaultRunner(model)
    result, err := runner.Run(context.Background(), agent, []graft.Message{
        {Role: graft.RoleUser, Content: "Please greet Alice"},
    })
    if err != nil {
        fmt.Fprintf(os.Stderr, "Error: %v\n", err)
        os.Exit(1)
    }
    fmt.Println(result.LastAssistantText())
}

Features

  • Multi-provider: OpenAI, Anthropic, Google (Gemini) — all via raw HTTP, no vendor SDKs
  • Type-safe tools: Define tools as typed Go functions with auto-generated JSON schemas
  • Agent handoffs: Route conversations between specialized agents
  • Guardrails: Input validation (max tokens, content filtering) and output validation (JSON schema)
  • MCP integration: Connect to MCP servers as a client, or expose graft tools as an MCP server
  • Graph orchestration: LangGraph-style DAG execution with conditional routing and streaming
  • Session persistence: Multi-turn conversations with memory and file-backed stores
  • Pluggable tracing: Braintrust, LangSmith, OpenTelemetry, or bring your own
  • Durable execution: Temporal, Hatchet, and Trigger.dev integrations
  • Streaming: SSE HTTP handler adapter for real-time responses
  • Provider routing: Fallback and round-robin strategies across providers

Architecture

User messages -> Agent -> Runner.Run() -> LanguageModel.Generate()
  -> Tool calls?  -> Execute tools -> Append results -> Loop back
  -> Handoff?     -> Switch active agent -> Loop back
  -> No tool calls? -> Return Result
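The control flow above can be sketched in plain Go. This is an illustrative, self-contained model of the runner loop, not graft's implementation; `toyMessage`, `toyModel`, and `runLoop` are hypothetical names standing in for graft's `Message`, `LanguageModel`, and `Runner.Run`.

```go
package main

import "fmt"

// toyMessage and toyModel are hypothetical stand-ins for graft's
// Message and LanguageModel types, used only to illustrate the loop.
type toyMessage struct {
	Role      string
	Content   string
	ToolCalls []string // tool names requested by the model
}

type toyModel func(history []toyMessage) toyMessage

// runLoop mirrors the diagram: generate, execute any requested tools,
// append their results, and loop until the model replies with no tool
// calls (bounded by an iteration cap, like RunConfig.MaxIterations).
func runLoop(model toyModel, tools map[string]func() string, history []toyMessage) []toyMessage {
	for i := 0; i < 10; i++ {
		reply := model(history)
		history = append(history, reply)
		if len(reply.ToolCalls) == 0 {
			return history // no tool calls: done
		}
		for _, name := range reply.ToolCalls {
			history = append(history, toyMessage{Role: "tool", Content: tools[name]()})
		}
	}
	return history
}

func main() {
	calls := 0
	model := toyModel(func(h []toyMessage) toyMessage {
		calls++
		if calls == 1 {
			return toyMessage{Role: "assistant", ToolCalls: []string{"greet"}}
		}
		return toyMessage{Role: "assistant", Content: "Hello, Alice!"}
	})
	tools := map[string]func() string{"greet": func() string { return "Hello, Alice!" }}
	out := runLoop(model, tools, []toyMessage{{Role: "user", Content: "Please greet Alice"}})
	fmt.Println(out[len(out)-1].Content)
}
```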

Packages

Package             Description
graft               Core types: Agent, Runner, Tool, Message, Guardrail, Handoff, Hook
provider/openai     OpenAI, OpenRouter, Ollama, LM Studio
provider/anthropic  Anthropic Messages API
provider/google     Google Generative Language API (Gemini)
provider/bedrock    AWS Bedrock (Converse API) — Claude, Titan, Llama, Mistral
provider            Router (fallback/round-robin) and middleware chain
guardrail           Built-in guardrails: MaxTokens, ContentFilter, SchemaValidator
mcp                 Model Context Protocol client and server
graph               Graph-based orchestration with conditional edges
state               Session persistence (memory and file stores)
tracing             Pluggable tracing: Braintrust, LangSmith, OpenTelemetry
temporal            Temporal durable workflow integration
hatchet             Hatchet durable function integration
trigger             Trigger.dev REST API integration
stream              SSE HTTP handler adapter
otel                OpenTelemetry instrumentation wrappers

Examples

Example         Description
basic           Simple agent with a tool
handoff         Agent-to-agent routing
streaming       HTTP streaming with SSE
multi-provider  Fallback across providers
guardrails      Input/output validation
mcp-client      Connect to an MCP server and use its tools
mcp-server      Expose graft tools as an MCP server
graph           ReAct graph orchestration
tracing         Pluggable tracing with Braintrust
state           Persistent multi-turn sessions
temporal        Durable execution with Temporal
hatchet         Durable functions with Hatchet
trigger         Background tasks with Trigger.dev
bedrock         AWS Bedrock with Converse API

Run any example:

export OPENROUTER_API_KEY=your-key
go run ./examples/basic/

Design Principles

Functional options everywhere: Consistent API across agents, providers, and runners.

agent := graft.NewAgent("name",
    graft.WithInstructions("..."),
    graft.WithTools(tool1, tool2),
    graft.WithGuardrails(guardrail.MaxTokens(1000)),
)

Type-safe tools: Struct fields become JSON schema automatically.

tool := graft.NewTool("search", "Search for items",
    func(ctx context.Context, p struct {
        Query string `json:"query" description:"Search query"`
        Limit int    `json:"limit" description:"Max results"`
    }) (string, error) {
        // ...
    },
)
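NewTool's auto-generated schema presumably comes from reflecting over the parameter struct's tags. The sketch below shows the general idea in self-contained Go; it is not graft's actual code, `schemaFor` is a hypothetical name, and the field-kind mapping is deliberately minimal.

```go
package main

import (
	"encoding/json"
	"fmt"
	"reflect"
)

// schemaFor builds a minimal JSON-Schema-like object from a struct's
// `json` and `description` tags. Only string and int fields are
// handled here; a real implementation would cover more kinds.
func schemaFor(v any) map[string]any {
	t := reflect.TypeOf(v)
	props := map[string]any{}
	required := []string{}
	for i := 0; i < t.NumField(); i++ {
		f := t.Field(i)
		name := f.Tag.Get("json")
		kind := "string"
		if f.Type.Kind() == reflect.Int {
			kind = "integer"
		}
		props[name] = map[string]any{
			"type":        kind,
			"description": f.Tag.Get("description"),
		}
		required = append(required, name)
	}
	return map[string]any{"type": "object", "properties": props, "required": required}
}

func main() {
	type params struct {
		Query string `json:"query" description:"Search query"`
		Limit int    `json:"limit" description:"Max results"`
	}
	b, _ := json.Marshal(schemaFor(params{}))
	fmt.Println(string(b))
}
```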

Composable runners: Wrap runners to add behavior without changing the agent.

base := graft.NewDefaultRunner(model)
traced := tracing.NewTracedRunner(base, braintrustProvider)
persistent := state.NewSessionRunner(traced, store, sessionID)
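The wrapping shown above is ordinary interface decoration: each wrapper accepts a runner and returns a runner. A self-contained sketch with a simplified interface (the `runner`, `baseRunner`, and `logged` names here are illustrative, not graft's):

```go
package main

import "fmt"

// runner is a simplified stand-in for graft's Runner interface.
type runner interface {
	Run(input string) (string, error)
}

type baseRunner struct{}

func (baseRunner) Run(input string) (string, error) { return "answer:" + input, nil }

// logged decorates any runner with logging, leaving its behavior
// intact. This is the shape tracing.NewTracedRunner and
// state.NewSessionRunner presumably follow: take a runner, return a runner.
type logged struct {
	inner runner
	log   *[]string
}

func (l logged) Run(input string) (string, error) {
	*l.log = append(*l.log, "run "+input)
	return l.inner.Run(input)
}

func main() {
	var log []string
	var r runner = logged{inner: baseRunner{}, log: &log}
	out, _ := r.Run("hi")
	fmt.Println(out, log)
}
```

Because every wrapper satisfies the same interface, layers compose in any order without the agent knowing they exist.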

Contributing

Contributions are welcome! See CONTRIBUTING.md for guidelines.

License

MIT

Documentation

Overview

Package graft is a Go framework for building AI agents and LLM-powered applications.

Graft provides type-safe tool definitions, multi-provider LLM support (OpenAI, Anthropic, Google Gemini, AWS Bedrock), agent handoffs, guardrails, MCP integration, graph orchestration, and durable execution — all with zero vendor SDK dependencies.

Quick Start

agent := graft.NewAgent("assistant",
    graft.WithInstructions("You are a helpful assistant."),
    graft.WithTools(myTool),
)
runner := graft.NewDefaultRunner(model)
result, err := runner.Run(ctx, agent, messages)

See https://github.com/delavalom/graft for full documentation and examples.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Agent

type Agent struct {
	Name         string
	Instructions string
	Tools        []Tool
	Model        string
	Temperature  *float64
	MaxTokens    *int
	ToolChoice   ToolChoice
	Guardrails   []Guardrail
	Handoffs     []Handoff
	SubAgents    []*SubAgent
	Hooks        *HookRegistry
	Metadata     map[string]any
}

func NewAgent

func NewAgent(name string, opts ...AgentOption) *Agent

NewAgent creates a new Agent with the given name and options.

type AgentError

type AgentError struct {
	Type    ErrorType
	Message string
	Cause   error
	Context map[string]any
}

func NewAgentError

func NewAgentError(errType ErrorType, message string, cause error) *AgentError

func NewProviderError

func NewProviderError(statusCode int, providerName string, body []byte) *AgentError

NewProviderError builds an *AgentError of type ErrProvider with structured ProviderError context. It maps common HTTP status codes to actionable guidance and attempts to parse provider-specific error messages from body.

func (*AgentError) Error

func (e *AgentError) Error() string

func (*AgentError) Is

func (e *AgentError) Is(target error) bool

func (*AgentError) IsRetryable

func (e *AgentError) IsRetryable() bool

IsRetryable reports whether the error is safe to retry. For non-provider errors it always returns false.

func (*AgentError) StatusCode

func (e *AgentError) StatusCode() int

StatusCode returns the HTTP status code stored inside a provider error, or 0 if this AgentError was not created from a provider HTTP response.
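The docs do not specify which status codes NewProviderError marks retryable. A common classification, offered here as an assumption rather than graft's actual table, is:

```go
package main

import "fmt"

// retryableStatus reports whether an HTTP status from a provider is
// conventionally safe to retry: timeouts, rate limits, and transient
// server errors. This mapping is an assumption for illustration;
// graft's NewProviderError may classify differently.
func retryableStatus(code int) bool {
	switch code {
	case 408, 429, 500, 502, 503, 504:
		return true
	}
	return false
}

func main() {
	for _, c := range []int{400, 401, 429, 503} {
		fmt.Println(c, retryableStatus(c))
	}
}
```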

func (*AgentError) Unwrap

func (e *AgentError) Unwrap() error

type AgentOption

type AgentOption func(*Agent)

func WithGuardrails

func WithGuardrails(guardrails ...Guardrail) AgentOption

func WithHandoffs

func WithHandoffs(handoffs ...Handoff) AgentOption

func WithHooks

func WithHooks(hooks *HookRegistry) AgentOption

func WithInstructions

func WithInstructions(instructions string) AgentOption

func WithMaxTokens

func WithMaxTokens(max int) AgentOption

func WithMetadata

func WithMetadata(meta map[string]any) AgentOption

func WithModel

func WithModel(model string) AgentOption

func WithSubAgents

func WithSubAgents(subs ...*SubAgent) AgentOption

func WithTemperature

func WithTemperature(temp float64) AgentOption

func WithToolChoice

func WithToolChoice(tc ToolChoice) AgentOption

func WithTools

func WithTools(tools ...Tool) AgentOption

type Cost

type Cost struct {
	InputCostUSD  float64 `json:"input_cost_usd"`
	OutputCostUSD float64 `json:"output_cost_usd"`
}

func (Cost) TotalUSD

func (c Cost) TotalUSD() float64

type DefaultRunner

type DefaultRunner struct {
	// contains filtered or unexported fields
}

func NewDefaultRunner

func NewDefaultRunner(model LanguageModel) *DefaultRunner

func (*DefaultRunner) Run

func (r *DefaultRunner) Run(ctx context.Context, agent *Agent, messages []Message, opts ...RunOption) (*Result, error)

func (*DefaultRunner) RunStream

func (r *DefaultRunner) RunStream(ctx context.Context, agent *Agent, messages []Message, opts ...RunOption) (<-chan StreamEvent, error)

type ErrorType

type ErrorType string
const (
	ErrToolExecution   ErrorType = "tool_execution"
	ErrHandoff         ErrorType = "handoff"
	ErrGuardrail       ErrorType = "guardrail"
	ErrTimeout         ErrorType = "timeout"
	ErrContextLength   ErrorType = "context_length"
	ErrInvalidToolCall ErrorType = "invalid_tool_call"
	ErrRateLimit       ErrorType = "rate_limit"
	ErrProvider        ErrorType = "provider"
)

func (ErrorType) Error

func (e ErrorType) Error() string

type EventType

type EventType string
const (
	EventTextDelta      EventType = "text_delta"
	EventToolCallStart  EventType = "tool_call_start"
	EventToolCallDelta  EventType = "tool_call_delta"
	EventToolCallDone   EventType = "tool_call_done"
	EventToolResultDone EventType = "tool_result_done"
	EventMessageDone    EventType = "message_done"
	EventHandoff        EventType = "handoff"
	EventError          EventType = "error"
	EventDone           EventType = "done"
)
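A caller of RunStream presumably drains the event channel, accumulating text deltas until the terminal event arrives. A self-contained sketch of that consumption loop (`event` and `collectText` are illustrative local types, not graft's; event names mirror the constants above):

```go
package main

import "fmt"

// event is a simplified stand-in for graft's StreamEvent.
type event struct {
	Type string
	Data string
}

// collectText drains a stream of events, concatenating text deltas
// until a "done" event arrives.
func collectText(ch <-chan event) string {
	var out string
	for ev := range ch {
		switch ev.Type {
		case "text_delta":
			out += ev.Data
		case "done":
			return out
		}
	}
	return out
}

func main() {
	ch := make(chan event, 3)
	ch <- event{Type: "text_delta", Data: "Hel"}
	ch <- event{Type: "text_delta", Data: "lo"}
	ch <- event{Type: "done"}
	close(ch)
	fmt.Println(collectText(ch))
}
```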

type GenerateParams

type GenerateParams struct {
	Messages    []Message        `json:"messages"`
	Tools       []ToolDefinition `json:"tools,omitempty"`
	Temperature *float64         `json:"temperature,omitempty"`
	MaxTokens   *int             `json:"max_tokens,omitempty"`
	ToolChoice  ToolChoice       `json:"tool_choice,omitempty"`
	Stop        []string         `json:"stop,omitempty"`
	Metadata    map[string]any   `json:"metadata,omitempty"`
}

GenerateParams holds the parameters for a generation request.

type GenerateResult

type GenerateResult struct {
	Message Message `json:"message"`
	Usage   Usage   `json:"usage"`
	Cost    *Cost   `json:"cost,omitempty"`
}

GenerateResult holds the result of a generation request.

type Guardrail

type Guardrail interface {
	Name() string
	Type() GuardrailType
	Validate(ctx context.Context, data *ValidationData) (*ValidationResult, error)
}

type GuardrailType

type GuardrailType string
const (
	GuardrailInput  GuardrailType = "input"
	GuardrailOutput GuardrailType = "output"
	GuardrailTool   GuardrailType = "tool"
)

type Handoff

type Handoff struct {
	Target      *Agent
	Description string
	Filter      func(ctx context.Context, messages []Message) bool
}

type HookCallback

type HookCallback func(ctx context.Context, payload *HookPayload) (*HookResult, error)

type HookEvent

type HookEvent string
const (
	HookAgentStart    HookEvent = "agent_start"
	HookAgentEnd      HookEvent = "agent_end"
	HookPreToolCall   HookEvent = "pre_tool_call"
	HookPostToolCall  HookEvent = "post_tool_call"
	HookToolCallError HookEvent = "tool_call_error"
	HookPreHandoff    HookEvent = "pre_handoff"
	HookPostHandoff   HookEvent = "post_handoff"
	HookPreGenerate   HookEvent = "pre_generate"
	HookPostGenerate  HookEvent = "post_generate"
	HookGuardrailTrip HookEvent = "guardrail_trip"
)

type HookPayload

type HookPayload struct {
	Event    HookEvent
	Agent    *Agent
	ToolCall *ToolCall
	Messages []Message
	Metadata map[string]any
}

type HookRegistry

type HookRegistry struct {
	// contains filtered or unexported fields
}

func NewHookRegistry

func NewHookRegistry() *HookRegistry

func (*HookRegistry) On

func (r *HookRegistry) On(event HookEvent, cb HookCallback)

func (*HookRegistry) Run

func (r *HookRegistry) Run(ctx context.Context, payload *HookPayload) (*HookResult, error)

type HookResult

type HookResult struct {
	Allow         *bool
	ModifiedInput []byte
	AdditionalCtx string
	SkipExecution bool
}

type LanguageModel

type LanguageModel interface {
	Generate(ctx context.Context, params GenerateParams) (*GenerateResult, error)
	Stream(ctx context.Context, params GenerateParams) (<-chan StreamChunk, error)
	ModelID() string
}

LanguageModel is the interface that all LLM providers must implement.

type Message

type Message struct {
	Role       Role           `json:"role"`
	Content    string         `json:"content,omitempty"`
	ToolCalls  []ToolCall     `json:"tool_calls,omitempty"`
	ToolResult *ToolResult    `json:"tool_result,omitempty"`
	Metadata   map[string]any `json:"metadata,omitempty"`
}

type ProviderError

type ProviderError struct {
	StatusCode   int
	ProviderCode string
	ProviderName string
	Retryable    bool
	RetryAfter   time.Duration
	Guidance     string
}

ProviderError holds structured information about an HTTP error from a provider.

type Result

type Result struct {
	Messages []Message `json:"messages"`
	Usage    Usage     `json:"usage"`
	Cost     *Cost     `json:"cost,omitempty"`
	Trace    *Trace    `json:"trace,omitempty"`
}

func (*Result) LastAssistantText

func (r *Result) LastAssistantText() string

type Role

type Role string
const (
	RoleSystem    Role = "system"
	RoleUser      Role = "user"
	RoleAssistant Role = "assistant"
	RoleTool      Role = "tool"
)

type RunConfig

type RunConfig struct {
	MaxIterations int
	ParallelTools bool
}

func DefaultRunConfig

func DefaultRunConfig() RunConfig

type RunOption

type RunOption func(*RunConfig)

func WithMaxIterations

func WithMaxIterations(n int) RunOption

func WithParallelTools

func WithParallelTools(enabled bool) RunOption

type Runner

type Runner interface {
	Run(ctx context.Context, agent *Agent, messages []Message, opts ...RunOption) (*Result, error)
	RunStream(ctx context.Context, agent *Agent, messages []Message, opts ...RunOption) (<-chan StreamEvent, error)
}

type Span

type Span struct {
	Name       string         `json:"name"`
	StartTime  time.Time      `json:"start_time"`
	Duration   time.Duration  `json:"duration"`
	Attributes map[string]any `json:"attributes,omitempty"`
	Children   []Span         `json:"children,omitempty"`
}

type StreamChunk

type StreamChunk struct {
	Delta StreamEvent `json:"delta"`
	Usage *Usage      `json:"usage,omitempty"`
}

StreamChunk holds a single chunk of a streaming response.

type StreamEvent

type StreamEvent struct {
	Type      EventType `json:"type"`
	Data      any       `json:"data"`
	AgentID   string    `json:"agent_id,omitempty"`
	Timestamp time.Time `json:"timestamp"`
}

type SubAgent

type SubAgent struct {
	Agent       *Agent
	Description string
	// InputMapper optionally transforms parent messages before passing them to the child agent.
	InputMapper func([]Message) []Message
}

SubAgent represents a child agent that can be invoked as a tool by a parent agent.

type SubAgentResult

type SubAgentResult struct {
	AgentName string
	Result    *Result
	Error     error
}

SubAgentResult holds the result of a subagent execution.

func RunSubAgent

func RunSubAgent(ctx context.Context, runner Runner, sub *SubAgent, messages []Message) (*SubAgentResult, error)

RunSubAgent runs a single subagent with the provided messages. If the SubAgent has an InputMapper, it is applied to the messages before execution. Context isolation is enforced: the child agent receives its own message slice.

func RunSubAgentsParallel

func RunSubAgentsParallel(ctx context.Context, runner Runner, subs []*SubAgent, messages []Message) ([]*SubAgentResult, error)

RunSubAgentsParallel runs multiple subagents concurrently and collects all results. All subagents are started at once, with a sync.WaitGroup waiting for them to finish. Errors from individual subagents are captured in their SubAgentResult and do not abort the others.

type Tool

type Tool interface {
	Name() string
	Description() string
	Schema() json.RawMessage
	Execute(ctx context.Context, params json.RawMessage) (any, error)
}

func NewTool

func NewTool[P any, R any](name, description string, fn func(ctx context.Context, params P) (R, error)) Tool

type ToolCall

type ToolCall struct {
	ID        string          `json:"id"`
	Name      string          `json:"name"`
	Arguments json.RawMessage `json:"arguments"`
}

type ToolChoice

type ToolChoice string
const (
	ToolChoiceAuto     ToolChoice = "auto"
	ToolChoiceRequired ToolChoice = "required"
	ToolChoiceNone     ToolChoice = "none"
)

func ToolChoiceSpecific

func ToolChoiceSpecific(name string) ToolChoice

type ToolDefinition

type ToolDefinition struct {
	Name        string          `json:"name"`
	Description string          `json:"description"`
	Schema      json.RawMessage `json:"schema"`
}

func ToolDefFromTool

func ToolDefFromTool(t Tool) ToolDefinition

type ToolResult

type ToolResult struct {
	CallID  string `json:"call_id"`
	Content any    `json:"content"`
	IsError bool   `json:"is_error,omitempty"`
}

type Trace

type Trace struct {
	AgentID   string    `json:"agent_id"`
	StartTime time.Time `json:"start_time"`
	Spans     []Span    `json:"spans"`
}

func NewTrace

func NewTrace(agentID string) *Trace

func (*Trace) AddSpan

func (t *Trace) AddSpan(s Span)

type Usage

type Usage struct {
	PromptTokens     int `json:"prompt_tokens"`
	CompletionTokens int `json:"completion_tokens"`
}

func (Usage) TotalTokens

func (u Usage) TotalTokens() int

type ValidationData

type ValidationData struct {
	Messages   []Message
	ToolCall   *ToolCall
	ToolResult *ToolResult
	Agent      *Agent
}

type ValidationResult

type ValidationResult struct {
	Pass     bool
	Message  string
	Modified any
}

Directories

Path Synopsis
examples
basic command
bedrock command
graph command
guardrails command
handoff command
hatchet command
mcp-client command
mcp-server command
multi-provider command
state command
streaming command
temporal command
tracing command
trigger command
internal
provider   Package provider implements LLM provider abstractions for the graft framework.
  anthropic  Package anthropic implements the Anthropic Messages API provider for the graft framework.
  bedrock    Package bedrock implements the AWS Bedrock Converse API provider for the graft framework.
  google     Package google implements the Google Generative Language (Gemini) API provider for graft.
  openai     Package openai implements an OpenAI-compatible LLM provider for the graft framework.
