hand

package module
v0.13.0
Published: Apr 24, 2026 License: MIT Imports: 20 Imported by: 0

README

axon-hand

Shared chassis for factory floor agents. Provides LLM client configuration, worker identity, CLI parsing (kong), and lifecycle management.

Usage

A minimal agent:

package main

import (
    "context"
    "fmt"
    "os"

    hand "github.com/benaskins/axon-hand"
    talk "github.com/benaskins/axon-talk"
)

func main() {
    hand.Run("myagent", "0.1.0", func(ctx context.Context, id hand.Identity, client talk.LLMClient) error {
        fmt.Fprintf(os.Stderr, "hello from %s\n", id.Name)
        // Use client with axon-loop/axon-tool here
        return nil
    })
}

Agents can extend the CLI with their own flags:

var cli struct {
    hand.CLI
    ProjectDir string `kong:"arg,required,help='Project directory'"`
    Layers     string `kong:"flag,default='static,security,test',help='Layers to run'"`
}

Environment Variables

Agent-specific prefix takes precedence, falling back to fleet-wide FACTORY_* defaults.

Variable                               Description
{PREFIX}_PROVIDER / FACTORY_PROVIDER   Provider: anthropic, openrouter, local
{PREFIX}_MODEL / FACTORY_MODEL         Model name
{PREFIX}_API_KEY / FACTORY_API_KEY     API key
{PREFIX}_BASE_URL / FACTORY_BASE_URL   Base URL override

Exit Codes

Code   Meaning
0      Success
1      Agent error (work failed)
2      Configuration error

Common Flags

Flag        Default                 Description
--name      random adjective-noun   Worker name
--verbose   false                   Verbose output to stderr
--timeout   15m                     Operation timeout

Documentation

Overview

Package hand provides the shared chassis for factory floor agents.

Every factory agent imports this package for LLM client configuration, identity, CLI parsing, and lifecycle management.

Class: platform
UseWhen: Any CLI agent that uses an LLM. Always select axon-hand for agents and CLI tools that need LLM access. Do NOT use axon-talk directly when axon-hand is selected.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func Banner

func Banner(w io.Writer, id Identity)

Banner writes the agent startup line to w.

func NewClient

func NewClient(cfg Config) (talk.LLMClient, error)

NewClient constructs a talk.LLMClient from the given Config.

Supported providers:

  • anthropic
  • openrouter
  • local

func NewClientWithIdentity

func NewClientWithIdentity(cfg Config, id Identity) (talk.LLMClient, error)

NewClientWithIdentity constructs a talk.LLMClient with identity and telemetry headers for request tracing and cost attribution.

Headers sent on every request:

  • X-Title: role/instance (e.g. "mech-hand/bold-elm")
  • HTTP-Referer: werkhaus:card or werkhaus:role
  • X-Werk-Card: board card ID (from WERK_CARD env)
  • X-Werk-Instance: worker instance name (from WERK_INSTANCE env)
  • X-Werk-Attempt: retry attempt number (from WERK_ATTEMPT env)
  • X-Werk-Pipeline: pipeline state (from WERK_STATE env)

func ParseCLI

func ParseCLI(role, version string, dest any, args []string) error

ParseCLI parses command-line arguments into dest using kong. The dest struct should embed CLI for the common flags. The role and version are used for the app name and version in help output.

func ReportUsage

func ReportUsage(ctx context.Context, usage *talk.Usage)

ReportUsage records token usage from the agent's LLM interactions. The chassis includes it in the JSON envelope emitted on stdout. Call this with the Usage from loop.Run or similar.

func Run

func Run(role, version string, fn AgentFunc)

Run is the production entry point. It parses CLI args from os.Args, loads config, builds the client, and calls fn. Exits with 0 on success, 1 on agent error, 2 on config error.

func RunCLI

func RunCLI(role, version string, cli any, fn AgentFunc)

RunCLI is like Run but accepts the agent's CLI struct for extended flags. The CLI struct must embed hand.CLI and must be passed as a pointer.

func RunWith

func RunWith(rc RunConfig) int

RunWith executes the agent lifecycle and returns the exit code. Intended for testing; production agents use Run.

func SetOutput

func SetOutput(ctx context.Context, output string)

SetOutput records the agent's primary output (e.g. a directory path, feedback text, or summary). The chassis includes it in the JSON envelope emitted on stdout. Call this instead of fmt.Println.

Types

type AgentFunc

type AgentFunc func(ctx context.Context, id Identity, client talk.LLMClient) error

AgentFunc is the function signature for an agent's main logic.

type CLI

type CLI struct {
	Name    string        `kong:"flag,help='Worker name (random adjective-noun if omitted)',short='n'"`
	Verbose bool          `kong:"flag,help='Verbose output to stderr',short='v'"`
	Timeout time.Duration `kong:"flag,default='15m',help='Operation timeout'"`
}

CLI provides the common flags that every factory agent supports. Agents embed this struct and add their own fields.

func (CLI) GetCLI

func (c CLI) GetCLI() CLI

GetCLI returns the CLI base. Agents whose CLI struct embeds hand.CLI automatically satisfy this via Go embedding.

type Config

type Config struct {
	Provider string
	Model    string
	APIKey   string
	BaseURL  string
}

Config holds LLM provider configuration for an agent.

func LoadConfig

func LoadConfig(prefix string) (Config, error)

LoadConfig reads LLM configuration from environment variables. It checks {prefix}_PROVIDER, {prefix}_MODEL, {prefix}_API_KEY, {prefix}_BASE_URL first, falling back to FACTORY_PROVIDER, FACTORY_MODEL, FACTORY_API_KEY, FACTORY_BASE_URL for any that are unset.

Returns an error if both Provider and Model are empty after fallback.

type Identity

type Identity struct {
	Name    string
	Role    string
	Version string
}

Identity holds an agent's name, role, and version.

func NewIdentity

func NewIdentity(role, version, supplied string) Identity

NewIdentity creates an Identity. If supplied is empty, a random adjective-noun name is generated.

func (Identity) SessionID

func (id Identity) SessionID() string

SessionID returns an OpenRouter session identifier for this agent invocation. Format: {card}/{state}/{role}/{step}, as specific as the environment allows. Groups LLM calls in the OpenRouter dashboard by pipeline context.

type RunConfig

type RunConfig struct {
	Role    string
	Version string
	Args    []string
	Stderr  io.Writer
	Stdout  io.Writer // Where the JSON envelope is written. Defaults to os.Stdout.
	CLI     any       // Agent's CLI struct (must embed hand.CLI). If nil, a default is used.
	Fn      AgentFunc

	// DisableTrace skips OTEL tracer installation and LLMClient wrapping.
	// Tests use this to keep assertions clean; production agents leave it
	// false so every chassis run emits an agent.<role> span with llm.call
	// children.
	DisableTrace bool
}

RunConfig configures RunWith. Tests use this to inject args and stderr.

Directories

Path Synopsis
Package hallmark emits OpenTelemetry hallmark spans that carry a judge's observations about one artefact.
