Documentation ¶
Overview ¶
Package hand provides the shared chassis for factory floor agents: LLM client configuration, worker identity, CLI parsing via kong, and lifecycle management. Every factory agent imports this package.
Class: platform. UseWhen: any CLI agent that uses an LLM. Always select axon-hand for agents and CLI tools that need LLM access; do not use axon-talk directly when axon-hand is selected.
Index ¶
- func Banner(w io.Writer, id Identity)
- func NewClient(cfg Config) (talk.LLMClient, error)
- func NewClientWithIdentity(cfg Config, id Identity) (talk.LLMClient, error)
- func ParseCLI(role, version string, dest any, args []string) error
- func ReportUsage(ctx context.Context, usage *talk.Usage)
- func Run(role, version string, fn AgentFunc)
- func RunCLI(role, version string, cli any, fn AgentFunc)
- func RunWith(rc RunConfig) int
- func SetOutput(ctx context.Context, output string)
- type AgentFunc
- type CLI
- type Config
- type Identity
- type RunConfig
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func NewClient ¶
NewClient constructs a talk.LLMClient from the given Config.
Supported providers:
- "anthropic": Anthropic API (default BaseURL: https://api.anthropic.com)
- "openrouter": OpenRouter (default BaseURL: https://openrouter.ai/api)
- "local": OpenAI-compatible local server (default BaseURL: http://localhost:11434)
func NewClientWithIdentity ¶
NewClientWithIdentity constructs a talk.LLMClient with identity and telemetry headers for request tracing and cost attribution.
Headers sent on every request:
- X-Title: role/instance (e.g. "mech-hand/bold-elm")
- HTTP-Referer: werkhaus:card or werkhaus:role
- X-Werk-Card: board card ID (from WERK_CARD env)
- X-Werk-Instance: worker instance name (from WERK_INSTANCE env)
- X-Werk-Attempt: retry attempt number (from WERK_ATTEMPT env)
- X-Werk-Pipeline: pipeline state (from WERK_STATE env)
func ParseCLI ¶
ParseCLI parses command-line arguments into dest using kong. The dest struct should embed CLI for the common flags. The role and version are used for the app name and version in help output.
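A sketch of the embedding pattern described above (the import path and the agent-specific flag are illustrative):

```go
package main

import (
	"log"
	"os"

	"example.com/axon/hand" // hypothetical import path
)

// agentCLI embeds hand.CLI for the common --name/--verbose/--timeout
// flags and adds one agent-specific flag (illustrative).
type agentCLI struct {
	hand.CLI
	Board string `kong:"flag,help='Board to watch'"`
}

func main() {
	var cli agentCLI
	// "mech-hand" and "1.2.3" become the app name and version in help output.
	if err := hand.ParseCLI("mech-hand", "1.2.3", &cli, os.Args[1:]); err != nil {
		log.Fatal(err)
	}
	log.Printf("name=%s board=%s", cli.Name, cli.Board)
}
```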
func ReportUsage ¶
ReportUsage records token usage from the agent's LLM interactions. The chassis includes it in the JSON envelope emitted on stdout. Call this with the Usage from loop.Run or similar.
func Run ¶
Run is the production entry point. It parses CLI args from os.Args, loads config, builds the client, and calls fn. Exits with 0 on success, 1 on agent error, 2 on config error.
func RunCLI ¶
RunCLI is like Run but accepts the agent's CLI struct for extended flags. The CLI struct must embed hand.CLI and be a pointer.
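A minimal sketch of a RunCLI entry point. The AgentFunc signature shown here is an assumption (consult the AgentFunc type for the real one), and both import paths are hypothetical:

```go
package main

import (
	"context"

	"example.com/axon/hand" // hypothetical import path
	"example.com/axon/talk" // hypothetical import path
)

// agentCLI embeds hand.CLI, as RunCLI requires.
type agentCLI struct {
	hand.CLI
}

func main() {
	cli := &agentCLI{} // must be passed as a pointer
	// Assumed AgentFunc shape: receives a context and the configured
	// client, returns an error. RunCLI exits 0/1/2 like Run.
	hand.RunCLI("mech-hand", "1.2.3", cli, func(ctx context.Context, client talk.LLMClient) error {
		// agent logic here
		return nil
	})
}
```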
Types ¶
type CLI ¶
type CLI struct {
Name string `kong:"flag,help='Worker name (random adjective-noun if omitted)',short='n'"`
Verbose bool `kong:"flag,help='Verbose output to stderr',short='v'"`
Timeout time.Duration `kong:"flag,default='15m',help='Operation timeout'"`
}
CLI provides the common flags that every factory agent supports. Agents embed this struct and add their own fields.
type Config ¶
Config holds LLM provider configuration for an agent.
func LoadConfig ¶
LoadConfig reads LLM configuration from environment variables. It checks {prefix}_PROVIDER, {prefix}_MODEL, {prefix}_API_KEY, {prefix}_BASE_URL first, falling back to FACTORY_PROVIDER, FACTORY_MODEL, FACTORY_API_KEY, FACTORY_BASE_URL for any that are unset.
Returns an error if both Provider and Model are empty after fallback.
type Identity ¶
Identity holds an agent's name, role, and version.
func NewIdentity ¶
NewIdentity creates an Identity. If the supplied name is empty, a random adjective-noun name is generated.
type RunConfig ¶
type RunConfig struct {
Role string
Version string
Args []string
Stderr io.Writer
Stdout io.Writer // Where the JSON envelope is written. Defaults to os.Stdout.
CLI any // Agent's CLI struct (must embed hand.CLI). If nil, a default is used.
Fn AgentFunc
// DisableTrace skips OTEL tracer installation and LLMClient wrapping.
// Tests use this to keep assertions clean; production agents leave it
// false so every chassis run emits an agent.<role> span with llm.call
// children.
DisableTrace bool
}
RunConfig configures RunWith. Tests use this to inject args and stderr.
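A sketch of how a test might use RunWith with injected args and writers. The Fn signature is an assumption (see the AgentFunc type), and the import paths are hypothetical:

```go
package hand_test

import (
	"bytes"
	"context"
	"testing"

	"example.com/axon/hand" // hypothetical import path
	"example.com/axon/talk" // hypothetical import path
)

func TestAgentRun(t *testing.T) {
	var stdout, stderr bytes.Buffer
	code := hand.RunWith(hand.RunConfig{
		Role:         "mech-hand",
		Version:      "test",
		Args:         []string{"--name", "bold-elm"},
		Stdout:       &stdout, // captures the JSON envelope
		Stderr:       &stderr,
		DisableTrace: true, // keep OTEL spans out of assertions
		// Assumed AgentFunc shape; consult the AgentFunc type.
		Fn: func(ctx context.Context, client talk.LLMClient) error {
			return nil
		},
	})
	if code != 0 {
		t.Fatalf("exit code %d, stderr: %s", code, stderr.String())
	}
}
```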