chit

package module
v0.0.1
Published: Mar 20, 2026 License: MIT Imports: 7 Imported by: 0

README

chit


Conversation lifecycle controller for LLM-powered applications.

Manage user conversations, orchestrate pluggable processors, and stream responses — while keeping internal reasoning separate from what users see.

The Conversation Belongs to You

Your processor receives input and read-only history. What it does with an LLM is its own business.

// Processor handles reasoning — how it talks to LLMs is an implementation detail
processor := chit.ProcessorFunc(func(ctx context.Context, input string, history []chit.Message) (chit.Result, error) {
    // history is the user-facing conversation (read-only)
    // Internal LLM calls, chain-of-thought, tool use — none of it pollutes history
    response := callYourLLM(input, history)
    return &chit.Response{Content: response}, nil
})

// Emitter streams output to the user. StreamingEmitter is your own
// chit.Emitter implementation; w is whatever io.Writer you stream to.
emitter := &StreamingEmitter{writer: w}

// Chat manages the lifecycle
chat := chit.New(processor, emitter)

// Handle user input — chit manages history, you manage reasoning
chat.Handle(ctx, "What's the weather in Tokyo?")

The user sees a clean conversation. Your processor can have elaborate internal dialogues with the LLM — retries, tool calls, multi-step reasoning — and none of it leaks through.

Install

go get github.com/zoobz-io/chit

Requires Go 1.24+.

Quick Start

package main

import (
    "context"
    "fmt"

    "github.com/zoobz-io/chit"
)

// SimpleEmitter collects messages for demonstration
type SimpleEmitter struct {
    Messages []chit.Message
}

func (e *SimpleEmitter) Emit(_ context.Context, msg chit.Message) error {
    e.Messages = append(e.Messages, msg)
    fmt.Printf("[%s]: %s\n", msg.Role, msg.Content)
    return nil
}
func (e *SimpleEmitter) Push(_ context.Context, _ chit.Resource) error { return nil }
func (e *SimpleEmitter) Close() error                                  { return nil }

func main() {
    // Processor returns responses or yields for multi-turn
    processor := chit.ProcessorFunc(func(_ context.Context, input string, history []chit.Message) (chit.Result, error) {
        // First message — ask for clarification
        if len(history) == 1 {
            return &chit.Yield{
                Prompt: "Which city would you like weather for?",
                Continuation: func(_ context.Context, city string, _ []chit.Message) (chit.Result, error) {
                    return &chit.Response{Content: fmt.Sprintf("Weather in %s: Sunny, 22°C", city)}, nil
                },
            }, nil
        }
        return &chit.Response{Content: "Hello! How can I help?"}, nil
    })

    emitter := &SimpleEmitter{}
    chat := chit.New(processor, emitter)

    // First call yields, asking for more info
    chat.Handle(context.Background(), "What's the weather?")
    // [assistant]: Which city would you like weather for?

    // Second call resumes with the answer
    chat.Handle(context.Background(), "Tokyo")
    // [assistant]: Weather in Tokyo: Sunny, 22°C

    // History tracked automatically
    fmt.Printf("Conversation has %d messages\n", len(chat.History()))
}

Capabilities

Feature               Description                                   Docs
Processor Interface   Pluggable reasoning with read-only history    Concepts
Yield & Continue      Multi-turn conversations via continuations    Concepts
Pipeline Resilience   Retry, timeout, circuit breaker via pipz      Reliability
Emitter Abstraction   Stream responses, push resources              Architecture
Signal Observability  Lifecycle events via capitan                  Architecture
Testing Utilities     Mock processors and emitters                  Testing

Why chit?

  • Clean separation — User conversation stays clean; internal LLM reasoning is the processor's business
  • Turn-taking built in — Yield/Continue pattern for multi-turn without manual state management
  • Pipeline-native — Wrap with pipz for retry, timeout, rate limiting, circuit breakers
  • Observable — Lifecycle signals via capitan without instrumentation code
  • Bring your own LLM — Processor interface works with any LLM client or framework

Bring Your Own Reasoning

Chit manages the conversation lifecycle. How you reason is up to you.

// Use zyn for typed LLM interactions
processor := chit.ProcessorFunc(func(ctx context.Context, input string, history []chit.Message) (chit.Result, error) {
    // Create internal session — separate from user history
    session := zyn.NewSession()
    session.Append(zyn.RoleSystem, "You are a helpful assistant.")

    // Add user context
    for _, msg := range history {
        session.Append(zyn.Role(msg.Role), msg.Content)
    }

    // Call LLM — internal retries, tool use, etc. stay internal
    response, _ := synapse.Process(ctx, session) // synapse constructed elsewhere; error handling elided for brevity
    return &chit.Response{Content: response}, nil
})

// Add resilience via pipz options
chat := chit.New(processor, emitter,
    chit.WithRetry(3),
    chit.WithTimeout(30*time.Second),
    chit.WithCircuitBreaker(5, time.Minute),
)

Your processor implementation can use zyn for typed synapses, raw API calls, or any other approach. Chit doesn't care — it just manages what the user sees.

Documentation

  • Learn
    • Overview — Purpose and design philosophy
    • Quickstart — Get started in minutes
    • Concepts — Processors, results, emitters, history
    • Architecture — Internal design and pipeline integration
  • Guides
  • Reference
    • API — Complete function documentation
    • Types — Message, Result, Response, Yield

Contributing

See CONTRIBUTING.md for guidelines.

License

MIT License — see LICENSE for details.

Documentation

Overview

Package chit provides conversation lifecycle management for LLM-powered chat applications.

Chit is a focused conversation controller that orchestrates pluggable processors for reasoning, action execution, and data retrieval. It handles turn-taking, session management, and response streaming.

Index

Constants

This section is empty.

Variables

var (
	// ErrUnknownResultType is returned when a Processor returns a Result
	// that is neither a Response nor a Yield.
	ErrUnknownResultType = errors.New("chit: unknown result type")

	// ErrNilProcessor is returned when attempting to create a Chat without a Processor.
	ErrNilProcessor = errors.New("chit: processor is required")

	// ErrNilEmitter is returned when attempting to create a Chat without an Emitter.
	ErrNilEmitter = errors.New("chit: emitter is required")

	// ErrEmitterClosed is returned when attempting to emit after the Emitter is closed.
	ErrEmitterClosed = errors.New("chit: emitter is closed")
)

Sentinel errors for chit operations.

var (
	// Chat lifecycle signals.
	ChatCreated = capitan.NewSignal(
		"chit.chat.created",
		"New chat conversation initiated",
	)

	// Input signals.
	InputReceived = capitan.NewSignal(
		"chit.input.received",
		"User input received for processing",
	)

	// Processing signals.
	ProcessingStarted = capitan.NewSignal(
		"chit.processing.started",
		"Processor began handling input",
	)
	ProcessingCompleted = capitan.NewSignal(
		"chit.processing.completed",
		"Processor finished handling input",
	)
	ProcessingFailed = capitan.NewSignal(
		"chit.processing.failed",
		"Processor encountered an error",
	)

	// Response signals.
	ResponseEmitted = capitan.NewSignal(
		"chit.response.emitted",
		"Response message emitted to client",
	)

	// Resource signals.
	ResourcePushed = capitan.NewSignal(
		"chit.resource.pushed",
		"Structured resource pushed to client",
	)

	// Turn management signals.
	TurnYielded = capitan.NewSignal(
		"chit.turn.yielded",
		"Processing yielded awaiting input",
	)
	TurnResumed = capitan.NewSignal(
		"chit.turn.resumed",
		"Processing resumed from continuation",
	)
)

Signal definitions for chit conversation lifecycle events. Signals follow the pattern: chit.<entity>.<event>.

var (
	// Chat metadata.
	FieldChatID = capitan.NewStringKey("chat_id")

	// Input metadata.
	FieldInput     = capitan.NewStringKey("input")
	FieldInputSize = capitan.NewIntKey("input_size")

	// Processing metadata.
	FieldProcessingDuration = capitan.NewDurationKey("processing_duration")

	// Response metadata.
	FieldRole        = capitan.NewStringKey("role")
	FieldContentSize = capitan.NewIntKey("content_size")

	// Resource metadata.
	FieldResourceType = capitan.NewStringKey("resource_type")
	FieldResourceURI  = capitan.NewStringKey("resource_uri")

	// Turn metadata.
	FieldPrompt = capitan.NewStringKey("prompt")

	// Error information.
	FieldError = capitan.NewErrorKey("error")
)

Field keys for chit event data.

Functions

func NewContinuationTerminal

func NewContinuationTerminal(cont Continuation) pipz.Chainable[*ChatRequest]

NewContinuationTerminal creates a terminal that calls the given continuation. Used to wrap continuation calls with the same reliability options as fresh calls.

func NewMiddleware

func NewMiddleware(fn func(context.Context, *ChatRequest) (*ChatRequest, error)) pipz.Chainable[*ChatRequest]

NewMiddleware creates middleware from a callback function. The callback can inspect/modify the request or return an error to halt processing. For advanced use cases, create pipz processors directly and pass to WithMiddleware.

func NewTerminal

func NewTerminal(processor Processor) pipz.Chainable[*ChatRequest]

NewTerminal creates the terminal processor that calls the Processor. This is the innermost layer of the pipeline that performs the actual processing.

func WithEmitter

func WithEmitter(ctx context.Context, e Emitter) context.Context

WithEmitter returns a new context with the given Emitter attached. Processors can retrieve the Emitter via EmitterFromContext to push resources or stream partial responses during processing.

Types

type Chat

type Chat struct {
	// contains filtered or unexported fields
}

Chat is a conversation lifecycle controller. It orchestrates a pluggable Processor for reasoning and execution, manages turn-taking via continuations, and streams responses through an Emitter.

func New

func New(processor Processor, emitter Emitter, opts ...Option) *Chat

New creates a new Chat with the given processor and emitter. Both processor and emitter are required. Use functional options to configure the Chat further.

func (*Chat) Config

func (c *Chat) Config() *Config

Config returns the configuration for this Chat.

func (*Chat) GetPipeline

func (c *Chat) GetPipeline() pipz.Chainable[*ChatRequest]

GetPipeline returns the internal pipeline for composition. This implements ChatProvider for use with WithFallback.

func (*Chat) Handle

func (c *Chat) Handle(ctx context.Context, input string) error

Handle processes user input through the conversation lifecycle.

The lifecycle is:

  1. Add user input to history
  2. If a continuation exists, resume with the input
  3. Otherwise, process the input through the pipeline
  4. Add response to history and emit via Emitter
  5. If result is a Yield, store the continuation for next input

func (*Chat) HasContinuation

func (c *Chat) HasContinuation() bool

HasContinuation returns true if the Chat is awaiting input to resume a previously yielded processing step.

func (*Chat) History

func (c *Chat) History() []Message

History returns a copy of the conversation history.

func (*Chat) ID

func (c *Chat) ID() string

ID returns the unique identifier for this Chat.

type ChatProvider

type ChatProvider interface {
	GetPipeline() pipz.Chainable[*ChatRequest]
}

ChatProvider is implemented by types that can provide a pipeline for composition.

type ChatRequest

type ChatRequest struct {
	// Input fields
	Input   string    // User input to process
	History []Message // User conversation history (read-only snapshot)

	// Metadata fields
	ChatID    string // ID of the chat instance
	RequestID string // Unique identifier for this request

	// Output fields (populated by terminal)
	Result Result // Processing result (Response or Yield)
}

ChatRequest flows through the pipz pipeline. It contains input, conversation history (read-only), metadata, and output fields.

func (*ChatRequest) Clone

func (r *ChatRequest) Clone() *ChatRequest

Clone implements pipz.Cloner for concurrent middleware support.

type Config

type Config struct {
	// SystemPrompt is the system message that guides conversation behavior.
	SystemPrompt string

	// Metadata contains optional additional configuration data.
	Metadata map[string]any
}

Config holds configuration for a Chat.

type Continuation

type Continuation func(ctx context.Context, input string, history []Message) (Result, error)

Continuation is a function that resumes processing with new input. History provides the updated user conversation since the yield.

type Emitter

type Emitter interface {
	// Emit streams response content (conversational text) to the client.
	Emit(ctx context.Context, msg Message) error

	// Push sends structured resources synchronously to the client.
	// Used for delivering fetched data, context, or tool results.
	Push(ctx context.Context, resource Resource) error

	// Close signals the end of output and releases resources.
	Close() error
}

Emitter handles all output to the client. It supports both streaming response content and pushing structured resources.

func EmitterFromContext

func EmitterFromContext(ctx context.Context) Emitter

EmitterFromContext retrieves the Emitter from the context. Returns nil if no Emitter is present.

type Message

type Message struct {
	// Role identifies the message sender (e.g., "assistant", "system").
	Role string

	// Content is the text content of the message.
	Content string

	// Metadata contains optional additional information about the message.
	Metadata map[string]any
}

Message is a streamed response unit in a conversation.

type Option

type Option func(*Chat)

Option is a functional option for configuring a Chat.

func WithBackoff

func WithBackoff(maxAttempts int, baseDelay time.Duration) Option

WithBackoff adds retry logic with exponential backoff to the pipeline. Failed requests are retried with increasing delays between attempts.

func WithCircuitBreaker

func WithCircuitBreaker(failures int, recovery time.Duration) Option

WithCircuitBreaker adds circuit breaker protection to the pipeline. After 'failures' consecutive failures, the circuit opens for 'recovery' duration.

func WithConfig

func WithConfig(config *Config) Option

WithConfig sets the configuration for the Chat.

func WithErrorHandler

func WithErrorHandler(handler pipz.Chainable[*pipz.Error[*ChatRequest]]) Option

WithErrorHandler adds error handling to the pipeline. The error handler receives error context and can process/log/alert as needed.

func WithFallback

func WithFallback(fallback ChatProvider) Option

WithFallback adds a fallback chat for resilience. If the primary processor fails, the fallback will be tried.

func WithHistory

func WithHistory(history []Message) Option

WithHistory sets the initial conversation history for the Chat.

func WithMetadata

func WithMetadata(metadata map[string]any) Option

WithMetadata sets metadata on the Chat's configuration.

func WithMiddleware

func WithMiddleware(processors ...pipz.Chainable[*ChatRequest]) Option

WithMiddleware adds pre-processing steps before the terminal. Processors run in order, then the terminal (processor) runs.

func WithRateLimit

func WithRateLimit(rps float64, burst int) Option

WithRateLimit adds rate limiting to the pipeline. rps = requests per second, burst = burst capacity.

func WithRetry

func WithRetry(maxAttempts int) Option

WithRetry adds retry logic to the pipeline. Failed requests are retried up to maxAttempts times.

func WithSystemPrompt

func WithSystemPrompt(prompt string) Option

WithSystemPrompt sets the system prompt for the Chat. This is a convenience option that creates a Config with the system prompt.

func WithTimeout

func WithTimeout(duration time.Duration) Option

WithTimeout adds timeout protection to the pipeline. Operations exceeding this duration will be canceled.

type PipelineOption

type PipelineOption func(pipz.Chainable[*ChatRequest]) pipz.Chainable[*ChatRequest]

PipelineOption wraps a pipeline with additional behavior.

type Processor

type Processor interface {
	// Process handles user input and returns a result.
	// History provides the user conversation for context (read-only).
	// Returns a Response when complete, or a Yield when awaiting input.
	Process(ctx context.Context, input string, history []Message) (Result, error)
}

Processor handles reasoning and execution logic for a conversation. It receives user input and conversation history (read-only) for context, and returns a Result. How the processor communicates with LLMs is an internal implementation detail.

type ProcessorFunc

type ProcessorFunc func(ctx context.Context, input string, history []Message) (Result, error)

ProcessorFunc is an adapter that allows using a function as a Processor.

func (ProcessorFunc) Process

func (f ProcessorFunc) Process(ctx context.Context, input string, history []Message) (Result, error)

Process implements the Processor interface.

type Resource

type Resource struct {
	// Type identifies the kind of resource (e.g., "data", "context", "tool_result").
	Type string

	// URI is the scio URI if this resource was retrieved from a data source.
	URI string

	// Payload is the structured data being delivered.
	Payload any

	// Metadata contains optional additional information about the resource.
	Metadata map[string]any
}

Resource is structured data pushed to the client. Used for delivering fetched data, context enrichment, or tool results.

type Response

type Response struct {
	// Content is the response text to emit.
	Content string

	// Metadata contains optional additional information about the response.
	Metadata map[string]any
}

Response is a complete result with content to emit to the user.

func (*Response) IsComplete

func (r *Response) IsComplete() bool

IsComplete returns true for Response.

func (*Response) IsYielded

func (r *Response) IsYielded() bool

IsYielded returns false for Response.

type Result

type Result interface {
	// IsComplete returns true if processing finished with a response to emit.
	IsComplete() bool

	// IsYielded returns true if processing paused awaiting further input.
	IsYielded() bool
}

Result represents the outcome of processing user input. A result is either complete (Response) or yielded (Yield).
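Callers can branch on a Result via the boolean methods or a type switch on the concrete type. The sketch below uses local mirrors of the documented shape, for illustration only:

```go
package main

import "fmt"

// Local mirrors of chit's Result shape, for illustration only.
type Result interface {
	IsComplete() bool
	IsYielded() bool
}

type Response struct{ Content string }

func (*Response) IsComplete() bool { return true }
func (*Response) IsYielded() bool  { return false }

type Yield struct{ Prompt string }

func (*Yield) IsComplete() bool { return false }
func (*Yield) IsYielded() bool  { return true }

// describe branches on the concrete type, which is how the two outcomes
// are distinguished; an unrecognized type is the case where chit returns
// ErrUnknownResultType.
func describe(r Result) string {
	switch v := r.(type) {
	case *Response:
		return "complete: " + v.Content
	case *Yield:
		return "yielded: " + v.Prompt
	default:
		return "unknown result type"
	}
}

func main() {
	fmt.Println(describe(&Response{Content: "done"}))
	fmt.Println(describe(&Yield{Prompt: "which city?"}))
}
```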

type Yield

type Yield struct {
	// Prompt is the message to emit while awaiting input (e.g., a question).
	Prompt string

	// Continuation is the function to call with the next input to resume processing.
	Continuation Continuation

	// Metadata contains optional additional information about the yield.
	Metadata map[string]any
}

Yield indicates processing has paused and is awaiting further input. The Continuation function resumes processing when input arrives.

func (*Yield) IsComplete

func (y *Yield) IsComplete() bool

IsComplete returns false for Yield.

func (*Yield) IsYielded

func (y *Yield) IsYielded() bool

IsYielded returns true for Yield.

Directories

Path Synopsis
Package testing provides test helpers for chit.
