face

package module v0.11.1

Published: Apr 21, 2026 License: MIT Imports: 15 Imported by: 0

README

axon-face

Reusable Bubble Tea components for building LLM-powered conversational TUIs on top of axon-loop.

Import: github.com/benaskins/axon-face

What it does

axon-face provides an embeddable chat component that handles the full lifecycle of an LLM conversation in the terminal: user input, streaming responses, tool use display, session persistence, and styled rendering via lipgloss.

Consumer apps (imago, vita) compose the Chat component into their own Bubble Tea models.

Components

Type           Purpose
Chat           Embeddable conversational TUI: viewport, textarea, streaming state
Session        Persisted conversation with messages, phase, and timestamps
Entry          Single item in the conversation view (user, agent, tool, action)
StreamTickMsg  Bridges axon-loop events into the Bubble Tea update cycle
Styles         Configurable lipgloss styles for all conversation elements

Usage

import face "github.com/benaskins/axon-face"

chat := face.New("my-agent")
cmd := chat.StartStream(client, req, tools)

The Chat component exposes HandleKey, HandleStreamTick, HandleResize, and View for integration into a parent Bubble Tea model.
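The delegation contract can be sketched without installing Bubble Tea or axon-face. In the sketch below, keyMsg, teaCmd, and chatStub are hypothetical stand-ins for tea.KeyMsg, tea.Cmd, and face.Chat; only the documented behaviour (enter is consumed and sets the waiting state, other keys fall through to UpdateInput) is modelled:

```go
package main

import "fmt"

// Hypothetical stand-ins for tea.KeyMsg, tea.Cmd, and face.Chat, so this
// sketch compiles without Bubble Tea or axon-face installed.
type keyMsg string
type teaCmd func()

type chatStub struct {
	entries []string
	waiting bool
}

// HandleKey mirrors the documented contract: on enter it appends the user
// message, sets waiting, and reports handled=true with a nil command.
func (c *chatStub) HandleKey(k keyMsg) (teaCmd, bool) {
	if k == "enter" {
		c.entries = append(c.entries, "user: hello")
		c.waiting = true
		return nil, true
	}
	return nil, false
}

// UpdateInput stands in for forwarding unhandled messages to the
// textarea and viewport sub-components.
func (c *chatStub) UpdateInput(k keyMsg) {}

// update shows the delegation pattern an outer model's Update would use.
func update(c *chatStub, k keyMsg) {
	if _, handled := c.HandleKey(k); handled {
		// A real parent model would call c.StartStream(client, req, tools) here.
		return
	}
	c.UpdateInput(k)
}

func main() {
	c := &chatStub{}
	update(c, "enter")
	fmt.Println(c.waiting, len(c.entries)) // true 1
}
```

The same shape extends to HandleResize (on tea.WindowSizeMsg), HandleStreamTick (on StreamTickMsg), and View in the parent's View method.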

Session management

Sessions are JSON files that track conversation state across restarts:

session := face.NewSession()
session.Save(dir)

incomplete := face.FindIncomplete(dir)

Dependencies

  • axon-loop, axon-talk, axon-tool
  • bubbletea, bubbles, lipgloss (Charm)

Build & Test

go test ./...
go vet ./...

Documentation

Overview

Package face provides reusable Bubble Tea components for building LLM-powered conversational TUIs on top of axon-loop.

Class: domain
UseWhen: Building terminal user interfaces for axon services.

Index

Constants

View Source
const (
	RoleUser   = "user"
	RoleAgent  = "agent"
	RoleTool   = "tool"
	RoleAction = "action"
)

Role constants for conversation entries.

Variables

This section is empty.

Functions

func SetupLogging

func SetupLogging(dir string) (func(), error)

SetupLogging creates a session log file in dir and configures slog to write JSON to it. Returns a cleanup function that closes the file.

func WaitForEvent

func WaitForEvent(ch <-chan loop.Event) tea.Cmd

WaitForEvent reads the next loop.Event from the stream channel and converts it to a StreamTickMsg for Bubble Tea's update loop.

func WordWrap

func WordWrap(s string, width int) string

WordWrap wraps text at the given width on word boundaries.
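Behaviour sketch: a greedy wrap consistent with the documented signature (the package's real implementation may differ on edge cases such as words longer than width):

```go
package main

import (
	"fmt"
	"strings"
)

// wordWrap greedily packs words into lines no wider than width, breaking
// only on word boundaries (a sketch, not the package's actual code).
func wordWrap(s string, width int) string {
	var lines []string
	line := ""
	for _, w := range strings.Fields(s) {
		switch {
		case line == "":
			line = w
		case len(line)+1+len(w) <= width:
			line += " " + w
		default:
			lines = append(lines, line)
			line = w
		}
	}
	if line != "" {
		lines = append(lines, line)
	}
	return strings.Join(lines, "\n")
}

func main() {
	fmt.Println(wordWrap("the quick brown fox jumps", 10))
	// the quick
	// brown fox
	// jumps
}
```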

Types

type Chat

type Chat struct {
	Entries   []Entry
	Streaming string
	Waiting   bool

	Input    textarea.Model
	Viewport viewport.Model
	Messages []loop.Message

	Width  int
	Height int
	Ready  bool

	AgentName string
	Styles    Styles
}

Chat is an embeddable conversational TUI component. It manages a viewport, textarea, streaming state, and conversation history. It does not implement tea.Model — the outer model calls its methods.

func New

func New(agentName string) Chat

New creates a Chat with sensible defaults.

func (*Chat) AppendEntry

func (c *Chat) AppendEntry(e Entry)

AppendEntry adds a display-only entry (no effect on LLM history).

func (*Chat) HandleKey

func (c *Chat) HandleKey(msg tea.KeyMsg) (cmd tea.Cmd, handled bool)

HandleKey processes base key events (enter, tab, ctrl+c). Returns a command and whether the key was handled. On enter: appends the user message, sets Waiting=true, and returns handled=true with a nil command. The outer model should call StartStream.

func (*Chat) HandleResize

func (c *Chat) HandleResize(msg tea.WindowSizeMsg)

HandleResize updates dimensions and re-renders the viewport.

func (*Chat) HandleStreamTick

func (c *Chat) HandleStreamTick(msg StreamTickMsg) tea.Cmd

HandleStreamTick processes a streaming event from the LLM. Returns a command to continue reading from the stream, or nil when done.

func (*Chat) InitCmd

func (c *Chat) InitCmd() tea.Cmd

InitCmd returns the textarea blink command for use in the outer model's Init().

func (*Chat) RefreshViewport

func (c *Chat) RefreshViewport()

RefreshViewport rebuilds the viewport content from entries and streaming state.

func (*Chat) SendUser

func (c *Chat) SendUser(content string)

SendUser appends a user message to both entries and LLM history, and sets the chat to waiting state.

func (*Chat) StartStream

func (c *Chat) StartStream(client loop.LLMClient, req *loop.Request, tools map[string]tool.ToolDef) tea.Cmd

StartStream launches an LLM conversation via loop.Stream and returns a Bubble Tea command that feeds events back through StreamTickMsg.

func (*Chat) UpdateInput

func (c *Chat) UpdateInput(msg tea.Msg) tea.Cmd

UpdateInput forwards a message to the textarea and viewport sub-components. Call this in the outer model's Update for messages not handled by HandleKey.

func (*Chat) View

func (c *Chat) View(status string) string

View returns the composed layout: viewport + status bar + input. The caller provides the status text (e.g. "thinking..." or key hints).

type Entry

type Entry struct {
	Role      string
	Content   string
	Collapsed bool // tool entries can be toggled
}

Entry is a single item in the conversation view.

type Session

type Session struct {
	ID        string         `json:"id"`
	Messages  []talk.Message `json:"messages"`
	Phase     string         `json:"phase,omitempty"`
	State     map[string]any `json:"state,omitempty"`
	CreatedAt time.Time      `json:"created_at"`
	UpdatedAt time.Time      `json:"updated_at"`
	Complete  bool           `json:"complete"`
}

Session represents a persisted conversation session. The State field is an escape hatch for app-specific data — apps marshal their own types into it (sections, approvals, etc.).

func FindIncomplete

func FindIncomplete(dir string) *Session

FindIncomplete returns the most recent incomplete session in dir, or nil if none exists.

func LoadSession

func LoadSession(dir, id string) (*Session, error)

LoadSession reads a specific session by ID from dir.

func NewSession

func NewSession() *Session

NewSession creates a session with a timestamped ID.

func (*Session) MarkComplete

func (s *Session) MarkComplete(dir string) error

MarkComplete marks the session as finished and saves it.

func (*Session) Save

func (s *Session) Save(dir string) error

Save writes the session to dir as a JSON file.

type StreamEvent

type StreamEvent struct {
	Token   string
	Tool    *ToolUseEvent
	Done    bool
	Err     error
	Content string // final content on done
}

StreamEvent is a parsed event from the LLM stream.

type StreamTickMsg

type StreamTickMsg struct {
	Event StreamEvent
	Ch    <-chan loop.Event
}

StreamTickMsg wraps a stream event for the Bubble Tea update loop. The channel carries the next event for re-subscription.
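The re-subscription pattern can be sketched without Bubble Tea: each tick carries the channel, and the handler issues another read until the stream reports done. Here event and tickMsg are local stand-ins for loop.Event and StreamTickMsg:

```go
package main

import "fmt"

// event stands in for loop.Event in this sketch.
type event struct {
	Token string
	Done  bool
}

// tickMsg mirrors StreamTickMsg's shape: the message carries the channel
// so the update handler can subscribe to the next event.
type tickMsg struct {
	Event event
	Ch    <-chan event
}

// waitForEvent mirrors face.WaitForEvent: block for one event, wrap it.
func waitForEvent(ch <-chan event) tickMsg {
	return tickMsg{Event: <-ch, Ch: ch}
}

func main() {
	ch := make(chan event, 4)
	ch <- event{Token: "hel"}
	ch <- event{Token: "lo"}
	ch <- event{Done: true}

	// The update loop: append tokens, re-subscribe until Done.
	out := ""
	for msg := waitForEvent(ch); ; msg = waitForEvent(msg.Ch) {
		if msg.Event.Done {
			break
		}
		out += msg.Event.Token
	}
	fmt.Println(out) // hello
}
```

In the real component, HandleStreamTick plays the role of the loop body and returns WaitForEvent(msg.Ch) as the continuation command, or nil on Done.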

type Styles

type Styles struct {
	User       lipgloss.Style
	Agent      lipgloss.Style
	AgentLabel lipgloss.Style
	Tool       lipgloss.Style
	Action     lipgloss.Style
	Approved   lipgloss.Style
	Rejected   lipgloss.Style
	Status     lipgloss.Style
	Model      lipgloss.Style
}

Styles holds the lipgloss styles used by the Chat viewport.

func DefaultStyles

func DefaultStyles() Styles

DefaultStyles returns the standard colour palette shared across axon-face applications.

type ToolUseEvent

type ToolUseEvent struct {
	Name string
	Args map[string]any
}

ToolUseEvent carries tool invocation details from the LLM.
