Documentation ¶
Overview ¶
Package face provides reusable Bubble Tea components for building LLM-powered conversational TUIs on top of axon-loop.
Index ¶
- Constants
- func SetupLogging(dir string) (func(), error)
- func WaitForEvent(ch <-chan loop.Event) tea.Cmd
- func WordWrap(s string, width int) string
- type Chat
- func (c *Chat) AppendEntry(e Entry)
- func (c *Chat) HandleKey(msg tea.KeyMsg) (cmd tea.Cmd, handled bool)
- func (c *Chat) HandleResize(msg tea.WindowSizeMsg)
- func (c *Chat) HandleStreamTick(msg StreamTickMsg) tea.Cmd
- func (c *Chat) InitCmd() tea.Cmd
- func (c *Chat) RefreshViewport()
- func (c *Chat) SendUser(content string)
- func (c *Chat) StartStream(client loop.LLMClient, req *loop.Request, tools map[string]tool.ToolDef) tea.Cmd
- func (c *Chat) UpdateInput(msg tea.Msg) tea.Cmd
- func (c *Chat) View(status string) string
- type Entry
- type Session
- type StreamEvent
- type StreamTickMsg
- type Styles
- type ToolUseEvent
Constants ¶
const (
	RoleUser   = "user"
	RoleAgent  = "agent"
	RoleTool   = "tool"
	RoleAction = "action"
)
Role constants for conversation entries.
Variables ¶
This section is empty.
Functions ¶
func SetupLogging ¶
func SetupLogging(dir string) (func(), error)
SetupLogging creates a session log file in dir and configures slog to write JSON to it. Returns a cleanup function that closes the file.
func WaitForEvent ¶
func WaitForEvent(ch <-chan loop.Event) tea.Cmd
WaitForEvent reads the next loop.Event from the stream channel and converts it to a StreamTickMsg for Bubble Tea's update loop.
Types ¶
type Chat ¶
type Chat struct {
Entries []Entry
Streaming string
Waiting bool
Input textarea.Model
Viewport viewport.Model
Messages []loop.Message
Width int
Height int
Ready bool
AgentName string
Styles Styles
}
Chat is an embeddable conversational TUI component. It manages a viewport, textarea, streaming state, and conversation history. It does not implement tea.Model — the outer model calls its methods.
func (*Chat) AppendEntry ¶
func (c *Chat) AppendEntry(e Entry)
AppendEntry adds a display-only entry (no effect on LLM history).
func (*Chat) HandleKey ¶
func (c *Chat) HandleKey(msg tea.KeyMsg) (cmd tea.Cmd, handled bool)
HandleKey processes base key events (enter, tab, ctrl+c). Returns a command and whether the key was handled. On enter: appends the user message, sets Waiting=true, and returns handled=true with a nil command. The outer model should call StartStream.
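The delegation described above can be sketched as an outer model's Update. The import paths, the field names, the empty loop.Request, and the Waiting check after HandleKey are illustrative assumptions, not prescribed usage:

```go
package main

import (
	tea "github.com/charmbracelet/bubbletea"

	"example.com/axon-face/face" // placeholder import path
	"example.com/axon-loop/loop" // placeholder import path
)

type model struct {
	chat   face.Chat
	client loop.LLMClient
}

func (m model) Init() tea.Cmd { return m.chat.InitCmd() }
func (m model) View() string  { return m.chat.View("ready") }

func (m model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
	switch msg := msg.(type) {
	case tea.WindowSizeMsg:
		m.chat.HandleResize(msg)
		return m, nil
	case tea.KeyMsg:
		if cmd, handled := m.chat.HandleKey(msg); handled {
			// On enter, HandleKey returns handled=true with a nil
			// command and sets Waiting; the outer model starts the
			// stream itself. The Request construction is app-specific.
			if cmd == nil && m.chat.Waiting {
				return m, m.chat.StartStream(m.client, &loop.Request{}, nil)
			}
			return m, cmd
		}
	case face.StreamTickMsg:
		return m, m.chat.HandleStreamTick(msg)
	}
	// Everything else goes to the textarea/viewport sub-components.
	return m, m.chat.UpdateInput(msg)
}

func main() {
	if _, err := tea.NewProgram(model{}).Run(); err != nil {
		panic(err)
	}
}
```

Returning the mutated copy of the model is what persists the pointer-receiver updates made to the embedded Chat.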
func (*Chat) HandleResize ¶
func (c *Chat) HandleResize(msg tea.WindowSizeMsg)
HandleResize updates dimensions and re-renders the viewport.
func (*Chat) HandleStreamTick ¶
func (c *Chat) HandleStreamTick(msg StreamTickMsg) tea.Cmd
HandleStreamTick processes a streaming event from the LLM. Returns a command to continue reading from the stream, or nil when done.
func (*Chat) InitCmd ¶
func (c *Chat) InitCmd() tea.Cmd
InitCmd returns the textarea blink command for use in the outer model's Init().
func (*Chat) RefreshViewport ¶
func (c *Chat) RefreshViewport()
RefreshViewport rebuilds the viewport content from entries and streaming state.
func (*Chat) SendUser ¶
func (c *Chat) SendUser(content string)
SendUser appends a user message to both entries and LLM history, and sets the chat to waiting state.
func (*Chat) StartStream ¶
func (c *Chat) StartStream(client loop.LLMClient, req *loop.Request, tools map[string]tool.ToolDef) tea.Cmd
StartStream launches an LLM conversation via loop.Stream and returns a Bubble Tea command that feeds events back through StreamTickMsg.
func (*Chat) UpdateInput ¶
func (c *Chat) UpdateInput(msg tea.Msg) tea.Cmd
UpdateInput forwards a message to the textarea and viewport sub-components. Call this in the outer model's Update for messages not handled by HandleKey.
type Session ¶
type Session struct {
ID string `json:"id"`
Messages []talk.Message `json:"messages"`
Phase string `json:"phase,omitempty"`
State map[string]any `json:"state,omitempty"`
CreatedAt time.Time `json:"created_at"`
UpdatedAt time.Time `json:"updated_at"`
Complete bool `json:"complete"`
}
Session represents a persisted conversation session. The State field is an escape hatch for app-specific data — apps marshal their own types into it (sections, approvals, etc.).
func FindIncomplete ¶
FindIncomplete returns the most recent incomplete session in dir, or nil if none exists.
func LoadSession ¶
LoadSession reads a specific session by ID from dir.
func (*Session) MarkComplete ¶
MarkComplete marks the session as finished and saves it.
type StreamEvent ¶
type StreamEvent struct {
Token string
Tool *ToolUseEvent
Done bool
Err error
Content string // final content on done
}
StreamEvent is a parsed event from the LLM stream.
type StreamTickMsg ¶
type StreamTickMsg struct {
Event StreamEvent
Ch <-chan loop.Event
}
StreamTickMsg wraps a stream event for the Bubble Tea update loop. The embedded channel lets the handler re-subscribe for the next event.
type Styles ¶
type Styles struct {
User lipgloss.Style
Agent lipgloss.Style
AgentLabel lipgloss.Style
Tool lipgloss.Style
Action lipgloss.Style
Approved lipgloss.Style
Rejected lipgloss.Style
Status lipgloss.Style
Model lipgloss.Style
}
Styles holds the lipgloss styles used by the Chat viewport.
func DefaultStyles ¶
func DefaultStyles() Styles
DefaultStyles returns the standard colour palette shared across axon-face applications.
type ToolUseEvent ¶
ToolUseEvent carries tool invocation details from the LLM.