Documentation ¶
Overview ¶
Package chat provides functionality for interacting with Large Language Models (LLMs) through both direct prompts and streaming interfaces.
This package implements core chat functionality for the aigent application, including:
- Direct prompt handling for simple question-answer interactions
- Streaming chat responses for real-time UI updates
- Tool execution capabilities allowing LLMs to perform actions
- Session management for maintaining conversation context
The package is designed around several key components:
- Manager: Represents a conversation with an LLM, maintaining message history
- StreamHandler: Interface for handling streaming responses from LLMs
- ToolRegistry: Manages available tools that LLMs can use to perform actions
Typical usage involves creating a Manager, configuring options through one of the available Options structs (CoreChatOptions, PromptOptions, or StreamOptions), and then either handling a direct prompt or processing a streaming chat response.
Tool execution is a central feature, allowing LLMs to perform actions like retrieving the current time. The package provides a SetupToolRegistry function to configure default tools, and both direct and streaming interfaces support tool execution.
Example usage for direct prompt handling:
    options := chat.PromptOptions{
        CoreChatOptions: chat.CoreChatOptions{
            Client:       client,
            Provider:     "openai",
            ModelID:      "gpt-4",
            Logger:       logger,
            ToolRegistry: chat.SetupToolRegistry(logger),
            Temperature:  0.7,
            MaxTokens:    2000,
        },
        EnableTools: true,
    }

    err := chat.HandleDirectPrompt(ctx, "What time is it?", options)
Example usage for streaming chat:
    session := chat.NewManager()
    session.SetMessages([]llm.Message{
        {Role: llm.RoleUser, Content: "What time is it?"},
    })

    handler := &MyStreamHandler{} // Implements the StreamHandler interface

    options := chat.StreamOptions{
        CoreChatOptions: chat.CoreChatOptions{
            Client:       client,
            Provider:     "openai",
            ModelID:      "gpt-4",
            Logger:       logger,
            ToolRegistry: chat.SetupToolRegistry(logger),
            Temperature:  0.7,
            MaxTokens:    2000,
        },
        Handler: handler,
    }

    err := chat.ProcessChatStream(ctx, session, options)
Index ¶
- func AddMCPToolsToPersona(logger *slog.Logger, p persona.Persona, mcpManager mcp.Manager)
- func CleanupAllPersonas(logger *slog.Logger, personaRegistry *persona.Registry)
- func GetSystemPromptFromPersona(logger *slog.Logger, personaRegistry *persona.Registry, personaID string) string
- func GetToolRegistryFromPersona(logger *slog.Logger, personaRegistry *persona.Registry, personaID string, ...) *llm.ToolRegistry
- func HandleDirectPrompt(ctx context.Context, prompt string, options PromptOptions) error
- func ProcessChatStream(ctx context.Context, session *Manager, options StreamOptions) error
- func SetupPersonaRegistry(logger *slog.Logger) *persona.Registry
- func SetupToolRegistry(logger *slog.Logger) *llm.ToolRegistry
- type ConsoleStreamHandler
- type Conversation
- type CoreChatOptions
- type MCPManager
- type MCPServer
- type MCPServerConfig
- type Manager
- func (s *Manager) AddConversation() string
- func (s *Manager) AddMessage(message llm.Message)
- func (s *Manager) AddMessageToConversation(conversationID string, message llm.Message) bool
- func (s *Manager) GetActiveConversation() *Conversation
- func (s *Manager) GetConversationByID(id string) *Conversation
- func (s *Manager) GetMessages() []llm.Message
- func (s *Manager) RemoveConversation(id string) error
- func (s *Manager) SetActiveConversation(id string) bool
- func (s *Manager) SetMessages(messages []llm.Message)
- type NoopStreamHandler
- type PromptOptions
- type StreamHandler
- type StreamOptions
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
func AddMCPToolsToPersona ¶
func AddMCPToolsToPersona(logger *slog.Logger, p persona.Persona, mcpManager mcp.Manager)
AddMCPToolsToPersona adds MCP tools to a persona.
func CleanupAllPersonas ¶
func CleanupAllPersonas(logger *slog.Logger, personaRegistry *persona.Registry)
CleanupAllPersonas calls the Cleanup method on all registered personas.
func GetSystemPromptFromPersona ¶
func GetSystemPromptFromPersona(logger *slog.Logger, personaRegistry *persona.Registry, personaID string) string
GetSystemPromptFromPersona returns the system prompt for the specified persona. If the persona is not found or an error occurs, it returns a default system prompt.
func GetToolRegistryFromPersona ¶
func GetToolRegistryFromPersona(logger *slog.Logger, personaRegistry *persona.Registry, personaID string, mcpManager mcp.Manager) *llm.ToolRegistry
GetToolRegistryFromPersona creates a tool registry based on the specified persona. If the persona is not found or an error occurs, it falls back to the default tool registry.
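A minimal sketch of how the persona helpers compose (the logger setup is illustrative, "coder" is a hypothetical persona ID, and mcpManager is assumed to be an existing mcp.Manager value):

    logger := slog.New(slog.NewTextHandler(os.Stderr, nil))

    personaRegistry := chat.SetupPersonaRegistry(logger)
    defer chat.CleanupAllPersonas(logger, personaRegistry)

    // Both helpers fall back to defaults if the persona is not found.
    systemPrompt := chat.GetSystemPromptFromPersona(logger, personaRegistry, "coder")
    toolRegistry := chat.GetToolRegistryFromPersona(logger, personaRegistry, "coder", mcpManager)

    options := chat.CoreChatOptions{
        Logger:       logger,
        ToolRegistry: toolRegistry,
        SystemPrompt: systemPrompt,
    }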
func HandleDirectPrompt ¶
func HandleDirectPrompt(ctx context.Context, prompt string, options PromptOptions) error
HandleDirectPrompt processes a prompt directly without UI and returns the response. It handles the complete lifecycle of a prompt, including potential tool calls and their execution. The function continues processing until no more tool calls are returned or an error occurs.
func ProcessChatStream ¶
func ProcessChatStream(ctx context.Context, session *Manager, options StreamOptions) error
ProcessChatStream handles a streaming chat response, including tool execution. It processes the stream chunk by chunk, accumulating content and tool calls, and executes tools when the stream is complete. The function will recursively call itself to continue the conversation after tool execution.
func SetupPersonaRegistry ¶
func SetupPersonaRegistry(logger *slog.Logger) *persona.Registry
SetupPersonaRegistry creates and configures a persona registry with available personas.
func SetupToolRegistry ¶
func SetupToolRegistry(logger *slog.Logger) *llm.ToolRegistry
SetupToolRegistry creates and configures a tool registry with default tools. It initializes a new tool registry and registers the get_time tool, which allows LLMs to retrieve the current time.
Types ¶
type ConsoleStreamHandler ¶
type ConsoleStreamHandler struct {
    NoopStreamHandler
}
ConsoleStreamHandler handles streaming output to the console.
func (*ConsoleStreamHandler) OnChunk ¶
func (h *ConsoleStreamHandler) OnChunk(chunk *llm.ChatResponseChunk)
func (*ConsoleStreamHandler) OnComplete ¶
func (h *ConsoleStreamHandler) OnComplete(fullContent string, toolCalls []llm.ToolCall, err error)
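A minimal sketch wiring ConsoleStreamHandler into a streaming session (client, logger, and ctx are assumed to exist; the provider and model values are illustrative):

    handler := &chat.ConsoleStreamHandler{}

    options := chat.StreamOptions{
        CoreChatOptions: chat.CoreChatOptions{
            Client:       client,
            Provider:     "openai",
            ModelID:      "gpt-4",
            Logger:       logger,
            ToolRegistry: chat.SetupToolRegistry(logger),
        },
        Handler: handler,
    }

    session := chat.NewManager()
    session.AddMessage(llm.Message{Role: llm.RoleUser, Content: "Hello"})

    if err := chat.ProcessChatStream(ctx, session, options); err != nil {
        logger.Error("stream failed", "error", err)
    }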
type Conversation ¶ added in v0.1.0
Conversation represents a single conversation thread within a chat session.
type CoreChatOptions ¶
type CoreChatOptions struct {
    Client       *llm.Client
    Provider     string
    ModelID      string
    Logger       *slog.Logger
    ToolRegistry *llm.ToolRegistry
    Temperature  float32
    MaxTokens    int
    NoStream     bool   // Whether to disable streaming responses
    SystemPrompt string // System prompt to use for the chat
}
CoreChatOptions contains configuration options shared between UI and non-UI chat functionality. It provides the core settings needed for interacting with LLMs, including client configuration, model selection, logging, tool support, and generation parameters.
type MCPManager ¶
type MCPManager struct {
    // contains filtered or unexported fields
}
MCPManager manages multiple MCP servers.
func NewMCPManager ¶
func NewMCPManager(logger *slog.Logger) *MCPManager
NewMCPManager creates a new MCP manager.
func (*MCPManager) AddServer ¶
func (m *MCPManager) AddServer(ctx context.Context, cfg mcp.ServerConfig) error
AddServer adds and initializes a new MCP server.
func (*MCPManager) Close ¶
func (m *MCPManager) Close() error
Close closes all MCP server connections.
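A minimal sketch of the manager lifecycle (logger and ctx are assumed to exist, and cfg is assumed to be populated from application configuration, since the mcp.ServerConfig fields are not documented here):

    mcpManager := chat.NewMCPManager(logger)
    defer mcpManager.Close()

    var cfg mcp.ServerConfig // assumed to be populated elsewhere
    if err := mcpManager.AddServer(ctx, cfg); err != nil {
        logger.Error("failed to add MCP server", "error", err)
    }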
type MCPServer ¶
type MCPServer struct {
    Client *mcplib.Client
    Tools  []mcplib.ToolRetType
    // contains filtered or unexported fields
}
MCPServer represents a connected MCP server with its tools and the underlying command.
type MCPServerConfig ¶
type MCPServerConfig = mcp.ServerConfig
MCPServerConfig is an alias for mcp.ServerConfig for backward compatibility.
type Manager ¶ added in v0.1.0
type Manager struct {
    Conversations        []Conversation
    ActiveConversationID string
}
Manager represents a chat session with multiple conversations.
func NewManager ¶ added in v0.1.0
func NewManager() *Manager
NewManager creates a new chat session with a default empty conversation.
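A minimal sketch of a multi-conversation session using the methods below:

    session := chat.NewManager()

    // NewManager starts with a default conversation; add and switch to a second one.
    id := session.AddConversation()
    if session.SetActiveConversation(id) {
        session.AddMessage(llm.Message{Role: llm.RoleUser, Content: "Hi"})
    }

    // GetMessages reads from whichever conversation is active.
    for _, msg := range session.GetMessages() {
        fmt.Println(msg.Role, msg.Content)
    }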
func (*Manager) AddConversation ¶ added in v0.1.0
func (s *Manager) AddConversation() string
AddConversation adds a new conversation to the session.
func (*Manager) AddMessage ¶ added in v0.1.0
func (s *Manager) AddMessage(message llm.Message)
AddMessage adds a message to the active conversation.
func (*Manager) AddMessageToConversation ¶ added in v0.1.0
func (s *Manager) AddMessageToConversation(conversationID string, message llm.Message) bool
AddMessageToConversation adds a message to the specified conversation. Returns true if the conversation was found and the message added, false otherwise.
func (*Manager) GetActiveConversation ¶ added in v0.1.0
func (s *Manager) GetActiveConversation() *Conversation
GetActiveConversation returns the currently active conversation.
func (*Manager) GetConversationByID ¶ added in v0.1.0
func (s *Manager) GetConversationByID(id string) *Conversation
GetConversationByID returns a conversation by its ID, or nil if not found.
func (*Manager) GetMessages ¶ added in v0.1.0
func (s *Manager) GetMessages() []llm.Message
GetMessages returns the messages from the active conversation. This maintains backward compatibility with code expecting a Messages field.
func (*Manager) RemoveConversation ¶ added in v0.1.0
func (s *Manager) RemoveConversation(id string) error
RemoveConversation removes a conversation by its ID.
func (*Manager) SetActiveConversation ¶ added in v0.1.0
func (s *Manager) SetActiveConversation(id string) bool
SetActiveConversation sets the active conversation by ID.
func (*Manager) SetMessages ¶ added in v0.1.0
func (s *Manager) SetMessages(messages []llm.Message)
SetMessages sets the messages for the active conversation. This maintains backward compatibility with code expecting a Messages field.
type NoopStreamHandler ¶
type NoopStreamHandler struct{}
NoopStreamHandler is a stream handler implementation that does nothing. It can be used as a default handler when no specific handling is needed.
func (*NoopStreamHandler) OnChunk ¶
func (h *NoopStreamHandler) OnChunk(chunk *llm.ChatResponseChunk)
func (*NoopStreamHandler) OnComplete ¶
func (h *NoopStreamHandler) OnComplete(fullContent string, toolCalls []llm.ToolCall, err error)
func (*NoopStreamHandler) OnToolExecution ¶
func (h *NoopStreamHandler) OnToolExecution(toolCall llm.ToolCall, result string, err error)
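Like ConsoleStreamHandler above, custom handlers can embed NoopStreamHandler and override only the events they care about. A minimal sketch (the fields of llm.ChatResponseChunk are not documented here, so the method body is left schematic):

    type ChunkPrinter struct {
        chat.NoopStreamHandler // supplies no-op OnComplete and OnToolExecution
    }

    func (h *ChunkPrinter) OnChunk(chunk *llm.ChatResponseChunk) {
        // Render or accumulate the chunk content here.
        _ = chunk
    }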
type PromptOptions ¶
type PromptOptions struct {
    CoreChatOptions
    EnableTools        bool
    ToolApprovalPolicy string // Policy for tool call approval: "auto-approve", "always-deny", "smart"
}
PromptOptions contains configuration for handling direct prompts. It extends CoreChatOptions with additional settings specific to direct prompt handling.
type StreamHandler ¶
type StreamHandler interface {
    OnChunk(chunk *llm.ChatResponseChunk)
    OnComplete(fullContent string, toolCalls []llm.ToolCall, err error)
    OnToolExecution(toolCall llm.ToolCall, result string, err error)
}
StreamHandler is an interface for handling streaming chat events. Implementations receive notifications about chunks of content as they arrive, when the stream completes, and when tools are executed.
type StreamOptions ¶
type StreamOptions struct {
    CoreChatOptions
    Handler              StreamHandler
    ToolApprovalPolicy   string                            // Policy for tool call approval: "auto-approve", "always-deny", "smart"
    HumanApprovalHandler toolapproval.HumanApprovalHandler // Optional handler for human approval
}
StreamOptions contains options for streaming chat responses. It extends CoreChatOptions with a handler for processing streaming events.