Documentation ¶
Index ¶
- Variables
- type Blueprint
- type Context
- type Event
- type EventChunk
- type EventComplete
- type EventContent
- type EventError
- type EventFlowEnd
- type EventFlowStart
- type EventNodeEnd
- type EventNodeStart
- type Flow
- type FlowOption
- type GenerateRequest
- type GenerateResponse
- type Hooks
- type Input
- type Interceptor
- type LLM
- type Message
- type MessageRole
- type Node
- type NodeInitializer
- type Option
- type Runner
- type Tool
- type ToolCall
Constants ¶
This section is empty.
Variables ¶
var (
	// ErrMaxRoundsExceeded is returned when a run exceeds the configured
	// maximum number of LLM round-trips without producing a final response.
	ErrMaxRoundsExceeded = errors.New("ork: max rounds exceeded")

	// ErrCycleDetected is returned by NewFlow when the node graph contains a cycle.
	ErrCycleDetected = errors.New("ork: cycle detected in node graph")

	// ErrDuplicateFlow is returned by NewBlueprint when two flows share the same name.
	ErrDuplicateFlow = errors.New("ork: duplicate flow name")

	// ErrUnknownFlow is returned when the LLM invokes a flow not present in the blueprint.
	ErrUnknownFlow = errors.New("ork: unknown flow")
)
Functions ¶
This section is empty.
Types ¶
type Blueprint ¶
type Blueprint struct {
// contains filtered or unexported fields
}
Blueprint groups one or more flows. The LLM uses each flow's name and description to decide which to invoke for a given conversation turn. All flows are compiled and validated at construction time.
func NewBlueprint ¶
NewBlueprint creates a Blueprint from the provided flows. Returns an error if any flow has graph issues (cycles, missing nodes) or if two flows share the same name.
type Context ¶
type Context struct {
// Ctx is the execution context for cancellations and timeouts.
Ctx context.Context
// Args contains trigger arguments provided by the user or the LLM (Read-only).
Args map[string]any
// Vars is the shared blackboard seeded by Input.Vars and mutated by nodes (Read/Write).
Vars map[string]any
// contains filtered or unexported fields
}
Context provides read-write state and actions for a single node execution.
type Event ¶
type Event interface {
// contains filtered or unexported methods
}
Event is the interface implemented by all runtime events emitted by Run and RunFlow. Use a type switch to handle specific event types.
type EventChunk ¶
type EventChunk struct {
Chunk string
}
EventChunk is emitted for each token chunk streamed from the LLM.
type EventComplete ¶
type EventComplete struct {
// Content is the LLM's final text response, if the run ended with
// a plain generation rather than a flow termination.
Content string
// Outcome is set when the run ended via node.End.
Outcome string
// Vars contains the final mutated blackboard state.
Vars map[string]any
}
EventComplete is emitted when the run finishes successfully. Vars is the merged blackboard state produced by all nodes during the run.
type EventContent ¶
type EventContent struct {
Content string
}
EventContent is emitted when a node produces static rendered text, such as a resolved message template.
type EventError ¶
type EventError struct {
Err error
}
EventError is emitted when an unrecoverable error occurs. The channel closes immediately after this event.
type EventFlowEnd ¶
type EventFlowEnd struct {
FlowName string
ToolCallID string // correlates with EventFlowStart
Result map[string]any // the vars produced by the flow
}
EventFlowEnd is emitted when a flow finishes executing successfully, before its result is fed back to the LLM.
type EventFlowStart ¶
type EventFlowStart struct {
FlowName string
ToolCallID string // correlates with EventFlowEnd
Args map[string]any
}
EventFlowStart is emitted when the LLM selects a flow and its execution begins.
type EventNodeEnd ¶
type EventNodeEnd struct {
NodeID string
}
EventNodeEnd is emitted immediately after a node finishes without error.
type EventNodeStart ¶
type EventNodeStart struct {
NodeID string
}
EventNodeStart is emitted immediately before a node begins executing.
type Flow ¶
type Flow struct {
// contains filtered or unexported fields
}
Flow is a compiled sequence of nodes arranged in topological order.
type FlowOption ¶
type FlowOption func(*flowBuilder) error
FlowOption configures a Flow.
func WithDescription ¶
func WithDescription(desc string) FlowOption
WithDescription sets the description of the flow.
func WithSchema ¶
func WithSchema(schema map[string]any) FlowOption
WithSchema sets the schema of the flow.
type GenerateRequest ¶
type GenerateRequest struct {
// Messages is the full conversation window.
Messages []Message
// Tools is the set of flows exposed to the LLM as callable functions.
// Ork constructs this from the blueprint automatically.
Tools []Tool
// OnChunk is called for each streamed token chunk. Nil means no streaming.
// Returning an error cancels the generation.
OnChunk func(chunk string) error
}
GenerateRequest is the input to a single LLM generation call.
type GenerateResponse ¶
type GenerateResponse struct {
// Message is the assistant's response; it is always populated.
// Set Message.ToolCalls when the model invokes one or more flows.
Message Message
}
GenerateResponse is the output of a single LLM generation call.
type Hooks ¶
type Hooks struct {
// BeforeNode fires immediately before a node begins executing.
BeforeNode func(ctx *Context, nodeID string)
// AfterNode fires immediately after a node finishes without error.
AfterNode func(ctx *Context, nodeID string)
// OnError fires when a node returns an error.
OnError func(ctx *Context, nodeID string, err error)
}
Hooks fire around every node execution. They must not block.
type Input ¶
type Input struct {
// Messages is the full conversation history including the current turn.
// Ork treats this as read-only and never appends or modifies the slice.
Messages []Message
// Args provides manual trigger arguments for RunFlow.
// Ignored by Run (the LLM generates them instead).
Args map[string]any
// Vars seeds the flow's blackboard. Use this to pass application state
// like API keys, user IDs, or database state.
Vars map[string]any
}
Input is the data provided by the caller for a single run.
type Interceptor ¶
type Interceptor interface {
// Before is called with the full message window before each LLM call.
// The returned slice replaces the original.
Before(ctx context.Context, messages []Message) ([]Message, error)
// After is called with the LLM response before Ork processes it.
// The returned message replaces the original.
After(ctx context.Context, response Message) (Message, error)
}
Interceptor wraps every LLM generation call. It can be used for PII redaction, content filtering, prompt enrichment, or policy enforcement. Returning an error from either method aborts the run.
type LLM ¶
type LLM interface {
// Generate sends a request to the model and returns its response.
// If req.OnChunk is non-nil, it is called with each token as it streams.
Generate(ctx context.Context, req *GenerateRequest) (*GenerateResponse, error)
}
LLM is the interface Ork requires to interact with a language model. Implement this to integrate any provider.
type Message ¶
type Message struct {
Role MessageRole
Content string
// ToolCalls is set on assistant messages when the LLM invokes flows.
ToolCalls []ToolCall
// ToolCallID links a tool result message to its originating call.
ToolCallID string
}
Message is a single entry in a conversation.
type MessageRole ¶
type MessageRole string
MessageRole is the author of a message.
const (
	MessageUser      MessageRole = "user"
	MessageAssistant MessageRole = "assistant"
	MessageTool      MessageRole = "tool"
	MessageSystem    MessageRole = "system"
)
type Node ¶
type Node interface {
// Execute performs the node's work. It may mutate the context's Vars directly.
Execute(orkCtx *Context) error
}
Node is the interface every executable unit must implement.
type NodeInitializer ¶
type NodeInitializer interface {
// Initialize sets up the node for execution.
Initialize() error
}
NodeInitializer is used to set up a node for execution.
type Option ¶
type Option func(*Runner)
Option configures a Runner.
func WithInterceptor ¶
func WithInterceptor(i Interceptor) Option
WithInterceptor configures a Runner with the provided interceptor.
func WithMaxRounds ¶
WithMaxRounds sets the maximum number of LLM round-trips allowed per run; exceeding it returns ErrMaxRoundsExceeded.
type Runner ¶
type Runner struct {
// contains filtered or unexported fields
}
Runner executes blueprints and flows. It holds execution infrastructure and is safe for concurrent use. Create one per application, not per request.
Source Files ¶
Directories ¶

| Path | Synopsis |
|---|---|
| internal | |
| samples | |
| 01_basics_and_state | command |
| 02_external_api_templating | command |
| 03_llm_tool_routing | command |
| 04_tracing_and_hooks | command |
| 05_pii_redaction_interceptor | command |