Documentation ¶
Overview ¶
Package mock provides a deterministic LLMProvider for local testing. It returns canned responses without making any API calls, which makes it ideal for testing ArkFlow logic (DAG order, conditionals, loops) without burning API credits or requiring an Anthropic API key.
Usage — register and select:
import _ "github.com/arkonis-dev/ark/pkg/agent/providers/mock"
provider, _ := providers.New("mock")
Or construct directly for fine-grained control:
p := &mock.Provider{
	Responses: map[string]string{
		"research":  "Here are the findings...",
		"summarize": "• Point one\n• Point two",
	},
	Default: "mock response",
	Delay:   200 * time.Millisecond,
}
Index ¶
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Provider ¶
type Provider struct {
// Responses maps prompt substrings to canned reply text.
// Matched case-insensitively; first match wins.
Responses map[string]string
// Default is the response returned when no Responses entry matches.
// Defaults to "mock response" when constructed via the registry.
Default string
// Delay simulates LLM latency. Zero means no delay.
Delay time.Duration
}
Provider is a configurable mock LLM backend.
Response matching: for each entry in Responses, if the task prompt contains the key (case-insensitive substring match), that value is returned. Falls back to Default when nothing matches.
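The matching rule described above can be sketched as a small standalone function. This is an illustrative approximation, not the package's actual internals (`respond` is a hypothetical name); note that because Go map iteration order is unspecified, "first match wins" is only deterministic here when a prompt matches at most one key.

```go
package main

import (
	"fmt"
	"strings"
)

// respond mimics the documented rule: for each entry in responses,
// return its value if the prompt contains the key (case-insensitive
// substring match); otherwise fall back to def.
func respond(responses map[string]string, def, prompt string) string {
	lower := strings.ToLower(prompt)
	for key, reply := range responses {
		if strings.Contains(lower, strings.ToLower(key)) {
			return reply
		}
	}
	return def
}

func main() {
	responses := map[string]string{
		"research":  "Here are the findings...",
		"summarize": "• Point one\n• Point two",
	}
	fmt.Println(respond(responses, "mock response", "Please RESEARCH this topic"))
	fmt.Println(respond(responses, "mock response", "an unrelated prompt"))
}
```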
func (*Provider) RunTask ¶
func (p *Provider) RunTask(
	ctx context.Context,
	_ *config.Config,
	task queue.Task,
	_ []mcp.Tool,
	_ func(context.Context, string, json.RawMessage) (string, error),
	chunkFn func(string),
) (string, queue.TokenUsage, error)
RunTask implements providers.LLMProvider. It matches the task prompt against Responses and returns the canned reply. No real LLM calls are made; tools are never invoked. If chunkFn is non-nil, the reply is emitted word-by-word to simulate streaming.