Documentation
Overview
Package optimizer uses an OpenAI-compatible LLM endpoint to merge, deduplicate, and consolidate multiple instruction files into a single coherent output.
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
func ConcatRaw
func ConcatRaw(instructions []ContentInput) string
ConcatRaw is the fallback used when no LLM is available: it simply concatenates the instruction contents without merging or deduplication.
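A minimal sketch of the fallback path. The docs above do not show ContentInput's fields, so the Content and Path fields (and the blank-line separator) are assumptions for illustration only:

```go
package main

import (
	"fmt"
	"strings"
)

// ContentInput stands in for the package's type; the field names
// here are assumptions, not the package's actual definition.
type ContentInput struct {
	Path    string
	Content string
}

// concatRaw mirrors ConcatRaw's documented behavior: plain
// concatenation with no LLM involved. Joining with a blank line
// is an assumed detail.
func concatRaw(instructions []ContentInput) string {
	parts := make([]string, 0, len(instructions))
	for _, in := range instructions {
		parts = append(parts, in.Content)
	}
	return strings.Join(parts, "\n\n")
}

func main() {
	out := concatRaw([]ContentInput{
		{Path: "a.md", Content: "Always write tests."},
		{Path: "b.md", Content: "Prefer small functions."},
	})
	fmt.Println(out)
}
```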
Types
type ContentInput
ContentInput is the input to the optimizer.
type LLMConfig
type LLMConfig struct {
	// Endpoint is the base URL (e.g. "https://api.openai.com/v1").
	Endpoint string `yaml:"endpoint"`
	// Model is the model name (e.g. "gpt-4o-mini").
	Model string `yaml:"model"`
	// APIKey is the bearer token.
	APIKey string `yaml:"-"`
	// Enabled controls whether LLM optimization is on by default.
	Enabled bool `yaml:"enabled"`
}
LLMConfig configures the OpenAI-compatible LLM used to consolidate content.
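A config fragment matching the yaml tags above. The enclosing `optimizer` key is an assumption; note that APIKey is tagged `yaml:"-"` and so is never read from the file:

```yaml
# Hypothetical config fragment; the top-level key is an assumption.
optimizer:
  endpoint: "https://api.openai.com/v1"
  model: "gpt-4o-mini"
  enabled: true
  # APIKey is tagged `yaml:"-"`, so it cannot be set here; it is
  # presumably supplied out of band (e.g. via an environment variable).
```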
type Optimizer
type Optimizer struct {
	// contains filtered or unexported fields
}
Optimizer calls an OpenAI-compatible chat completions endpoint to consolidate content.
func New
New creates an Optimizer. Returns nil if LLM is not configured. An optional *http.Client can be passed to control proxy/TLS behavior; if nil, a default client with a 60 s timeout is used.