Documentation ¶
Overview ¶
Package graymatter provides persistent memory for Go AI agents.
Single static binary. Zero infrastructure. Three functions cover the core loop: New, Remember, and Recall.
mem := graymatter.New(".graymatter")
mem.Remember("agent", "user prefers bullet points")
facts, _ := mem.Recall("agent", "how should I format this?")
// facts is a []string ready to inject into a system prompt
Index ¶
- type Config
- type EmbeddingMode
- type Memory
- func (m *Memory) Close() error
- func (m *Memory) Config() Config
- func (m *Memory) Consolidate(ctx context.Context, agentID string) error
- func (m *Memory) Extract(ctx context.Context, llmResponse string) ([]string, error)
- func (m *Memory) Recall(agentID, query string) ([]string, error)
- func (m *Memory) RecallAll(agentID, query string) ([]string, error)
- func (m *Memory) RecallAllCtx(ctx context.Context, agentID, query string) ([]string, error)
- func (m *Memory) RecallCtx(ctx context.Context, agentID, query string) ([]string, error)
- func (m *Memory) RecallShared(query string) ([]string, error)
- func (m *Memory) RecallSharedCtx(ctx context.Context, query string) ([]string, error)
- func (m *Memory) Remember(agentID, text string) error
- func (m *Memory) RememberCtx(ctx context.Context, agentID, text string) error
- func (m *Memory) RememberExtracted(ctx context.Context, agentID, llmResponse string) error
- func (m *Memory) RememberShared(text string) error
- func (m *Memory) RememberSharedCtx(ctx context.Context, text string) error
- func (m *Memory) Store() *memory.Store
Constants ¶
This section is empty.
Variables ¶
This section is empty.
Functions ¶
This section is empty.
Types ¶
type Config ¶
type Config struct {
// DataDir is the directory where gray.db and vector files are stored.
// Default: ".graymatter"
DataDir string
// TopK is the maximum number of facts returned by Recall.
// Default: 8
TopK int
// EmbeddingMode controls which embedding backend is used.
// Default: EmbeddingAuto (Ollama → OpenAI → Anthropic → keyword)
EmbeddingMode EmbeddingMode
// OllamaURL is the base URL of the Ollama API.
// Default: value of GRAYMATTER_OLLAMA_URL env var, or "http://localhost:11434"
OllamaURL string
// OllamaModel is the embedding model used with Ollama.
// Default: value of GRAYMATTER_OLLAMA_MODEL env var, or "nomic-embed-text"
OllamaModel string
// AnthropicAPIKey for the Anthropic embeddings and consolidation endpoints.
// Default: value of ANTHROPIC_API_KEY env var.
AnthropicAPIKey string
// OpenAIAPIKey for the OpenAI Embeddings API (text-embedding-3-small).
// Default: value of OPENAI_API_KEY env var.
OpenAIAPIKey string
// OpenAIModel overrides the OpenAI embedding model.
// Default: value of GRAYMATTER_OPENAI_MODEL env var, or "text-embedding-3-small"
OpenAIModel string
// ConsolidateLLM specifies which LLM provider drives memory consolidation.
// Values: "anthropic", "ollama", "" (disable consolidation).
// Default: "anthropic" if ANTHROPIC_API_KEY is set, else "" (disabled).
// To use Ollama as the consolidation LLM, set this field explicitly to "ollama".
ConsolidateLLM string
// ConsolidateModel is the model used for consolidation summarisation.
// Default: "claude-haiku-4-5-20251001"
ConsolidateModel string
// ConsolidateThreshold is the minimum fact count that triggers consolidation.
// Default: 100
ConsolidateThreshold int
// DecayHalfLife is the half-life for the exponential weight decay curve.
// Facts not accessed within this window lose half their retrieval weight.
// Default: 720h (30 days)
DecayHalfLife time.Duration
// AsyncConsolidate runs consolidation in a background goroutine after Remember.
// Default: true
AsyncConsolidate bool
// MaxAsyncConsolidations bounds how many consolidation goroutines may run
// concurrently. Additional triggers while at capacity are silently dropped.
// Default: 2
MaxAsyncConsolidations int
// OnConsolidateError is called when an async consolidation goroutine returns
// an error. If nil, errors are discarded. The callback must be safe for
// concurrent use.
OnConsolidateError func(agentID string, err error)
}
Config holds all GrayMatter configuration. Every field has a sane default, applied by DefaultConfig(). A zero-value Config is not valid; always start from DefaultConfig() and override fields as needed.
func DefaultConfig ¶
func DefaultConfig() Config
DefaultConfig returns a Config with all defaults applied from environment variables and runtime probes. Safe to call multiple times.
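A common pattern is to start from DefaultConfig and override individual fields before calling NewWithConfig. A sketch (the override values are illustrative, not recommendations):

```go
cfg := graymatter.DefaultConfig()
cfg.TopK = 12                           // recall a few more facts per query
cfg.DecayHalfLife = 14 * 24 * time.Hour // forget faster than the 30-day default
cfg.OnConsolidateError = func(agentID string, err error) {
	log.Printf("graymatter: consolidation for %s failed: %v", agentID, err)
}
mem, err := graymatter.NewWithConfig(cfg)
```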
func (Config) GetAnthropicAPIKey ¶
func (Config) GetConsolidateLLM ¶
func (Config) GetConsolidateModel ¶
func (Config) GetConsolidateThreshold ¶
func (Config) GetDecayHalfLife ¶
type EmbeddingMode ¶
type EmbeddingMode int
EmbeddingMode controls how GrayMatter generates vector embeddings.
const (
	// EmbeddingAuto detects the best available provider at runtime.
	// Detection order: Ollama → OpenAI → Anthropic → keyword-only.
	EmbeddingAuto EmbeddingMode = iota
	// EmbeddingOllama forces Ollama (requires a running Ollama instance).
	EmbeddingOllama
	// EmbeddingAnthropic forces Anthropic API (requires ANTHROPIC_API_KEY).
	EmbeddingAnthropic
	// EmbeddingKeyword disables vector search; uses keyword+recency scoring only.
	EmbeddingKeyword
	// EmbeddingOpenAI forces OpenAI Embeddings API (requires OPENAI_API_KEY).
	EmbeddingOpenAI
)
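For instance, an offline or air-gapped deployment might pin the mode rather than rely on auto-detection. A sketch using Config and NewWithConfig from this page:

```go
cfg := graymatter.DefaultConfig()
cfg.EmbeddingMode = graymatter.EmbeddingKeyword // no network calls; keyword+recency scoring only
mem, err := graymatter.NewWithConfig(cfg)
```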
type Memory ¶
type Memory struct {
// contains filtered or unexported fields
}
Memory is the primary handle for GrayMatter operations. It is safe for concurrent use.
func New ¶
New creates a Memory with default configuration rooted at dataDir. If initialisation fails, it logs the error to stderr and returns a no-op Memory that never panics (callers need not check for nil).
func NewWithConfig ¶
NewWithConfig creates a Memory with explicit configuration. Returns an error if the data directory cannot be created or the database cannot be opened.
func (*Memory) Close ¶
Close flushes pending writes and closes the underlying database. Always call Close when done; failing to do so may leave gray.db locked.
func (*Memory) Consolidate ¶
Consolidate summarises and compacts memories for agentID. It calls the configured LLM to produce summary facts, applies the exponential decay curve, and prunes dead facts.
Consolidate is triggered automatically in a background goroutine after Remember when Config.AsyncConsolidate is true. Call it directly for synchronous control.
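For example, an application that wants deterministic timing might disable background consolidation and run it at session end. A sketch (the agent ID and error handling are illustrative):

```go
cfg := graymatter.DefaultConfig()
cfg.AsyncConsolidate = false // no background consolidation goroutines
mem, err := graymatter.NewWithConfig(cfg)
if err != nil {
	log.Fatal(err)
}
defer mem.Close()

// ... session runs; Remember is called as facts arrive ...

// Compact this agent's memory at a point we control.
if err := mem.Consolidate(context.Background(), "sales-closer"); err != nil {
	log.Printf("consolidation failed: %v", err)
}
```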
func (*Memory) Extract ¶
Extract calls the configured LLM and returns atomic facts distilled from llmResponse. Each returned string is a self-contained declarative sentence suitable for passing directly to Remember.
Requires an Anthropic API key. Without one, Extract returns the raw response as a single-element slice so the caller always receives a usable result.
facts, _ := mem.Extract(ctx, assistantReply)
for _, f := range facts {
mem.Remember("agent", f)
}
func (*Memory) Recall ¶
Recall returns the top-k most relevant facts for agentID given query. The returned []string is ready to be joined and injected into a system prompt.
facts, _ := mem.Recall("sales-closer", "follow up Maria")
systemPrompt += "\n\n## Memory\n" + strings.Join(facts, "\n")
func (*Memory) RecallAll ¶
RecallAll merges agent-scoped and shared memory results for agentID, deduplicates, and returns at most TopK combined facts.
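The merge-and-deduplicate behaviour can be pictured with the following self-contained sketch. It assumes both inputs arrive pre-sorted by relevance and that dedup is order-preserving; the real ranking logic is internal to GrayMatter:

```go
package main

import "fmt"

// mergeDedup illustrates the documented RecallAll behaviour: agent-scoped
// results are merged with shared results, exact duplicates are dropped,
// and the combined list is capped at topK.
func mergeDedup(agentFacts, sharedFacts []string, topK int) []string {
	seen := make(map[string]bool)
	var out []string
	for _, f := range append(agentFacts, sharedFacts...) {
		if len(out) == topK {
			break
		}
		if seen[f] {
			continue
		}
		seen[f] = true
		out = append(out, f)
	}
	return out
}

func main() {
	agent := []string{"Maria prefers email", "deal closes Friday"}
	shared := []string{"deal closes Friday", "office closed Monday"}
	fmt.Println(mergeDedup(agent, shared, 3))
}
```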
func (*Memory) RecallAllCtx ¶ added in v0.2.0
RecallAllCtx is the context-aware variant of RecallAll.
func (*Memory) RecallCtx ¶ added in v0.2.0
RecallCtx is the context-aware variant of Recall. Use this when you need timeout control or tracing propagation.
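For example, to bound a recall by a deadline (a fragment in the style of the examples above; the agent ID and query are illustrative):

```go
rctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
defer cancel()
facts, err := mem.RecallCtx(rctx, "sales-closer", "follow up Maria")
```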
func (*Memory) RecallShared ¶
RecallShared returns the top-k most relevant shared facts for query.
func (*Memory) RecallSharedCtx ¶ added in v0.2.0
RecallSharedCtx is the context-aware variant of RecallShared.
func (*Memory) Remember ¶
Remember stores an observation associated with agentID. It is safe to call Remember concurrently from multiple goroutines.
mem.Remember("sales-closer", "Maria didn't reply Wednesday. Third touchpoint due Friday.")
func (*Memory) RememberCtx ¶ added in v0.2.0
RememberCtx is the context-aware variant of Remember. Use this when you need timeout control or tracing propagation.
func (*Memory) RememberExtracted ¶
RememberExtracted combines Extract and Remember in a single call: it extracts atomic facts from llmResponse and stores each one for agentID. This is the idiomatic replacement for the extractKeyFacts() pattern shown in the README.
mem.RememberExtracted(ctx, "sales-closer", assistantReply)
func (*Memory) RememberShared ¶
RememberShared stores an observation in the shared memory namespace, readable by all agents via RecallShared and RecallAll.
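A sketch of the shared-namespace round trip (the fact text and query are illustrative):

```go
// Any agent can read this via RecallShared or RecallAll.
mem.RememberShared("the production deploy window is Tuesdays 09:00-11:00 UTC")
facts, _ := mem.RecallShared("when can we deploy?")
```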
func (*Memory) RememberSharedCtx ¶ added in v0.2.0
RememberSharedCtx is the context-aware variant of RememberShared.
Directories ¶

| Path | Synopsis |
|---|---|
| benchmarks | |
| benchmarks/token_count (command) | Command token_count benchmarks GrayMatter's token efficiency versus full-history injection. |
| cmd | |
| cmd/graymatter (module) | |
| examples | |
| examples/agent (command) | Package main shows the canonical GrayMatter integration pattern for a skill-based agent that calls the Anthropic Messages API. |
| examples/plugin-hello (command) | Command plugin-hello is a reference GrayMatter plugin that implements the hello_greet MCP tool. |
| examples/standalone (command) | Package main demonstrates GrayMatter with a bare Anthropic Messages API call. |
| pkg | |
| pkg/embedding | Package embedding provides pluggable vector embedding backends for GrayMatter. |