memo

package
v0.42.0
Published: Mar 17, 2026 License: MIT Imports: 2 Imported by: 0

README

memo

Memoize function results with concurrent-safe caches.

Use memo when you want to cache successful results by input. For lazy sequences with shared evaluation, use stream. For eager collection transforms, use slice.

// Before: manual cache with lock choreography
var (
    mu    sync.RWMutex
    cache = make(map[string]Result)
)
func lookup(key string) Result {
    mu.RLock()
    if v, ok := cache[key]; ok {
        mu.RUnlock()
        return v
    }
    mu.RUnlock()
    v := expensiveLookup(key)
    mu.Lock()
    cache[key] = v
    mu.Unlock()
    return v
}

// After: one line — concurrent-safe, but see Concurrency note below
lookup := memo.Fn(expensiveLookup)

What It Looks Like

// Lazy initialization — computed once, cached permanently
getConfig := memo.Of(loadConfig)
cfg := getConfig()  // computes
cfg = getConfig()   // returns cached result

// Fallible function — errors retry, successes cached
fetch := memo.FnErr(callAPI)
v, err := fetch("key")  // calls API
v, err = fetch("key")   // cached (if first call succeeded)

// Bounded cache — LRU eviction at capacity
lookup := memo.FnWith(expensiveLookup, memo.NewLRU[string, Result](1000))

Concurrency Semantics

Of guarantees single evaluation. Concurrent callers wait for the in-flight evaluation — only one goroutine computes the result. This is true single-flight behavior.

Keyed wrappers (Fn, FnWith, FnErr, FnErrWith) do not coalesce concurrent misses. The cache is concurrent-safe, but multiple goroutines requesting the same uncached key may all compute it simultaneously. If deduplication matters (expensive API calls, stampede-prone workloads), use golang.org/x/sync/singleflight or add external coordination.

Retry on Panic, Retry on Error

Of retries panicked functions. Unlike sync.Once, which poisons permanently on panic, Of resets to pending — subsequent calls retry the function. If initialization panics due to a transient condition (network, file not ready), the next call gets another chance.

FnErr retries errors. Only successful results are cached. If the underlying function returns an error, subsequent calls retry rather than returning the cached error. This matches the assumption that errors are transient — if you need to cache errors, use Fn with rslt.Result[V] as the value type.

Of: successes are permanent. Once Of's function returns successfully, the result is cached and the original function is released for GC. Keyed wrappers retain fn for future uncached keys.

Operations

Memoize

  • Of[T any](fn func() T) func() T — zero-arg; single-flight; retry-on-panic
  • Fn[K comparable, V any](fn func(K) V) func(K) V — keyed; unbounded cache
  • FnErr[K comparable, V any](fn func(K) (V, error)) func(K) (V, error) — keyed; caches successes only
  • FnWith[K comparable, V any](fn func(K) V, cache Cache[K, V]) func(K) V — keyed; custom cache
  • FnErrWith[K comparable, V any](fn func(K) (V, error), cache Cache[K, V]) func(K) (V, error) — keyed; custom cache, caches successes only

Caches

  • Cache[K comparable, V any] — interface: Load(K) (V, bool), Store(K, V)
  • NewMap[K comparable, V any]() Cache[K, V] — unbounded, concurrent-safe (sync.RWMutex)
  • NewLRU[K comparable, V any](capacity int) Cache[K, V] — LRU eviction, concurrent-safe (sync.Mutex)

All functions panic on nil inputs (fn, cache). NewLRU panics on non-positive capacity. K must be comparable (map key constraint).

See pkg.go.dev for complete API documentation and the main README for installation.

Documentation

Overview

Package memo provides memoization primitives: lazy zero-arg evaluation, keyed function caching, and pluggable cache strategies. All primitives are concurrent-safe. Of uses retry-on-panic semantics (matching stream's lazy evaluation). FnErr caches successes only — errors trigger retry on subsequent calls.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

func Fn

func Fn[K comparable, V any](fn func(K) V) func(K) V

Fn wraps a single-arg function with an unbounded map cache. Cache-backed, not single-flight: concurrent misses for the same key may compute the value multiple times; the last store wins. Thread-safe. Panics if fn is nil.

func FnErr

func FnErr[K comparable, V any](fn func(K) (V, error)) func(K) (V, error)

FnErr wraps a fallible single-arg function with an unbounded map cache. Only successful results are cached — errors trigger retry on subsequent calls. Cache-backed, not single-flight: concurrent misses for the same key may compute the value multiple times; the last store wins. Thread-safe. Panics if fn is nil.

func FnErrWith

func FnErrWith[K comparable, V any](fn func(K) (V, error), cache Cache[K, V]) func(K) (V, error)

FnErrWith wraps a fallible single-arg function with a caller-provided cache. Only successful results are cached — errors trigger retry on subsequent calls. Cache-backed, not single-flight: concurrent misses for the same key may compute the value multiple times; the last store wins. Thread-safe (cache must handle its own synchronization). Panics if fn or cache is nil.

func FnWith

func FnWith[K comparable, V any](fn func(K) V, cache Cache[K, V]) func(K) V

FnWith wraps a single-arg function with a caller-provided cache. Cache-backed, not single-flight: concurrent misses for the same key may compute the value multiple times; the last store wins. Thread-safe (cache must handle its own synchronization). Panics if fn or cache is nil.

func Of

func Of[T any](fn func() T) func() T

Of wraps a zero-arg function so it executes at most once on success. The result is cached and returned on subsequent calls. Thread-safe.

If fn panics, the cell resets to pending — subsequent calls retry the function. This differs from sync.Once, which poisons permanently on panic.

Reentrancy constraint: fn must not call the returned memoized function, directly or indirectly — this would deadlock on the internal mutex.

Panics if fn is nil.

Types

type Cache

type Cache[K comparable, V any] interface {
	Load(key K) (V, bool)
	Store(key K, value V)
}

Cache is a thread-safe key-value store for memoized results. Implementations must handle their own synchronization.

func NewLRU

func NewLRU[K comparable, V any](capacity int) Cache[K, V]

NewLRU returns a concurrent-safe LRU cache that evicts the least recently used entry when capacity is exceeded. Panics if capacity <= 0.

func NewMap

func NewMap[K comparable, V any]() Cache[K, V]

NewMap returns an unbounded, concurrent-safe cache.
