package cache

v1.5.3 (not the latest version of this module)
Published: Feb 10, 2026 License: Apache-2.0 Imports: 10 Imported by: 0

README
Lynx Cache Package

High-performance in-memory caching solution based on Ristretto, providing thread-safe caching with automatic memory management and TTL support.

Features

  • High Performance: Built on Ristretto, one of the fastest caching libraries in Go
  • Thread-Safe: All operations are concurrent-safe
  • TTL Support: Set expiration time for cached items
  • Memory Management: Automatic eviction based on cost and frequency
  • Multiple Cache Instances: Manage multiple named caches through Manager
  • Fluent Builder API: Easy cache configuration with Builder pattern
  • Metrics Support: Built-in metrics collection for monitoring
  • Batch Operations: Support for bulk get/set/delete operations

Installation

go get github.com/go-lynx/lynx

Quick Start

Basic Usage
package main

import (
    "fmt"
    "time"
    "github.com/go-lynx/lynx/cache"
)

func main() {
    // Create a simple cache
    c, err := cache.QuickCache("my-cache")
    if err != nil {
        panic(err)
    }
    defer c.Close()

    // Set a value with TTL
    err = c.Set("key1", "value1", 5*time.Minute)
    if err != nil {
        panic(err)
    }

    // Get a value
    value, err := c.Get("key1")
    if err != nil {
        fmt.Println("Key not found")
    } else {
        fmt.Println("Value:", value)
    }

    // Check if key exists
    if c.Has("key1") {
        fmt.Println("Key exists")
    }

    // Delete a key
    c.Delete("key1")
}
Using Builder Pattern
// Create a custom cache with the builder.
// Note: the variable is named c to avoid shadowing the cache package.
c, err := cache.NewBuilder("api-cache").
    WithMaxItems(10000).           // Max 10,000 items
    WithMaxMemory(1 << 28).        // Max 256MB memory
    WithMetrics(true).             // Enable metrics
    WithEvictionCallback(func(item *ristretto.Item) {
        fmt.Printf("Evicted: %v\n", item.Key)
    }).
    Build()
if err != nil {
    panic(err)
}
defer c.Close()
Using Cache Manager
// Create caches through manager
userCache, err := cache.Create("users", cache.DefaultOptions())
if err != nil {
    panic(err)
}

// Get existing cache
if c, exists := cache.Get("users"); exists {
    c.Set("user:1", userData, 10*time.Minute)
}

// Get or create cache
sessionCache, err := cache.GetOrCreate("sessions", &cache.Options{
    NumCounters: 1e5,     // 100K sessions
    MaxCost:     1 << 26, // 64MB
    BufferItems: 64,
})

// List all caches
cacheNames := cache.List()
fmt.Println("Active caches:", cacheNames)

// Get statistics
stats := cache.Stats()
for name, metrics := range stats {
    if metrics != nil {
        fmt.Printf("Cache %s - Hits: %d, Misses: %d\n",
            name, metrics.Hits(), metrics.Misses())
    }
}

Advanced Features

GetOrSet Pattern
// Lazy loading with GetOrSet (c is a *cache.Cache instance)
value, err := c.GetOrSet("expensive-key", func() (interface{}, error) {
    // This function is only called if the key doesn't exist
    return expensiveOperation(), nil
}, 1*time.Hour)

// With context support
value, err = c.GetOrSetContext(ctx, "api-key",
    func(ctx context.Context) (interface{}, error) {
        return fetchFromAPI(ctx)
    }, 5*time.Minute)
Batch Operations
// Set multiple values (c is a *cache.Cache instance)
items := map[interface{}]interface{}{
    "key1": "value1",
    "key2": "value2",
    "key3": "value3",
}
err := c.SetMulti(items, 10*time.Minute)

// Get multiple values
keys := []interface{}{"key1", "key2", "key3"}
values := c.GetMulti(keys)
for k, v := range values {
    fmt.Printf("%v: %v\n", k, v)
}

// Delete multiple keys
c.DeleteMulti(keys)
Custom Cost Function
c, err := cache.NewBuilder("object-cache").
    WithMaxCost(1 << 30). // 1GB total cost
    WithCostFunction(func(value interface{}) int64 {
        // Calculate actual memory usage
        switch v := value.(type) {
        case string:
            return int64(len(v))
        case []byte:
            return int64(len(v))
        case User:
            return int64(unsafe.Sizeof(v)) + int64(len(v.Name))
        default:
            return 1
        }
    }).
    Build()

Preset Configurations

The package provides several preset cache configurations:

Small Cache
c, err := cache.SmallCache("small")
// 10K items, 16MB memory
Medium Cache (Default)
c, err := cache.QuickCache("medium")
// 1M items, 256MB memory
Large Cache
c, err := cache.LargeCache("large")
// 100M items, 4GB memory, metrics enabled
Session Cache
c, err := cache.SessionCacheBuilder("sessions", 30*time.Minute).
    BuildAndRegister()
// Optimized for session storage with TTL
API Cache
c, err := cache.APICacheBuilder("api").
    BuildAndRegister()
// Optimized for API response caching

Best Practices

1. Choose Appropriate Size
  • Set NumCounters to 10x your expected unique items
  • Set MaxCost based on available memory
2. Use Cost Functions
  • Implement custom cost functions for complex objects
  • Accurately calculate memory usage for better eviction
3. Handle Cache Misses
value, err := c.Get("key") // c is a *cache.Cache instance
if errors.Is(err, cache.ErrCacheMiss) {
    // Handle cache miss
    value = loadFromDatabase()
    c.Set("key", value, 10*time.Minute)
}
4. Monitor Metrics
metrics := c.Metrics()
hitRatio := float64(metrics.Hits()) / float64(metrics.Hits()+metrics.Misses())
fmt.Printf("Hit ratio: %.2f%%\n", hitRatio*100)
5. Graceful Shutdown
defer c.Close()                    // Close an individual cache
defer cache.DefaultManager.Close() // Close all managed caches

Performance Tips

  1. Buffer Size: Increase BufferItems for high-concurrency scenarios
  2. Metrics: Disable metrics in production if not needed
  3. TTL: Use appropriate TTL to balance memory usage and cache effectiveness
  4. Batch Operations: Use batch operations for multiple keys to reduce overhead

Error Handling

// Common errors
var (
    ErrCacheMiss  = errors.New("cache: key not found")
    ErrCacheSet   = errors.New("cache: failed to set value")
    ErrInvalidTTL = errors.New("cache: invalid TTL")
)

// Handle errors appropriately
// Handle errors appropriately (c is a *cache.Cache instance)
if err := c.Set("key", "value", -1*time.Second); err != nil {
    if errors.Is(err, cache.ErrInvalidTTL) {
        // Handle invalid TTL
    }
}

Thread Safety

All cache operations are thread-safe and can be called concurrently:

var wg sync.WaitGroup
for i := 0; i < 100; i++ {
    wg.Add(1)
    go func(id int) {
        defer wg.Done()
        c.Set(fmt.Sprintf("key%d", id), id, 5*time.Minute)
    }(i)
}
wg.Wait()

License

This package is part of the Lynx framework and is licensed under the Apache License 2.0.

Documentation

Index

Examples

Constants

This section is empty.

Variables

var (
	// ErrCacheMiss indicates that a key was not found in the cache
	ErrCacheMiss = errors.New("cache: key not found")
	// ErrCacheSet indicates that a value could not be set in the cache
	ErrCacheSet = errors.New("cache: failed to set value")
	// ErrInvalidTTL indicates that an invalid TTL was provided
	ErrInvalidTTL = errors.New("cache: invalid TTL")
)

Functions

func Clear

func Clear(name string) error

Clear clears a cache in the default manager

func ClearAll

func ClearAll()

ClearAll clears all caches in the default manager

func Close

func Close()

Close closes all caches in the default manager

func Delete

func Delete(name string) error

Delete removes a cache from the default manager

func GetOptimizerMetrics

func GetOptimizerMetrics() resource.CacheOptimizerMetrics

GetOptimizerMetrics gets optimizer metrics from the default manager

func List

func List() []string

List returns all cache names in the default manager

func Stats

func Stats() map[string]*ristretto.Metrics

Stats returns statistics for all caches in the default manager

Types

type APICache

type APICache struct {
	// contains filtered or unexported fields
}

APICache demonstrates caching for API responses

func NewAPICache

func NewAPICache() (*APICache, error)

NewAPICache creates a new API cache

func (*APICache) CacheResponse

func (a *APICache) CacheResponse(endpoint string, response interface{}, ttl time.Duration) error

CacheResponse caches an API response

func (*APICache) GetCachedResponse

func (a *APICache) GetCachedResponse(endpoint string, result interface{}) error

GetCachedResponse retrieves a cached API response
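A minimal usage sketch combining the two methods above. The endpoint, payload, and result type are hypothetical, and it is assumed that GetCachedResponse copies the cached payload into result:

```go
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	a, err := cache.NewAPICache()
	if err != nil {
		log.Fatal(err)
	}

	// Cache a (hypothetical) response for one minute.
	resp := map[string]string{"status": "ok"}
	if err := a.CacheResponse("/api/health", resp, 1*time.Minute); err != nil {
		log.Fatal(err)
	}

	// Later: read it back into result.
	var result map[string]string
	if err := a.GetCachedResponse("/api/health", &result); err != nil {
		log.Fatal(err)
	}
	fmt.Println(result["status"])
}
```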

type Builder

type Builder struct {
	// contains filtered or unexported fields
}

Builder provides a fluent interface for building cache instances

Example
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/dgraph-io/ristretto"
	"github.com/go-lynx/lynx/cache"
)

func main() {
	// Build a custom cache with specific settings
	c, err := cache.NewBuilder("custom").
		WithMaxItems(1000).     // Maximum 1000 items
		WithMaxMemory(1 << 26). // 64MB memory limit
		WithMetrics(true).      // Enable metrics
		WithEvictionCallback(func(item *ristretto.Item) {
			fmt.Printf("Evicted key: %v\n", item.Key)
		}).
		Build()

	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	// Use the cache
	c.Set("test", "value", 1*time.Minute)
}

func APICacheBuilder

func APICacheBuilder(name string) *Builder

APICacheBuilder creates a builder for API response caches

func LargeCacheBuilder

func LargeCacheBuilder(name string) *Builder

LargeCacheBuilder creates a builder for large caches

func MediumCacheBuilder

func MediumCacheBuilder(name string) *Builder

MediumCacheBuilder creates a builder for medium caches

func NewBuilder

func NewBuilder(name string) *Builder

NewBuilder creates a new cache builder

func ObjectCacheBuilder

func ObjectCacheBuilder(name string) *Builder

ObjectCacheBuilder creates a builder for object caches with custom cost calculation

func SessionCacheBuilder

func SessionCacheBuilder(name string, sessionTTL time.Duration) *Builder

SessionCacheBuilder creates a builder for session caches

func SmallCacheBuilder

func SmallCacheBuilder(name string) *Builder

SmallCacheBuilder creates a builder for small caches

func (*Builder) Build

func (b *Builder) Build() (*Cache, error)

Build creates the cache instance

func (*Builder) BuildAndRegister

func (b *Builder) BuildAndRegister() (*Cache, error)

BuildAndRegister creates the cache and registers it with the default manager

func (*Builder) WithBufferItems

func (b *Builder) WithBufferItems(items int64) *Builder

WithBufferItems sets the buffer items for the cache

func (*Builder) WithCostFunction

func (b *Builder) WithCostFunction(fn func(value interface{}) int64) *Builder

WithCostFunction sets a custom cost calculation function

func (*Builder) WithEvictionCallback

func (b *Builder) WithEvictionCallback(fn func(item *ristretto.Item)) *Builder

WithEvictionCallback sets the eviction callback

func (*Builder) WithExitCallback

func (b *Builder) WithExitCallback(fn func(interface{})) *Builder

WithExitCallback sets the exit callback

func (*Builder) WithHashFunction

func (b *Builder) WithHashFunction(fn func(key interface{}) (uint64, uint64)) *Builder

WithHashFunction sets a custom hash function

func (*Builder) WithMaxCost

func (b *Builder) WithMaxCost(cost int64) *Builder

WithMaxCost sets the maximum cost for the cache

func (*Builder) WithMaxItems

func (b *Builder) WithMaxItems(items int64) *Builder

WithMaxItems sets the maximum number of items (approximation)

func (*Builder) WithMaxMemory

func (b *Builder) WithMaxMemory(bytes int64) *Builder

WithMaxMemory sets the maximum memory usage in bytes

func (*Builder) WithMetrics

func (b *Builder) WithMetrics(enabled bool) *Builder

WithMetrics enables or disables metrics collection

func (*Builder) WithNumCounters

func (b *Builder) WithNumCounters(num int64) *Builder

WithNumCounters sets the number of counters for the cache

func (*Builder) WithRejectionCallback

func (b *Builder) WithRejectionCallback(fn func(item *ristretto.Item)) *Builder

WithRejectionCallback sets the rejection callback

type Cache

type Cache struct {
	// contains filtered or unexported fields
}

Cache represents a thread-safe in-memory cache with TTL support

Example (Basic)
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	// Create a simple cache
	c, err := cache.QuickCache("example")
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	// Set a value with 5 minute TTL
	err = c.Set("user:123", "John Doe", 5*time.Minute)
	if err != nil {
		log.Fatal(err)
	}

	// Get the value
	value, err := c.Get("user:123")
	if err != nil {
		fmt.Println("Not found")
	} else {
		fmt.Println(value)
	}

}
Output:

John Doe
Example (Batch)
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	c, err := cache.QuickCache("batch-example")
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	// Set multiple values at once
	items := map[interface{}]interface{}{
		"key1": "value1",
		"key2": "value2",
		"key3": "value3",
	}
	err = c.SetMulti(items, 5*time.Minute)
	if err != nil {
		log.Fatal(err)
	}

	// Get multiple values at once
	keys := []interface{}{"key1", "key2", "key3", "key4"}
	values := c.GetMulti(keys)

	for _, key := range keys {
		if val, ok := values[key]; ok {
			fmt.Printf("%v: %v\n", key, val)
		} else {
			fmt.Printf("%v: not found\n", key)
		}
	}

}
Output:

key1: value1
key2: value2
key3: value3
key4: not found
Example (Concurrent)
package main

import (
	"fmt"
	"log"
	"sync"
	"time"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	c, err := cache.QuickCache("concurrent")
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	var wg sync.WaitGroup

	// Concurrent writes
	for i := 0; i < 100; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			key := fmt.Sprintf("key:%d", id)
			value := fmt.Sprintf("value:%d", id)
			c.Set(key, value, 5*time.Minute)
		}(i)
	}

	// Concurrent reads
	for i := 0; i < 100; i++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			key := fmt.Sprintf("key:%d", id)
			c.Get(key)
		}(i)
	}

	wg.Wait()
	fmt.Println("Concurrent operations completed")

}
Output:

Concurrent operations completed

func Create

func Create(name string, opts *Options) (*Cache, error)

Create creates a new cache in the default manager

func Get

func Get(name string) (*Cache, bool)

Get retrieves a cache from the default manager

func GetOrCreate

func GetOrCreate(name string, opts *Options) (*Cache, error)

GetOrCreate retrieves or creates a cache in the default manager

func LargeCache

func LargeCache(name string) (*Cache, error)

LargeCache creates a large cache suitable for big data

func New

func New(name string, opts *Options) (*Cache, error)

New creates a new cache instance with the given options

func QuickCache

func QuickCache(name string) (*Cache, error)

QuickCache creates a simple cache with default settings

func SmallCache

func SmallCache(name string) (*Cache, error)

SmallCache creates a small cache suitable for limited data

func TTLCache

func TTLCache(name string, defaultTTL time.Duration) (*Cache, error)

TTLCache creates a cache optimized for TTL-based eviction
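A minimal sketch of TTLCache based on the signature above; the cache name and TTL values are hypothetical, and per-item TTLs are still passed on Set:

```go
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	// A default TTL of 15 minutes is a hypothetical choice.
	c, err := cache.TTLCache("short-lived", 15*time.Minute)
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	// Individual items can use their own, shorter TTL.
	if err := c.Set("token", "abc123", 5*time.Minute); err != nil {
		log.Fatal(err)
	}
	fmt.Println(c.Has("token"))
}
```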

func (*Cache) Clear

func (c *Cache) Clear()

Clear removes all items from the cache

func (*Cache) Close

func (c *Cache) Close()

Close gracefully shuts down the cache

func (*Cache) Delete

func (c *Cache) Delete(key interface{})

Delete removes a key from the cache

func (*Cache) DeleteMulti

func (c *Cache) DeleteMulti(keys []interface{})

DeleteMulti removes multiple keys from the cache

func (*Cache) Get

func (c *Cache) Get(key interface{}) (interface{}, error)

Get retrieves a value from the cache by key

func (*Cache) GetMulti

func (c *Cache) GetMulti(keys []interface{}) map[interface{}]interface{}

GetMulti retrieves multiple values from the cache

func (*Cache) GetOrSet

func (c *Cache) GetOrSet(key interface{}, fn func() (interface{}, error), ttl time.Duration) (interface{}, error)

GetOrSet retrieves a value from the cache or sets it if not found

Example
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	c, err := cache.QuickCache("lazy-load")
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	// This function is only called if the key doesn't exist
	value, err := c.GetOrSet("expensive-data", func() (interface{}, error) {
		// Simulate expensive operation
		time.Sleep(100 * time.Millisecond)
		return "computed value", nil
	}, 10*time.Minute)

	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(value)

	// Second call will get from cache (no sleep)
	value, _ = c.GetOrSet("expensive-data", func() (interface{}, error) {
		return "this won't be called", nil
	}, 10*time.Minute)
	fmt.Println(value)

}
Output:

computed value
computed value

func (*Cache) GetOrSetContext

func (c *Cache) GetOrSetContext(ctx context.Context, key interface{}, fn func(context.Context) (interface{}, error), ttl time.Duration) (interface{}, error)

GetOrSetContext is like GetOrSet but with context support

Example
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	c, err := cache.QuickCache("context-example")
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()

	// Fetch with context support
	value, err := c.GetOrSetContext(ctx, "api-data",
		func(ctx context.Context) (interface{}, error) {
			// Simulate API call
			select {
			case <-time.After(1 * time.Second):
				return "api response", nil
			case <-ctx.Done():
				return nil, ctx.Err()
			}
		}, 5*time.Minute)

	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(value)

}
Output:

api response

func (*Cache) GetWithExpiration

func (c *Cache) GetWithExpiration(key interface{}) (interface{}, bool)

GetWithExpiration retrieves a value, reporting via the bool whether the key exists
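A minimal sketch contrasting GetWithExpiration with Get: it signals a miss through the bool rather than an error. Key and value here are hypothetical:

```go
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	c, err := cache.QuickCache("exp-example")
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	c.Set("k", "v", 1*time.Minute)

	// Unlike Get, a miss is reported via ok == false, not an error.
	if v, ok := c.GetWithExpiration("k"); ok {
		fmt.Println(v)
	}
}
```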

func (*Cache) Has

func (c *Cache) Has(key interface{}) bool

Has checks if a key exists in the cache

func (*Cache) Metrics

func (c *Cache) Metrics() *ristretto.Metrics

Metrics returns cache statistics

func (*Cache) Name

func (c *Cache) Name() string

Name returns the cache name

func (*Cache) Set

func (c *Cache) Set(key interface{}, value interface{}, ttl time.Duration) error

Set stores a key-value pair in the cache with the given TTL

func (*Cache) SetMulti

func (c *Cache) SetMulti(items map[interface{}]interface{}, ttl time.Duration) error

SetMulti stores multiple key-value pairs in the cache

func (*Cache) SetWithCost

func (c *Cache) SetWithCost(key interface{}, value interface{}, cost int64, ttl time.Duration) error

SetWithCost stores a key-value pair with a custom cost
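A minimal sketch of SetWithCost, charging an item its actual size in bytes instead of the default cost; the key and payload size are hypothetical:

```go
package main

import (
	"log"
	"time"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	c, err := cache.QuickCache("cost-example")
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()

	payload := make([]byte, 4096)
	// Charge the item its real size so eviction reflects memory usage.
	if err := c.SetWithCost("blob:1", payload, int64(len(payload)), 10*time.Minute); err != nil {
		log.Fatal(err)
	}
}
```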

type Config

type Config struct {
	MaxSize int
	TTL     time.Duration
}

Config holds basic cache configuration

type Manager

type Manager struct {
	// contains filtered or unexported fields
}

Manager manages multiple cache instances

Example
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	// Create different caches for different purposes
	userCache, err := cache.Create("users", cache.DefaultOptions())
	if err != nil {
		log.Fatal(err)
	}

	sessionCache, err := cache.SessionCacheBuilder("sessions", 30*time.Minute).
		BuildAndRegister()
	if err != nil {
		log.Fatal(err)
	}

	apiCache, err := cache.APICacheBuilder("api-responses").
		BuildAndRegister()
	if err != nil {
		log.Fatal(err)
	}

	// Use the caches
	userCache.Set("user:1", "Alice", 10*time.Minute)
	sessionCache.Set("session:xyz", "session-data", 30*time.Minute)
	apiCache.Set("/api/users", "cached-response", 1*time.Minute)

	// List all caches
	fmt.Println("Active caches:", cache.List())

	// Get statistics
	stats := cache.Stats()
	for name, metrics := range stats {
		if metrics != nil {
			fmt.Printf("Cache %s - Hits: %d, Misses: %d\n",
				name, metrics.Hits(), metrics.Misses())
		}
	}

	// Clean up
	cache.Close()
}
var DefaultManager *Manager

DefaultManager is the global cache manager instance

func NewManager

func NewManager() *Manager

NewManager creates a new cache manager with a default logger.

func NewManagerWithLogger

func NewManagerWithLogger(logger zerolog.Logger) *Manager

NewManagerWithLogger creates a new cache manager using the provided logger.

func (*Manager) Clear

func (m *Manager) Clear(name string) error

Clear clears all items in a specific cache

func (*Manager) ClearAll

func (m *Manager) ClearAll()

ClearAll clears all items in all caches

func (*Manager) Close

func (m *Manager) Close() error

Close closes all cache instances

func (*Manager) Create

func (m *Manager) Create(name string, opts *Options) (*Cache, error)

Create creates a new cache instance with the given name and options

func (*Manager) CreateOptimized

func (m *Manager) CreateOptimized(name string, config Config) (*OptimizedCacheWrapper, error)

CreateOptimized creates an optimized cache instance

func (*Manager) Delete

func (m *Manager) Delete(name string) error

Delete removes a cache instance

func (*Manager) Get

func (m *Manager) Get(name string) (*Cache, bool)

Get retrieves a cache instance by name

func (*Manager) GetOptimizerMetrics

func (m *Manager) GetOptimizerMetrics() resource.CacheOptimizerMetrics

GetOptimizerMetrics gets optimizer metrics

func (*Manager) GetOrCreate

func (m *Manager) GetOrCreate(name string, opts *Options) (*Cache, error)

GetOrCreate retrieves an existing cache or creates a new one

func (*Manager) List

func (m *Manager) List() []string

List returns a list of all cache names

func (*Manager) Stats

func (m *Manager) Stats() map[string]*ristretto.Metrics

Stats returns statistics for all caches

type OptimizedCacheWrapper

type OptimizedCacheWrapper struct {
	// contains filtered or unexported fields
}

OptimizedCacheWrapper wraps a cache instance with optimizer support

func CreateOptimizedCache

func CreateOptimizedCache(name string, maxSize int, ttl time.Duration) (*OptimizedCacheWrapper, error)

CreateOptimizedCache creates an optimized cache (convenience function)
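A minimal sketch of the convenience function above; the cache name, maxSize, and ttl values are hypothetical:

```go
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	// maxSize and ttl here are hypothetical values.
	w, err := cache.CreateOptimizedCache("hot-keys", 10_000, 5*time.Minute)
	if err != nil {
		log.Fatal(err)
	}
	defer w.Close()

	// The wrapper exposes the same Get/Set surface as Cache.
	w.Set("k", "v", 1*time.Minute)
	if v, err := w.Get("k"); err == nil {
		fmt.Println(v)
	}
}
```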

func (*OptimizedCacheWrapper) Clear

func (w *OptimizedCacheWrapper) Clear()

func (*OptimizedCacheWrapper) Close

func (w *OptimizedCacheWrapper) Close()

func (*OptimizedCacheWrapper) Delete

func (w *OptimizedCacheWrapper) Delete(key interface{})

func (*OptimizedCacheWrapper) DeleteMulti

func (w *OptimizedCacheWrapper) DeleteMulti(keys []interface{})

func (*OptimizedCacheWrapper) Get

func (w *OptimizedCacheWrapper) Get(key interface{}) (interface{}, error)

func (*OptimizedCacheWrapper) GetMulti

func (w *OptimizedCacheWrapper) GetMulti(keys []interface{}) map[interface{}]interface{}

func (*OptimizedCacheWrapper) GetOrSet

func (w *OptimizedCacheWrapper) GetOrSet(key interface{}, fn func() (interface{}, error), ttl time.Duration) (interface{}, error)

func (*OptimizedCacheWrapper) GetWithExpiration

func (w *OptimizedCacheWrapper) GetWithExpiration(key interface{}) (interface{}, bool)

func (*OptimizedCacheWrapper) Has

func (w *OptimizedCacheWrapper) Has(key interface{}) bool

func (*OptimizedCacheWrapper) Metrics

func (w *OptimizedCacheWrapper) Metrics() *ristretto.Metrics

func (*OptimizedCacheWrapper) Name

func (w *OptimizedCacheWrapper) Name() string

func (*OptimizedCacheWrapper) Set

func (w *OptimizedCacheWrapper) Set(key interface{}, value interface{}, ttl time.Duration) error

func (*OptimizedCacheWrapper) SetMulti

func (w *OptimizedCacheWrapper) SetMulti(items map[interface{}]interface{}, ttl time.Duration) error

func (*OptimizedCacheWrapper) SetWithCost

func (w *OptimizedCacheWrapper) SetWithCost(key interface{}, value interface{}, cost int64, ttl time.Duration) error

type Options

type Options struct {
	// NumCounters is the number of 4-bit counters for admission policy (10x max items)
	NumCounters int64
	// MaxCost is the maximum cost of cache (sum of all items' costs)
	MaxCost int64
	// BufferItems is the number of keys per Get buffer
	BufferItems int64
	// Metrics enables cache metrics collection
	Metrics bool
	// OnEvict is called when an item is evicted from the cache
	OnEvict func(item *ristretto.Item)
	// OnReject is called when an item is rejected from the cache
	OnReject func(item *ristretto.Item)
	// OnExit is called when cache.Close() is called
	OnExit func(interface{})
	// KeyToHash is a custom hash function for keys
	KeyToHash func(key interface{}) (uint64, uint64)
	// Cost is a function to calculate the cost of a value
	Cost func(value interface{}) int64
}

Options represents cache configuration options

func DefaultOptions

func DefaultOptions() *Options

DefaultOptions returns default cache options
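A minimal sketch of starting from DefaultOptions and overriding selected fields before passing them to New; the overridden values are hypothetical:

```go
package main

import (
	"log"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	// Start from the defaults and override selectively.
	opts := cache.DefaultOptions()
	opts.Metrics = true
	opts.MaxCost = 1 << 27 // 128MB; a hypothetical budget

	c, err := cache.New("tuned", opts)
	if err != nil {
		log.Fatal(err)
	}
	defer c.Close()
}
```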

type Session

type Session struct {
	ID        string                 `json:"id"`
	UserID    string                 `json:"user_id"`
	Data      map[string]interface{} `json:"data"`
	ExpiresAt time.Time              `json:"expires_at"`
}

Session represents a user session

type SessionCache

type SessionCache struct {
	// contains filtered or unexported fields
}

SessionCache demonstrates session management with cache

func NewSessionCache

func NewSessionCache(sessionTTL time.Duration) (*SessionCache, error)

NewSessionCache creates a new session cache

func (*SessionCache) CreateSession

func (s *SessionCache) CreateSession(userID string, ttl time.Duration) (*Session, error)

CreateSession creates a new session

func (*SessionCache) DeleteSession

func (s *SessionCache) DeleteSession(sessionID string)

DeleteSession removes a session

func (*SessionCache) GetSession

func (s *SessionCache) GetSession(sessionID string) (*Session, error)

GetSession retrieves a session

func (*SessionCache) UpdateSession

func (s *SessionCache) UpdateSession(session *Session) error

UpdateSession updates session data
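A minimal sketch of the full session lifecycle using the methods above (create, update, read, delete); the user ID and session data are hypothetical:

```go
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/go-lynx/lynx/cache"
)

func main() {
	s, err := cache.NewSessionCache(30 * time.Minute)
	if err != nil {
		log.Fatal(err)
	}

	sess, err := s.CreateSession("user-42", 30*time.Minute)
	if err != nil {
		log.Fatal(err)
	}

	// Mutate and persist the session data.
	sess.Data = map[string]interface{}{"theme": "dark"}
	if err := s.UpdateSession(sess); err != nil {
		log.Fatal(err)
	}

	got, err := s.GetSession(sess.ID)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(got.UserID)

	s.DeleteSession(sess.ID)
}
```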

type User

type User struct {
	ID        string    `json:"id"`
	Name      string    `json:"name"`
	Email     string    `json:"email"`
	Role      string    `json:"role"`
	CreatedAt time.Time `json:"created_at"`
}

User represents a user entity

type UserService

type UserService struct {
	// contains filtered or unexported fields
}

UserService demonstrates cache usage in a service layer

func NewUserService

func NewUserService() (*UserService, error)

NewUserService creates a new user service with cache

func (*UserService) Close

func (s *UserService) Close()

Close closes the cache

func (*UserService) GetCacheStats

func (s *UserService) GetCacheStats() string

GetCacheStats returns cache statistics

func (*UserService) GetMultipleUsers

func (s *UserService) GetMultipleUsers(ctx context.Context, userIDs []string) (map[string]*User, error)

GetMultipleUsers demonstrates batch get

func (*UserService) GetUser

func (s *UserService) GetUser(ctx context.Context, userID string) (*User, error)

GetUser retrieves a user with caching

func (*UserService) GetUsersByRole

func (s *UserService) GetUsersByRole(ctx context.Context, role string) ([]*User, error)

GetUsersByRole demonstrates batch caching

func (*UserService) InvalidateUserCache

func (s *UserService) InvalidateUserCache()

InvalidateUserCache clears all user-related caches

func (*UserService) UpdateUser

func (s *UserService) UpdateUser(ctx context.Context, user *User) error

UpdateUser updates a user and invalidates cache
