Package lru

Documentation

Overview

Package lru provides three different LRU caches of varying sophistication.

Cache is a simple LRU cache. It is based on the LRU implementation in groupcache: https://github.com/golang/groupcache/tree/master/lru

TwoQueueCache tracks frequently used and recently used entries separately. This avoids a burst of accesses from taking out frequently used entries, at the cost of about 2x computational overhead and some extra bookkeeping.

ARCCache is an adaptive replacement cache. It tracks recent evictions as well as recent usage in both the frequent and recent caches. Its computational overhead is comparable to TwoQueueCache, but the memory overhead is linear with the size of the cache.

ARC has been patented by IBM, so do not use it if that is problematic for your program.

All caches in this package take locks while operating, and are therefore thread-safe for consumers.
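As a brief illustration of the simple Cache documented below, here is a minimal sketch; the import path is assumed to be github.com/hashicorp/golang-lru and may differ for the module version shown on this page.

package main

import (
	"fmt"

	lru "github.com/hashicorp/golang-lru" // assumed import path
)

func main() {
	// Create a fixed-size cache that holds at most 128 entries.
	cache, err := lru.New(128)
	if err != nil {
		panic(err)
	}

	// Adding more than 128 entries evicts the least recently used ones.
	for i := 0; i < 256; i++ {
		cache.Add(i, i*i)
	}
	fmt.Println(cache.Len()) // 128

	// Get reports whether the key is still resident and updates its recency.
	if v, ok := cache.Get(200); ok {
		fmt.Println(v) // 40000
	}
}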

Index

Constants

View Source
const (
	// Default2QRecentRatio is the ratio of the 2Q cache dedicated
	// to recently added entries that have only been accessed once.
	Default2QRecentRatio = 0.25

	// Default2QGhostEntries is the default ratio of ghost
	// entries kept to track entries recently evicted
	Default2QGhostEntries = 0.50
)
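For illustration, calling New2QParams with these defaults is, per the constructor docs below, equivalent to calling New2Q with the same size (a minimal sketch, assuming the lru import from the example above).

// Equivalent to lru.New2Q(128), which applies the defaults above.
cache, err := lru.New2QParams(128, lru.Default2QRecentRatio, lru.Default2QGhostEntries)
if err != nil {
	panic(err)
}
_ = cache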

Variables

This section is empty.

Functions

This section is empty.

Types

type ARCCache

type ARCCache struct {
	// contains filtered or unexported fields
}

ARCCache is a thread-safe fixed size Adaptive Replacement Cache (ARC). ARC is an enhancement over the standard LRU cache in that it tracks both frequency and recency of use. This avoids a burst in access to new entries from evicting the frequently used older entries. It adds some additional tracking overhead to a standard LRU cache: computationally it is roughly 2x the cost, and the extra memory overhead is linear with the size of the cache. ARC has been patented by IBM, but it is similar to the TwoQueueCache (2Q), which requires setting parameters.
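A short usage sketch for ARCCache, using only the constructor and methods documented below (same assumed import as the earlier example).

arc, err := lru.NewARC(128)
if err != nil {
	panic(err)
}

arc.Add("config", []byte("payload")) // unlike Cache.Add, ARCCache.Add does not report evictions
if v, ok := arc.Get("config"); ok {
	_ = v.([]byte) // values come back as interface{} and must be type-asserted by the caller
}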

func NewARC

func NewARC(size int) (*ARCCache, error)

NewARC creates an ARC of the given size.

func (*ARCCache) Add

func (c *ARCCache) Add(key, value interface{})

Add adds a value to the cache.

func (*ARCCache) Contains

func (c *ARCCache) Contains(key interface{}) bool

Contains is used to check if the cache contains a key without updating recency or frequency.

func (*ARCCache) Get

func (c *ARCCache) Get(key interface{}) (value interface{}, ok bool)

Get looks up a key's value from the cache.

func (*ARCCache) Keys

func (c *ARCCache) Keys() []interface{}

Keys returns all the cached keys.

func (*ARCCache) Len

func (c *ARCCache) Len() int

Len returns the number of cached entries.

func (*ARCCache) Peek

func (c *ARCCache) Peek(key interface{}) (value interface{}, ok bool)

Peek is used to inspect the cache value of a key without updating recency or frequency.

func (*ARCCache) Purge

func (c *ARCCache) Purge()

Purge is used to clear the cache.

func (*ARCCache) Remove

func (c *ARCCache) Remove(key interface{})

Remove is used to purge a key from the cache.

type Cache

type Cache struct {
	// contains filtered or unexported fields
}

Cache is a thread-safe fixed size LRU cache.

func New

func New(size int) (*Cache, error)

New creates an LRU of the given size.

func NewWithEvict

func NewWithEvict(size int, onEvicted func(key interface{}, value interface{})) (*Cache, error)

NewWithEvict constructs a fixed size cache with the given eviction callback.
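For example, the eviction callback can be used to observe or release values as they leave the cache (a hedged sketch; the callback signature matches the constructor above, and the lru import is assumed as in the earlier example).

onEvicted := func(key interface{}, value interface{}) {
	// Runs for entries that are evicted from the cache.
	fmt.Printf("evicted %v=%v\n", key, value)
}

cache, err := lru.NewWithEvict(2, onEvicted)
if err != nil {
	panic(err)
}

cache.Add("a", 1)
cache.Add("b", 2)
cache.Add("c", 3) // the cache is full, so the oldest entry ("a") is evicted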

func (*Cache) Add

func (c *Cache) Add(key, value interface{}) (evicted bool)

Add adds a value to the cache. Returns true if an eviction occurred.

func (*Cache) Contains

func (c *Cache) Contains(key interface{}) bool

Contains checks if a key is in the cache, without updating the recent-ness or deleting it for being stale.

func (*Cache) ContainsOrAdd

func (c *Cache) ContainsOrAdd(key, value interface{}) (ok, evicted bool)

ContainsOrAdd checks if a key is in the cache without updating the recent-ness or deleting it for being stale, and if not, adds the value. Returns whether found and whether an eviction occurred.
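This combines the check and the insert in one call, which is convenient for "add only if absent" patterns, e.g. (a sketch, continuing the Cache from the earlier example):

// The existing value and its recency are left untouched if the key is present.
if ok, evicted := cache.ContainsOrAdd("session", "token"); !ok {
	fmt.Println("added new entry; eviction occurred:", evicted)
}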

func (*Cache) Get

func (c *Cache) Get(key interface{}) (value interface{}, ok bool)

Get looks up a key's value from the cache.

func (*Cache) Keys

func (c *Cache) Keys() []interface{}

Keys returns a slice of the keys in the cache, from oldest to newest.

func (*Cache) Len

func (c *Cache) Len() int

Len returns the number of items in the cache.

func (*Cache) Peek

func (c *Cache) Peek(key interface{}) (value interface{}, ok bool)

Peek returns the key's value (or nil if not found) without updating the "recently used"-ness of the key.

func (*Cache) Purge

func (c *Cache) Purge()

Purge is used to completely clear the cache.

func (*Cache) Remove

func (c *Cache) Remove(key interface{})

Remove removes the provided key from the cache.

func (*Cache) RemoveOldest

func (c *Cache) RemoveOldest()

RemoveOldest removes the oldest item from the cache.

type TwoQueueCache

type TwoQueueCache struct {
	// contains filtered or unexported fields
}

TwoQueueCache is a thread-safe fixed size 2Q cache. 2Q is an enhancement over the standard LRU cache in that it tracks frequently and recently used entries separately. This avoids a burst in access to new entries from evicting frequently used entries. It adds some additional tracking overhead to the standard LRU cache, computationally about 2x the cost, plus some extra metadata overhead. The ARCCache is similar, but does not require setting any parameters.
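A brief usage sketch for TwoQueueCache (same assumed import as the earlier examples).

twoQ, err := lru.New2Q(128)
if err != nil {
	panic(err)
}

twoQ.Add("k", "v")              // new entries start out tracked as recently used
if v, ok := twoQ.Get("k"); ok { // Get updates the entry's recency/frequency tracking
	_ = v
}
if v, ok := twoQ.Peek("k"); ok { // Peek does not
	_ = v
}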

func New2Q

func New2Q(size int) (*TwoQueueCache, error)

New2Q creates a new TwoQueueCache using the default values for the parameters.

func New2QParams

func New2QParams(size int, recentRatio float64, ghostRatio float64) (*TwoQueueCache, error)

New2QParams creates a new TwoQueueCache using the provided parameter values.

func (*TwoQueueCache) Add

func (c *TwoQueueCache) Add(key, value interface{})

Add adds a value to the cache.

func (*TwoQueueCache) Contains

func (c *TwoQueueCache) Contains(key interface{}) bool

Contains is used to check if the cache contains a key without updating recency or frequency.

func (*TwoQueueCache) Get

func (c *TwoQueueCache) Get(key interface{}) (value interface{}, ok bool)

Get looks up a key's value from the cache.

func (*TwoQueueCache) Keys

func (c *TwoQueueCache) Keys() []interface{}

Keys returns a slice of the keys in the cache. The frequently used keys are first in the returned slice.

func (*TwoQueueCache) Len

func (c *TwoQueueCache) Len() int

Len returns the number of items in the cache.

func (*TwoQueueCache) Peek

func (c *TwoQueueCache) Peek(key interface{}) (value interface{}, ok bool)

Peek is used to inspect the cache value of a key without updating recency or frequency.

func (*TwoQueueCache) Purge

func (c *TwoQueueCache) Purge()

Purge is used to completely clear the cache.

func (*TwoQueueCache) Remove

func (c *TwoQueueCache) Remove(key interface{})

Remove removes the provided key from the cache.
