lru

package module
v0.6.0
Published: Mar 3, 2019 License: MPL-2.0 Imports: 3 Imported by: 6

README

golang-lru

This provides the lru package, which implements a fixed-size, thread-safe LRU cache. It is based on the cache in Groupcache.

Changes from the original Hashicorp LRU Package

  • Addition of a new Probe() method to each cache implementation. This enables callers to construct and add an element to the cache if it is not present.
  • Renaming lru.Cache to lru.SimpleCache
  • Addition of a new interface lru.Cache to abstract the implementation of the three concrete cache types. Callers can use this interface to be insulated from specific implementation details.

Example

Using the LRU is very simple:


import "github.com/opencoff/golang-lru"


func foo() {
    var c lru.Cache
    var err error

    c, err = lru.NewSimple(128)
    if err != nil {
        panic(err)
    }

    for i := 0; i < 256; i++ {
        l.Add(i, nil)
    }

    if l.Len() != 128 {
        panic(fmt.Sprintf("bad len: %v", l.Len()))
    }
}
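
The Probe() method added in this fork combines lookup and insertion in one call. A minimal sketch, assuming the imports from the example above; makeValue is a hypothetical constructor standing in for whatever work builds a missing entry:

func probeExample(c lru.Cache) {
    // makeValue is only called when the key is missing from the cache.
    makeValue := func(key interface{}) interface{} {
        return fmt.Sprintf("value-for-%v", key)
    }

    // First call: the key is absent, so Probe constructs and inserts the
    // value, returning it with ok == false.
    v, ok := c.Probe("answer", makeValue)
    fmt.Println(v, ok) // value-for-answer false

    // Second call: the key is now cached, so Probe returns the stored
    // value with ok == true; makeValue is not called again.
    v, ok = c.Probe("answer", makeValue)
    fmt.Println(v, ok) // value-for-answer true
}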

Documentation

Overview

Package lru provides three different LRU caches of varying sophistication.

SimpleCache is a simple LRU cache. It is based on the LRU implementation in groupcache: https://github.com/golang/groupcache/tree/master/lru

TwoQueueCache tracks frequently used and recently used entries separately. This avoids a burst of accesses from taking out frequently used entries, at the cost of about 2x computational overhead and some extra bookkeeping.

ARCCache is an adaptive replacement cache. It tracks recent evictions as well as recent usage in both the frequent and recent caches. Its computational overhead is comparable to TwoQueueCache, but the memory overhead is linear with the size of the cache.

ARC has been patented by IBM, so do not use it if that is problematic for your program.

All caches in this package take locks while operating, and are therefore thread-safe for consumers.
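
Because the three concrete types share the Cache interface described below, the choice of implementation can be confined to a single constructor call. A minimal sketch, assuming the constructors documented later on this page; the kind parameter is purely illustrative:

func newCache(kind string, size int) (lru.Cache, error) {
    // Every constructor returns a concrete type that satisfies lru.Cache,
    // so callers can switch implementations without touching the rest of
    // their code.
    switch kind {
    case "2q":
        return lru.New2Q(size)
    case "arc":
        return lru.NewARC(size)
    default:
        return lru.NewSimple(size)
    }
}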

Index

Constants

const (
	// Default2QRecentRatio is the ratio of the 2Q cache dedicated
	// to recently added entries that have only been accessed once.
	Default2QRecentRatio = 0.25

	// Default2QGhostEntries is the default ratio of ghost
	// entries kept to track entries recently evicted
	Default2QGhostEntries = 0.50
)

Variables

This section is empty.

Functions

This section is empty.

Types

type ARCCache

type ARCCache struct {
	// contains filtered or unexported fields
}

ARCCache is a thread-safe fixed size Adaptive Replacement Cache (ARC). ARC is an enhancement over the standard LRU cache in that it tracks both frequency and recency of use. This avoids a burst of accesses to new entries from evicting frequently used older entries. It adds some additional tracking overhead to a standard LRU cache; computationally it is roughly 2x the cost, and the extra memory overhead is linear with the size of the cache. ARC has been patented by IBM, but it is similar to the TwoQueueCache (2Q), which requires setting parameters.

func NewARC

func NewARC(size int) (*ARCCache, error)

NewARC creates an ARC of the given size
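
A short sketch of constructing and using an ARCCache, assuming the same imports as the README example above:

func arcExample() {
    arc, err := lru.NewARC(128)
    if err != nil {
        panic(err)
    }

    arc.Add("k1", "v1")
    if v, ok := arc.Get("k1"); ok {
        fmt.Println(v) // v1
    }
}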

func (*ARCCache) Add

func (c *ARCCache) Add(key, value interface{})

Add adds a value to the cache.

func (*ARCCache) Contains

func (c *ARCCache) Contains(key interface{}) bool

Contains is used to check if the cache contains a key without updating recency or frequency.

func (*ARCCache) Get

func (c *ARCCache) Get(key interface{}) (value interface{}, ok bool)

Get looks up a key's value from the cache.

func (*ARCCache) Keys

func (c *ARCCache) Keys() []interface{}

Keys returns all the cached keys

func (*ARCCache) Len

func (c *ARCCache) Len() int

Len returns the number of cached entries

func (*ARCCache) Peek

func (c *ARCCache) Peek(key interface{}) (value interface{}, ok bool)

Peek is used to inspect the cache value of a key without updating recency or frequency.

func (*ARCCache) Probe

func (c *ARCCache) Probe(key interface{}, ctor func(key interface{}) interface{}) (interface{}, bool)

Probe returns an existing value and true if 'key' is found in the cache. Otherwise, it calls ctor() to create and insert a new element, and returns the newly created element. Probe returns false when the element is not found in the cache (and hence had to be inserted).

func (*ARCCache) Purge

func (c *ARCCache) Purge()

Purge is used to clear the cache

func (*ARCCache) Remove

func (c *ARCCache) Remove(key interface{})

Remove is used to purge a key from the cache

type Cache

type Cache interface {
	// Add adds a value to the cache.
	Add(key, val interface{})

	// Get returns (value, true) if key is in the cache and
	// (nil, false) otherwise. If key is found in the cache, depending
	// on the implementation, the cache may update the frequency or recency
	// of the key.
	Get(key interface{}) (value interface{}, ok bool)

	// Probe returns (value, true) if key is already in the cache.
	// Otherwise, it constructs the new value by calling ctor(key), and
	// inserts the new value into the cache; finally it returns the
	// newly constructed value and false (value, false).
	Probe(key interface{}, ctor func(key interface{}) interface{}) (value interface{}, ok bool)

	// Peek is similar to Get() except, it doesn't update the
	// recency or frequency.
	Peek(key interface{}) (value interface{}, ok bool)

	// Remove removes key from the cache if it exists.
	Remove(key interface{})

	// Contains returns true if the key is found in the cache and
	// false otherwise. It does this operation without updating recency
	// or frequency.
	Contains(key interface{}) bool

	// Len returns the number of items in the cache.
	Len() int

	// Keys returns a slice of the keys in the cache.
	// The frequently used keys are first in the returned slice.
	Keys() []interface{}

	// Purge is used to completely clear the cache.
	Purge()
}

Cache is an interface describing a generic LRU cache. It has 3 concrete implementations:

  • Simple LRU
  • 2 Queue
  • ARC (Adaptive Replacement Cache)
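
For example, a helper that accepts the interface works with any of the three implementations. A minimal sketch; warm and its key list are illustrative, not part of the package:

// warm pre-populates any lru.Cache implementation with the given keys.
func warm(c lru.Cache, keys []interface{}) {
    for _, k := range keys {
        if !c.Contains(k) {
            c.Add(k, nil)
        }
    }
}

Code written against the interface stays unchanged if the backing cache is later swapped, say, from SimpleCache to ARCCache.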

type SimpleCache

type SimpleCache struct {
	// contains filtered or unexported fields
}

SimpleCache is a thread-safe fixed size LRU cache.

func NewSimple

func NewSimple(size int) (*SimpleCache, error)

NewSimple creates an LRU of the given size.

func NewSimpleWithEvict

func NewSimpleWithEvict(size int, onEvicted func(key interface{}, value interface{})) (*SimpleCache, error)

NewSimpleWithEvict constructs a fixed-size cache with the given eviction callback.
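
A sketch of the eviction callback in use, assuming the imports from the README example; the printed message is illustrative:

func evictExample() {
    c, err := lru.NewSimpleWithEvict(2, func(key, value interface{}) {
        fmt.Printf("evicted %v=%v\n", key, value)
    })
    if err != nil {
        panic(err)
    }

    c.Add(1, "a")
    c.Add(2, "b")
    c.Add(3, "c") // capacity is 2, so the oldest entry (1, "a") is evicted
}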

func (*SimpleCache) Add

func (c *SimpleCache) Add(key, value interface{})

Add adds a value to the cache.

func (*SimpleCache) Contains

func (c *SimpleCache) Contains(key interface{}) bool

Contains checks if a key is in the cache, without updating the recent-ness or deleting it for being stale.

func (*SimpleCache) ContainsOrAdd

func (c *SimpleCache) ContainsOrAdd(key, value interface{}) (ok, evicted bool)

ContainsOrAdd checks if a key is in the cache without updating the recent-ness or deleting it for being stale, and if not, adds the value. Returns whether found and whether an eviction occurred.
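
For example, a minimal sketch assuming a *SimpleCache c and the fmt import:

func containsOrAddExample(c *lru.SimpleCache) {
    ok, evicted := c.ContainsOrAdd("k", "v")
    // ok == true:  "k" was already cached and "v" was not inserted.
    // ok == false: "k" was absent and has now been added; evicted reports
    //              whether the insertion displaced the oldest entry.
    fmt.Println(ok, evicted)
}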

func (*SimpleCache) Get

func (c *SimpleCache) Get(key interface{}) (value interface{}, ok bool)

Get looks up a key's value from the cache.

func (*SimpleCache) Keys

func (c *SimpleCache) Keys() []interface{}

Keys returns a slice of the keys in the cache, from oldest to newest.
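
A short sketch of the ordering, assuming the imports from the README example; Get refreshes a key's recency, which is reflected in the slice:

func keysExample() {
    c, _ := lru.NewSimple(3) // error only occurs for sizes <= 0
    c.Add("a", 1)
    c.Add("b", 2)
    c.Add("c", 3)
    c.Get("a") // refresh "a": it becomes the most recently used key

    fmt.Println(c.Keys()) // [b c a] - oldest first, newest last
}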

func (*SimpleCache) Len

func (c *SimpleCache) Len() int

Len returns the number of items in the cache.

func (*SimpleCache) Peek

func (c *SimpleCache) Peek(key interface{}) (value interface{}, ok bool)

Peek returns the key's value (or nil if not found) without updating the "recently used"-ness of the key.
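
A sketch contrasting Peek with Get, assuming the imports from the README example:

func peekExample() {
    c, _ := lru.NewSimple(2) // error only occurs for sizes <= 0
    c.Add("a", 1)
    c.Add("b", 2)

    // Peek does not refresh "a", so it remains the oldest entry ...
    if v, ok := c.Peek("a"); ok {
        fmt.Println(v) // 1
    }

    c.Add("c", 3)                // ... and is evicted here.
    fmt.Println(c.Contains("a")) // false
}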

func (*SimpleCache) Probe

func (c *SimpleCache) Probe(key interface{}, ctor func(key interface{}) interface{}) (value interface{}, ok bool)

Probe returns the existing value and true if key is found in the cache. Otherwise, it constructs a new value by calling ctor(key), inserts it into the cache, and returns the newly constructed value and false.

func (*SimpleCache) Purge

func (c *SimpleCache) Purge()

Purge is used to completely clear the cache.

func (*SimpleCache) Remove

func (c *SimpleCache) Remove(key interface{})

Remove removes the provided key from the cache.

func (*SimpleCache) RemoveOldest

func (c *SimpleCache) RemoveOldest()

RemoveOldest removes the oldest item from the cache.

type TwoQueueCache

type TwoQueueCache struct {
	// contains filtered or unexported fields
}

TwoQueueCache is a thread-safe fixed size 2Q cache. 2Q is an enhancement over the standard LRU cache in that it tracks frequently and recently used entries separately. This avoids a burst of accesses to new entries from evicting frequently used entries. It adds some additional tracking overhead to the standard LRU cache; computationally it is about 2x the cost, and it adds some metadata overhead. The ARCCache is similar, but does not require setting any parameters.

func New2Q

func New2Q(size int) (*TwoQueueCache, error)

New2Q creates a new TwoQueueCache using the default values for the parameters.

func New2QParams

func New2QParams(size int, recentRatio float64, ghostRatio float64) (*TwoQueueCache, error)

New2QParams creates a new TwoQueueCache using the provided parameter values.
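
A sketch passing the package defaults explicitly; with Default2QRecentRatio and Default2QGhostEntries (see the Constants section) this should behave the same as New2Q with the same size:

func twoQueueExample() {
    c, err := lru.New2QParams(128, lru.Default2QRecentRatio, lru.Default2QGhostEntries)
    if err != nil {
        panic(err)
    }

    c.Add("k", "v")
    fmt.Println(c.Len()) // 1
}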

func (*TwoQueueCache) Add

func (c *TwoQueueCache) Add(key, value interface{})

Add adds a value to the cache.

func (*TwoQueueCache) Contains

func (c *TwoQueueCache) Contains(key interface{}) bool

Contains is used to check if the cache contains a key without updating recency or frequency.

func (*TwoQueueCache) Get

func (c *TwoQueueCache) Get(key interface{}) (value interface{}, ok bool)

Get looks up a key's value from the cache.

func (*TwoQueueCache) Keys

func (c *TwoQueueCache) Keys() []interface{}

Keys returns a slice of the keys in the cache. The frequently used keys are first in the returned slice.

func (*TwoQueueCache) Len

func (c *TwoQueueCache) Len() int

Len returns the number of items in the cache.

func (*TwoQueueCache) Peek

func (c *TwoQueueCache) Peek(key interface{}) (value interface{}, ok bool)

Peek is used to inspect the cache value of a key without updating recency or frequency.

func (*TwoQueueCache) Probe

func (c *TwoQueueCache) Probe(key interface{}, ctor func(key interface{}) interface{}) (interface{}, bool)

Probe returns the existing value and true if key is found in the cache. Otherwise, it constructs a new value by calling ctor(key), inserts it into the cache, and returns the newly constructed value and false.

func (*TwoQueueCache) Purge

func (c *TwoQueueCache) Purge()

Purge is used to completely clear the cache.

func (*TwoQueueCache) Remove

func (c *TwoQueueCache) Remove(key interface{})

Remove removes the provided key from the cache.
