Cache

Containers to implement caching.

These are used in Textual to avoid recalculating expensive operations, such as rendering.

FIFOCache class

def __init__(self, maxsize):

Bases: Generic[CacheKey, CacheValue]

A simple cache that discards the first added key when full (First In First Out).

This has a lower overhead than LRUCache, but won't manage a working set as efficiently. It is most suitable for a cache with a relatively low maximum size that is not expected to do many lookups.

Parameters

  maxsize (int, required): Maximum size of cache before discarding items.
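To illustrate the discard policy, here is a minimal sketch of a FIFO cache, not Textual's actual implementation (`SimpleFIFOCache` is a hypothetical name). It relies on the fact that Python dicts preserve insertion order, so the first added key is `next(iter(...))`.

```python
class SimpleFIFOCache:
    """Sketch of a First In First Out cache using dict insertion order."""

    def __init__(self, maxsize: int) -> None:
        self.maxsize = maxsize
        self._data: dict = {}

    def set(self, key, value) -> None:
        if key not in self._data and len(self._data) >= self.maxsize:
            # Discard the first key that was added (First In First Out).
            self._data.pop(next(iter(self._data)))
        self._data[key] = value

    def get(self, key, default=None):
        return self._data.get(key, default)


cache = SimpleFIFOCache(maxsize=2)
cache.set("a", 1)
cache.set("b", 2)
cache.set("c", 3)  # Cache is full: "a" (the first added) is discarded
print(cache.get("a"), cache.get("c"))  # → None 3
```

Note that, unlike an LRU cache, looking up `"a"` before adding `"c"` would not have saved it: only insertion order matters here, which is why FIFO suits caches that do few lookups.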

clear method

def clear(self):

Clear the cache.

get method

def get(self, key, default=None):

Get a value from the cache, or return a default if the key is not present.

Parameters

  key (CacheKey, required): Key.
  default (DefaultValue | None, default None): Default to return if key is not present.

Returns

  CacheValue | DefaultValue | None: Either the value or a default.
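A typical miss-handling pattern with this kind of `get`, sketched with a plain dict standing in for the cache (the get/default semantics match those documented above; `expensive_render` and the sentinel are illustrative, not part of the API). A sentinel distinguishes "key missing" from a cached value that happens to be `None`.

```python
_MISSING = object()  # Sentinel: a stored None is still a valid cached value

def expensive_render(text: str) -> str:
    return text.upper()  # Stand-in for a slow operation worth caching

cache: dict = {}

def render(text: str) -> str:
    result = cache.get(text, _MISSING)
    if result is _MISSING:
        # Cache miss: do the expensive work once, then store it.
        result = expensive_render(text)
        cache[text] = result
    return result


print(render("hello"))  # Computed → HELLO
print(render("hello"))  # Served from the cache → HELLO
```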

keys method

def keys(self):

Get cache keys.

set method

def set(self, key, value):

Set a value.

Parameters

  key (CacheKey, required): Key.
  value (CacheValue, required): Value.

LRUCache class

def __init__(self, maxsize):

Bases: Generic[CacheKey, CacheValue]

A dictionary-like container with a maximum size.

If an additional item is added when the LRUCache is full, the least recently used key is discarded to make room for the new item.

The implementation is similar to functools.lru_cache, which uses a (doubly) linked list to keep track of the most recently used items.

Each entry is stored as [PREV, NEXT, KEY, VALUE], where PREV is a reference to the previous entry and NEXT is a reference to the next entry.

Note that the stdlib's @lru_cache is implemented in C and is faster. It's best to use @lru_cache where you are caching things that are fairly quick and called many times. Use LRUCache where you want increased flexibility, and where you are caching slow operations for which the overhead of the cache is a small fraction of the total processing time.
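For the first case, the stdlib decorator looks like this (`fib` is just an example of a fast, frequently called pure function with hashable arguments):

```python
from functools import lru_cache

@lru_cache(maxsize=256)
def fib(n: int) -> int:
    # Without the cache this naive recursion would be exponential;
    # memoization makes each n computed exactly once.
    return n if n < 2 else fib(n - 1) + fib(n - 2)


print(fib(30))           # → 832040
print(fib.cache_info())  # Hit/miss statistics kept by the decorator
```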

Parameters

  maxsize (int, required): Maximum size of the cache, before old items are discarded.
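The eviction behaviour can be sketched as follows. This is not Textual's doubly-linked-list implementation; it uses `collections.OrderedDict`, where `move_to_end` marks a key as most recently used and `popitem(last=False)` evicts the oldest (`SimpleLRUCache` is a hypothetical name).

```python
from collections import OrderedDict

class SimpleLRUCache:
    """Sketch of least-recently-used eviction."""

    def __init__(self, maxsize: int) -> None:
        self.maxsize = maxsize
        self._data: OrderedDict = OrderedDict()

    def get(self, key, default=None):
        if key in self._data:
            self._data.move_to_end(key)  # A hit makes the key most recent
            return self._data[key]
        return default

    def set(self, key, value) -> None:
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # Evict least recently used


cache = SimpleLRUCache(maxsize=2)
cache.set("a", 1)
cache.set("b", 2)
cache.get("a")     # Touching "a" makes "b" the least recently used
cache.set("c", 3)  # Cache is full: "b" is evicted, not "a"
print(sorted(cache._data))  # → ['a', 'c']
```

This is the key difference from FIFOCache: a lookup refreshes a key, so the cache tracks the working set rather than just insertion order.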

maxsize property writable

maxsize: int

clear method

def clear(self):

Clear the cache.

discard method

def discard(self, key):

Discard the item in the cache associated with key.

Parameters

  key (CacheKey, required): Cache key.

get method

def get(self, key, default=None):

Get a value from the cache, or return a default if the key is not present.

Parameters

  key (CacheKey, required): Key.
  default (DefaultValue | None, default None): Default to return if key is not present.

Returns

  CacheValue | DefaultValue | None: Either the value or a default.

grow method

def grow(self, maxsize):

Grow the maximum size to at least maxsize elements.

Parameters

  maxsize (int, required): New maximum size.
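"At least" here implies the cache never shrinks through this method; a plausible reading of the semantics (a sketch, not the actual implementation) is simply:

```python
def grow(current_maxsize: int, requested: int) -> int:
    # The new maximum only takes effect if it exceeds the current one.
    return max(current_maxsize, requested)


print(grow(100, 50))   # → 100 (a smaller request is ignored)
print(grow(100, 200))  # → 200
```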

keys method

def keys(self):

Get cache keys.

set method

def set(self, key, value):

Set a value.

Parameters

  key (CacheKey, required): Key.
  value (CacheValue, required): Value.