
textual.cache

Cache classes are dict-like containers used to avoid recalculating expensive operations such as rendering.

You can also use them in your own apps for similar reasons.
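
For example, a minimal sketch of memoizing an expensive function with the LRUCache class documented below (slow_square is a hypothetical stand-in for real work such as rendering):

    from textual.cache import LRUCache

    # Cache for the results of a (hypothetically) expensive computation.
    cache: LRUCache[int, int] = LRUCache(maxsize=1024)

    def slow_square(n: int) -> int:
        return n * n  # stand-in for real work such as rendering

    def square(n: int) -> int:
        value = cache.get(n)
        if value is None:  # assumes cached values are never None
            value = slow_square(n)
            cache.set(n, value)
        return value

Using None as the "missing" sentinel assumes None is never a valid cached value; otherwise, use the two-argument form of get (documented below) with an explicit sentinel.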

FIFOCache

FIFOCache(maxsize)

Bases: Generic[CacheKey, CacheValue]

A simple cache that discards the first added key when full (First In First Out).

This has a lower overhead than LRUCache, but won't manage a working set as efficiently. It is most suitable for a cache with a relatively low maximum size that is not expected to do many lookups.
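
A minimal sketch of the eviction order, using the set, get, and keys methods documented below:

    from textual.cache import FIFOCache

    cache: FIFOCache[str, int] = FIFOCache(maxsize=2)
    cache.set("a", 1)
    cache.set("b", 2)
    cache.set("c", 3)          # full, so "a" (the first key added) is discarded
    print(list(cache.keys()))  # ["b", "c"] (assuming insertion order)
    print(cache.get("a"))      # None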

Parameters:

    maxsize (int): Maximum size of cache before discarding items. (required)

clear

clear()

Clear the cache.

get

get(key: CacheKey) -> CacheValue | None
get(key: CacheKey, default: DefaultValue) -> CacheValue | DefaultValue
get(key, default=None)

Get a value from the cache, or return a default if the key is not present.

Parameters:

    key (CacheKey): Key. (required)
    default (DefaultValue | None): Default to return if key is not present. (default: None)

Returns:

    CacheValue | DefaultValue | None: Either the value or a default.
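
A short sketch of both call forms:

    from textual.cache import FIFOCache

    cache: FIFOCache[str, str] = FIFOCache(maxsize=8)
    cache.set("title", "Hello")
    print(cache.get("title"))           # "Hello"
    print(cache.get("missing"))         # None (default defaults to None)
    print(cache.get("missing", "n/a"))  # "n/a" (explicit default)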

keys

keys()

Get cache keys.

set

set(key, value)

Set a value.

Parameters:

    key (CacheKey): Key. (required)
    value (CacheValue): Value. (required)

LRUCache

LRUCache(maxsize)

Bases: Generic[CacheKey, CacheValue]

A dictionary-like container with a maximum size.

If an additional item is added when the LRUCache is full, the least recently used key is discarded to make room for the new item.

The implementation is similar to functools.lru_cache, which uses a (doubly) linked list to keep track of the most recently used items.

Each entry is stored as [PREV, NEXT, KEY, VALUE], where PREV is a reference to the previous entry and NEXT is a reference to the next entry.

Note that the stdlib's @lru_cache is implemented in C and is faster! It's best to use @lru_cache where you are caching things that are fairly quick and called many times. Use LRUCache where you want increased flexibility and are caching slow operations where the overhead of the cache is a small fraction of the total processing time.
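
A minimal sketch of the recency behavior: a get refreshes a key, so a different key becomes the eviction candidate:

    from textual.cache import LRUCache

    cache: LRUCache[str, int] = LRUCache(maxsize=2)
    cache.set("a", 1)
    cache.set("b", 2)
    cache.get("a")         # "a" is now the most recently used key
    cache.set("c", 3)      # full, so "b" (least recently used) is discarded
    print(cache.get("b"))  # None
    print(cache.get("a"))  # 1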

Parameters:

    maxsize (int): Maximum size of the cache before old items are discarded. (required)

maxsize property writable

maxsize
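
Since the property is documented as writable, the limit can be adjusted after construction; a sketch:

    from textual.cache import LRUCache

    cache: LRUCache[str, int] = LRUCache(maxsize=4)
    print(cache.maxsize)  # 4
    cache.maxsize = 8     # raise the limit; the property is writable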

clear

clear()

Clear the cache.

discard

discard(key)

Discard the item in the cache with the given key.

Parameters:

    key (CacheKey): Cache key. (required)
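
A short sketch; the name discard (rather than remove) suggests a missing key is ignored rather than raising, though this page doesn't say so:

    from textual.cache import LRUCache

    cache: LRUCache[str, int] = LRUCache(maxsize=4)
    cache.set("a", 1)
    cache.discard("a")     # remove "a" from the cache
    print(cache.get("a"))  # None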

get

get(key: CacheKey) -> CacheValue | None
get(key: CacheKey, default: DefaultValue) -> CacheValue | DefaultValue
get(key, default=None)

Get a value from the cache, or return a default if the key is not present.

Parameters:

    key (CacheKey): Key. (required)
    default (DefaultValue | None): Default to return if key is not present. (default: None)

Returns:

    CacheValue | DefaultValue | None: Either the value or a default.

grow

grow(maxsize)

Grow the maximum size to at least maxsize elements.

Parameters:

    maxsize (int): New maximum size. (required)
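
Since grow only raises the limit to "at least maxsize elements", growing to a smaller value leaves the cache unchanged; a sketch:

    from textual.cache import LRUCache

    cache: LRUCache[str, int] = LRUCache(maxsize=4)
    cache.grow(16)        # maximum size is now 16
    cache.grow(8)         # no change: 8 < 16, and grow never shrinks
    print(cache.maxsize)  # 16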

keys

keys()

Get cache keys.

set

set(key, value)

Set a value.

Parameters:

    key (CacheKey): Key. (required)
    value (CacheValue): Value. (required)