Caching Smarter, Not Harder: Demystifying Python’s lru_cache
4 min read · Sep 20, 2024
Introduction to Caching
Caching is a technique that stores the results of expensive function calls and reuses them when the same inputs occur again. Python provides an easy way to implement caching with functools.lru_cache.
What is lru_cache?
lru_cache is part of Python's functools module. The name stands for Least Recently Used cache: it is a decorator that stores a function's results so they can be reused when the function is called again with the same arguments.
Key Features of lru_cache:
- Max size: The cache stores up to a specified number of results (the maxsize argument). Once this limit is reached, the least recently used result is discarded to make room for new ones.
- Thread safety: lru_cache is thread-safe by default.
- Time complexity: Cache lookups run in O(1) time because the cache combines a hash map with a linked list that tracks usage order.
Basic Syntax
from functools import lru_cache
@lru_cache(maxsize=128)
def expensive_function(n):
    # stand-in for an expensive computation
    return n * n
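A classic demonstration of this syntax is recursive Fibonacci, where memoization turns an exponential-time function into a linear-time one. A minimal sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # None means the cache grows without bound
def fib(n):
    # naive recursion; without caching, fib(30) would recompute
    # the same subproblems millions of times
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040, computed with only ~31 actual calls
```

Because each fib(k) is cached after its first computation, every subsequent call with the same argument returns instantly from the cache.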