How lru_cache Works in Python: A Deep Dive and Building Your Own

Aditya Mangal
5 min read · Oct 1, 2024

Caching is a powerful technique for improving the performance of expensive or frequently called functions. One popular built-in caching mechanism in Python is the lru_cache decorator from the functools module. In this blog, we will explore the inner workings of lru_cache, understand its logic, and even build a custom implementation from scratch.

What is lru_cache?

The lru_cache (Least Recently Used Cache) decorator caches the results of function calls and returns the cached result when the same inputs occur again. It is useful for reducing the computation time of expensive functions that are called multiple times with the same arguments.
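
Here is a quick example of the decorator in action. The recursive Fibonacci function below would normally recompute the same values over and over; with lru_cache, each distinct argument is computed only once (the maxsize value of 128 is just an illustrative choice):

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def fib(n):
    """Naive recursive Fibonacci, made fast by caching repeated calls."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(35))           # each fib(k) is computed only once
print(fib.cache_info())  # hits, misses, maxsize, and current cache size
```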

Let’s first understand how the lru_cache works and then build our own implementation.

How lru_cache Works in Python

The lru_cache decorator is implemented using a combination of a hash map (to store function results based on inputs) and a doubly linked list (to track the order of usage). The idea is to store the result of each function call, and when the cache reaches its size limit, the least recently used entry is evicted to make room for new ones.
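
To make that concrete, here is a minimal sketch of such a cache written as a decorator, roughly mirroring the hash-map plus doubly-linked-list idea described above. Names like my_lru_cache and _Node are illustrative, not part of functools, and for brevity the sketch only handles hashable positional arguments:

```python
from functools import wraps


class _Node:
    """Doubly linked list node holding one cached entry."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


def my_lru_cache(maxsize=128):
    def decorator(func):
        cache = {}                     # key -> node (the hash map)
        head, tail = _Node(), _Node()  # sentinel nodes for the linked list
        head.next, tail.prev = tail, head

        def _remove(node):
            # Unlink a node from the list.
            node.prev.next, node.next.prev = node.next, node.prev

        def _add_front(node):
            # Most recently used entries live right after the head.
            node.next, node.prev = head.next, head
            head.next.prev = node
            head.next = node

        @wraps(func)
        def wrapper(*args):
            key = args
            if key in cache:
                node = cache[key]
                _remove(node)          # cache hit: move entry to the front
                _add_front(node)
                return node.value
            value = func(*args)        # cache miss: compute and store
            node = _Node(key, value)
            cache[key] = node
            _add_front(node)
            if len(cache) > maxsize:   # evict the least recently used entry
                lru = tail.prev
                _remove(lru)
                del cache[lru.key]
            return value

        return wrapper
    return decorator
```

The hash map gives O(1) lookups, while the linked list keeps entries ordered by recency so eviction is also O(1): a hit is moved to the front, and the node just before the tail is always the least recently used.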
