What Is an LRU Cache?

Last updated: April 1, 2026

Quick Answer: An LRU cache is a data storage mechanism that maintains a fixed-size collection of frequently accessed data and automatically evicts the least recently used items when the cache reaches capacity, optimizing memory usage and access speed.


What is an LRU Cache?

An LRU (Least Recently Used) cache is a data structure that stores a limited collection of frequently accessed data with a constraint: when the cache reaches maximum capacity, it automatically removes the item that was accessed least recently. This approach optimizes both memory usage and data retrieval speed by keeping the most relevant data readily available.

How LRU Cache Works

The LRU cache operates on a simple principle: the most recently used items remain in the cache, while the least recently used items are evicted when space is needed. Every time an item is accessed (either read or written), its position is updated to mark it as recently used. When a new item needs to be added and the cache is full, the item that was accessed longest ago is removed to make room.
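The eviction behavior described above can be sketched with Python's `collections.OrderedDict`, which tracks insertion order and can move entries in O(1). The `put`/`get` helpers and the capacity of 2 here are illustrative choices, not part of any standard API:

```python
from collections import OrderedDict

cache = OrderedDict()
CAPACITY = 2  # small capacity chosen to make eviction easy to see

def put(key, value):
    if key in cache:
        cache.move_to_end(key)      # writing refreshes recency
    cache[key] = value
    if len(cache) > CAPACITY:
        cache.popitem(last=False)   # evict the least recently used entry

def get(key):
    if key not in cache:
        return None
    cache.move_to_end(key)          # reading also refreshes recency
    return cache[key]

put("a", 1)
put("b", 2)
get("a")       # "a" becomes the most recently used item
put("c", 3)    # cache is full, so "b" (least recently used) is evicted
print(list(cache))  # ['a', 'c']
```

Note that reading "a" before inserting "c" is what saves it: without that access, "a" would have been the oldest entry and evicted instead.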

Data Structure Implementation

An efficient LRU cache implementation combines two complementary data structures:

  - A hash map, which maps each key to its cache entry for O(1) lookup.
  - A doubly linked list, which keeps entries ordered from most to least recently used, so that recency updates and evictions also take O(1).

Time and Space Complexity

LRU caches provide optimal performance characteristics. Get operations take O(1) time by using the hash map to locate the item. Put operations also take O(1) time, updating the linked list and hash map. Space complexity is O(capacity), where capacity is the maximum number of items the cache can store.
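Putting the two structures together, here is a minimal sketch of such a cache. The class and method names are illustrative; sentinel head/tail nodes are used to avoid edge cases when unlinking:

```python
class Node:
    """Doubly linked list node holding one cache entry."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None


class LRUCache:
    """Hash map for O(1) lookup + doubly linked list for O(1) recency updates."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}            # key -> Node
        self.head = Node()       # sentinel: most recently used side
        self.tail = Node()       # sentinel: least recently used side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _unlink(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _push_front(self, node):
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return None
        node = self.map[key]
        self._unlink(node)       # move to front: mark as recently used
        self._push_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            node = self.map[key]
            node.value = value
            self._unlink(node)
            self._push_front(node)
            return
        if len(self.map) >= self.capacity:
            lru = self.tail.prev          # node just before the tail sentinel
            self._unlink(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._push_front(node)
```

Every operation touches only a constant number of nodes and one hash map entry, which is where the O(1) bounds above come from.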

Real-World Applications

LRU caching is fundamental to modern computing systems: operating systems use LRU-style policies for memory page replacement, databases use it to manage buffer pools, web browsers and CDNs use it to decide which cached content to keep, and CPU caches approximate it in hardware.

Advantages and Limitations

LRU caches offer significant advantages including simple implementation, predictable behavior, and optimal performance for workloads with temporal locality. However, they assume that recently used data will likely be needed again soon, which may not hold true for all applications. Alternative policies like LFU (Least Frequently Used) or ARC (Adaptive Replacement Cache) may perform better for certain access patterns.

Related Questions

What's the difference between LRU and LFU cache?

LRU (Least Recently Used) evicts items based on when they were last accessed, prioritizing recency. LFU (Least Frequently Used) evicts items based on access frequency, prioritizing popularity. LFU can be more efficient for certain workloads but is harder to implement efficiently.

How is LRU cache different from a regular cache?

A regular cache may not have a specific eviction policy or could use random removal. LRU provides a systematic, predictable approach by always removing the least recently accessed item, making it more effective for applications with temporal locality patterns.

Can you implement an LRU cache in Python?

Yes. Python's collections.OrderedDict makes a concise implementation, since its move_to_end() and popitem(last=False) methods handle recency updates and eviction in O(1); alternatively, you can build one from a dictionary plus a doubly linked list. (Plain dicts preserve insertion order since Python 3.7, but OrderedDict remains the better fit because it can reorder entries efficiently.) For caching function results, the functools.lru_cache decorator provides a built-in LRU cache.
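For example, the standard-library decorator turns an exponential-time recursive function into a fast memoized one:

```python
from functools import lru_cache

@lru_cache(maxsize=128)   # keep up to 128 most recently used results
def fib(n):
    """Naive recursion, made fast because results are cached and reused."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))            # 832040
print(fib.cache_info())   # shows hits, misses, and current cache size
```

Without the decorator, fib(30) would recompute the same subproblems millions of times; with it, each distinct argument is computed once.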

Sources

  1. Wikipedia - Cache Replacement Policies (CC BY-SA 4.0)
  2. Wikipedia - Locality of Reference (CC BY-SA 4.0)