Problem Statement
The LRU (Least Recently Used) Cache problem asks us to design a fixed-capacity data structure that evicts the least recently used entry when it runs out of space.
A cache should support the following operations efficiently:
- get(int key): Retrieve the value associated with the key if it exists; otherwise, return -1.
- put(int key, int value): Insert or update the value associated with the key. If the cache has reached its capacity, it should evict the least recently used key before inserting a new key.
The operations must be performed in O(1) time complexity.
Optimal Approach
To achieve O(1) complexity for both get and put operations, we use:
- HashMap (unordered_map in C++) for O(1) lookup.
- Doubly Linked List for fast removal and insertion in O(1) time.
- The hashmap stores key-value pairs where the key maps to a node in the linked list.
- The doubly linked list maintains the order of use, ensuring that the most recently used element stays at the front and the least recently used element is at the back.
C++ Implementation
```cpp
#include <list>
#include <unordered_map>
using namespace std;

class LRUCache {
private:
    int capacity;
    // Front of the list holds the most recently used entry,
    // the back holds the least recently used one.
    list<pair<int, int>> cacheList;
    // Maps each key to the position of its node in cacheList.
    unordered_map<int, list<pair<int, int>>::iterator> cacheMap;

public:
    LRUCache(int capacity) {
        this->capacity = capacity;
    }

    int get(int key) {
        if (cacheMap.find(key) == cacheMap.end()) {
            return -1;
        }
        auto it = cacheMap[key];
        int value = it->second;
        // Move the accessed entry to the front (most recently used).
        cacheList.erase(it);
        cacheList.push_front({key, value});
        cacheMap[key] = cacheList.begin();
        return value;
    }

    void put(int key, int value) {
        if (cacheMap.find(key) != cacheMap.end()) {
            // Existing key: drop the old node; it is re-inserted below.
            cacheList.erase(cacheMap[key]);
        } else if ((int)cacheList.size() >= capacity) {
            // Cache full: evict the least recently used entry (back of list).
            int lruKey = cacheList.back().first;
            cacheList.pop_back();
            cacheMap.erase(lruKey);
        }
        cacheList.push_front({key, value});
        cacheMap[key] = cacheList.begin();
    }
};
```
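The get() above erases the node and re-inserts a copy of its pair. As a design alternative, std::list::splice can relink the existing node to the front in O(1) without copying it, and because the iterator stays valid the hashmap entry never needs updating. A minimal sketch of that variant (the class name LRUCacheSplice is illustrative, not from the original):

```cpp
#include <list>
#include <unordered_map>
using namespace std;

// Sketch: same cache, but moves to the front are done with
// list::splice, which relinks the node in place in O(1).
class LRUCacheSplice {
    int capacity;
    list<pair<int, int>> cacheList;  // front = most recently used
    unordered_map<int, list<pair<int, int>>::iterator> cacheMap;

public:
    explicit LRUCacheSplice(int capacity) : capacity(capacity) {}

    int get(int key) {
        auto found = cacheMap.find(key);
        if (found == cacheMap.end()) return -1;
        // Relink the node to the front; its iterator stays valid,
        // so cacheMap needs no update.
        cacheList.splice(cacheList.begin(), cacheList, found->second);
        return found->second->second;
    }

    void put(int key, int value) {
        auto found = cacheMap.find(key);
        if (found != cacheMap.end()) {
            found->second->second = value;  // update value in place
            cacheList.splice(cacheList.begin(), cacheList, found->second);
            return;
        }
        if ((int)cacheList.size() >= capacity) {
            // Evict the least recently used entry (back of list).
            cacheMap.erase(cacheList.back().first);
            cacheList.pop_back();
        }
        cacheList.push_front({key, value});
        cacheMap[key] = cacheList.begin();
    }
};
```

This avoids a node deallocation/allocation per access, though the asymptotic complexity is unchanged.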
Step-by-Step Explanation
- Initialization: We use a doubly linked list to store cache entries and a hashmap to store pointers to list nodes.
- get() Operation:
- If the key is not found, return -1.
- Otherwise, move the key-value pair to the front of the list (marking it as recently used) and return its value.
- put() Operation:
- If the key already exists, remove its old node from the linked list (the hashmap entry is overwritten in the final step).
- If the cache reaches capacity, remove the least recently used key from the linked list and hashmap.
- Insert the new key-value pair at the front of the linked list and update the hashmap.
Dry Run / Execution Steps
Example
Input:

```cpp
LRUCache lru(2);
lru.put(1, 1);
lru.put(2, 2);
cout << lru.get(1) << endl; // Returns 1
lru.put(3, 3);              // Evicts key 2
cout << lru.get(2) << endl; // Returns -1 (not found)
lru.put(4, 4);              // Evicts key 1
cout << lru.get(1) << endl; // Returns -1 (not found)
cout << lru.get(3) << endl; // Returns 3
cout << lru.get(4) << endl; // Returns 4
```

Output:

```
1
-1
-1
3
4
```
Alternative Approaches & Why Not?
| Approach | Time Complexity | Space Complexity | Why Not? |
| --- | --- | --- | --- |
| Brute force (iterate through a list for every operation) | O(N) | O(N) | Too slow for large inputs |
| Vector | O(N) | O(N) | Deletion in the middle takes O(N) |
| Singly linked list + HashMap | O(1) | O(N) | Needs extra predecessor bookkeeping to delete a node |
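To make the vector row concrete, here is a hedged sketch (class name LRUCacheVector is illustrative): every lookup scans the vector, and every move-to-front erases from the middle and shifts all later elements, so get() and put() both degrade to O(N).

```cpp
#include <algorithm>
#include <vector>
using namespace std;

// O(N) baseline: a vector ordered most-recently-used first.
// find_if scans linearly, and erase/insert at the front shift
// every later element, so both operations are linear.
class LRUCacheVector {
    int capacity;
    vector<pair<int, int>> entries;  // front = most recently used

public:
    explicit LRUCacheVector(int capacity) : capacity(capacity) {}

    int get(int key) {
        auto it = find_if(entries.begin(), entries.end(),
                          [&](const pair<int, int>& e) { return e.first == key; });
        if (it == entries.end()) return -1;
        pair<int, int> hit = *it;
        entries.erase(it);                     // O(N) shift
        entries.insert(entries.begin(), hit);  // O(N) shift
        return hit.second;
    }

    void put(int key, int value) {
        auto it = find_if(entries.begin(), entries.end(),
                          [&](const pair<int, int>& e) { return e.first == key; });
        if (it != entries.end()) entries.erase(it);
        else if ((int)entries.size() >= capacity) entries.pop_back();  // evict LRU
        entries.insert(entries.begin(), {key, value});
    }
};
```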
Best Solution & Why It’s Best
The Doubly Linked List + HashMap approach is the best because:
- O(1) operations: Both get() and put() run in constant time.
- Memory efficiency: Uses minimal extra space beyond the hashmap and linked-list nodes.
Complexity Analysis
- Time Complexity: O(1) for both get() and put() operations.
- Space Complexity: O(capacity) for storing keys in the hashmap and linked list.
Conclusion
The LRU Cache problem is efficiently solved using a Doubly Linked List + HashMap approach. This ensures O(1) time complexity for both get() and put() operations, making it ideal for real-world caching mechanisms.
Similar Problems for Practice:
- LFU Cache (Least Frequently Used)
- Design Twitter (LeetCode 355)
- Implement Stack using Queues
This approach is widely used in operating systems, database query optimization, and web caching mechanisms.
Happy Coding! 🚀