Design a data structure that follows the constraints of a Least Recently Used (LRU) cache. Implement the LRUCache class:

- `LRUCache(int capacity)`: Initialize the LRU cache with positive size capacity.
- `int get(int key)`: Return the value of the key if the key exists, otherwise return -1.
- `void put(int key, int value)`: Update the value of the key if the key exists. Otherwise, add the key-value pair to the cache. If the number of keys exceeds the capacity from this operation, evict the least recently used key.

The functions get and put must each run in O(1) average time complexity.
Example:
Input
["LRUCache", "put", "put", "get", "put", "get", "put", "get", "get", "get"]
[[2], [1, 1], [2, 2], [1], [3, 3], [2], [4, 4], [1], [3], [4]]
Output
[null, null, null, 1, null, -1, null, -1, 3, 4]
Explanation
LRUCache lRUCache = new LRUCache(2);
lRUCache.put(1, 1); // cache is {1=1}
lRUCache.put(2, 2); // cache is {1=1, 2=2}
lRUCache.get(1); // return 1
lRUCache.put(3, 3); // LRU key was 2, evicts key 2, cache is {1=1, 3=3}
lRUCache.get(2); // returns -1 (not found)
lRUCache.put(4, 4); // LRU key was 1, evicts key 1, cache is {4=4, 3=3}
lRUCache.get(1); // return -1 (not found)
lRUCache.get(3); // return 3
lRUCache.get(4); // return 4
When you get asked this question in a real-life environment, it will often be ambiguous (especially at FAANG). Make sure to ask clarifying questions in that case, for example: what are the bounds on capacity, what should happen when capacity is zero, and what ranges of keys and values must be supported?
The LRU Cache stores items and discards the least recently used one when full. A brute force approach timestamps every access and, on eviction, scans all cached keys to find the one with the oldest timestamp, making eviction slow (O(n) per put) but straightforwardly correct.
Here's how the algorithm would work step-by-step:
1. Keep one dictionary of key-value pairs and a second dictionary recording when each key was last accessed, using an incrementing counter as a logical clock.
2. On get, return the value if present and record the access time.
3. On put, if the key is new and the cache is full, scan every recorded access time to find the least recently used key and evict it; then store the pair and record the access time.
```python
class LRUCacheBruteForce:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = {}
        self.access_history = {}
        self.current_time = 0

    def get(self, key):
        if key in self.cache:
            self.access_history[key] = self.current_time
            self.current_time += 1
            return self.cache[key]
        return -1

    def put(self, key, value):
        if self.capacity == 0:
            # A zero-capacity cache stores nothing; ignore the put
            return
        # Only evict when inserting a new key into a full cache;
        # updating an existing key must not trigger an eviction
        if key not in self.cache and len(self.cache) == self.capacity:
            # Need to evict the least recently used key
            least_recently_used_key = None
            least_recent_time = float('inf')
            for cached_key, last_access_time in self.access_history.items():
                if last_access_time < least_recent_time:
                    least_recent_time = last_access_time
                    least_recently_used_key = cached_key
            # Remove the least recently used item from the cache and access history
            if least_recently_used_key is not None:
                del self.cache[least_recently_used_key]
                del self.access_history[least_recently_used_key]
        self.cache[key] = value
        # Update access history with the current time
        self.access_history[key] = self.current_time
        self.current_time += 1
```
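The expensive part of the brute force is the linear eviction scan over the access history. Isolated as a small helper (the function name is mine, not part of the class above), it looks like this:

```python
def find_lru_key(access_history):
    """Return the key with the smallest last-access time via an O(n) scan."""
    least_recently_used_key = None
    least_recent_time = float('inf')
    for key, last_access_time in access_history.items():
        if last_access_time < least_recent_time:
            least_recent_time = last_access_time
            least_recently_used_key = key
    return least_recently_used_key

# Key 2 was touched least recently, so it is the eviction candidate:
print(find_lru_key({1: 5, 2: 0, 3: 3}))  # → 2
```

Because this scan runs on every eviction, each put can cost O(n), which is what the efficient solution below eliminates.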
The LRU Cache problem requires us to quickly access and update data while also evicting the least recently used item when the cache is full. The standard approach combines a dictionary for fast lookups with a structure that remembers the order in which items were used. The version below uses a plain Python list for the ordering; it is correct and easy to follow, but note that `list.remove` and `list.insert(0, ...)` are O(n), so a hash map plus doubly linked list (or `collections.OrderedDict`) is needed to actually achieve O(1).
Here's how the algorithm would work step-by-step:
1. Keep a dictionary mapping keys to values, plus a list of keys ordered from most to least recently used.
2. On get, move the key to the front of the list and return its value.
3. On put, update or insert the value, move the key to the front, and if the cache now exceeds capacity, evict the key at the back of the list.
```python
class LRUCache:
    def __init__(self, capacity_value: int):
        self.capacity_value = capacity_value
        self.cache_map = {}
        self.data_list = []

    def get(self, key: int) -> int:
        if key in self.cache_map:
            # Update the recency of the key
            self.data_list.remove(key)
            self.data_list.insert(0, key)
            return self.cache_map[key]
        return -1

    def put(self, key: int, value: int) -> None:
        if key in self.cache_map:
            self.data_list.remove(key)
        # Mark the key as most recently used
        self.data_list.insert(0, key)
        self.cache_map[key] = value
        # Enforce capacity constraint
        if len(self.data_list) > self.capacity_value:
            # Remove least recently used
            least_recent_key = self.data_list[-1]
            # Delete LRU entry
            del self.cache_map[least_recent_key]
            # Remove last element
            self.data_list.pop()
```
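To actually meet the O(1) requirement in Python, a common approach is `collections.OrderedDict`, whose `move_to_end` and `popitem` run in constant time. A sketch (the class name `LRUCacheFast` is mine):

```python
from collections import OrderedDict

class LRUCacheFast:
    """O(1) get/put: OrderedDict remembers insertion order and can move
    keys to the end or pop the oldest entry in constant time."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        # Mark as most recently used
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            # popitem(last=False) evicts the oldest (least recently used) entry
            self.cache.popitem(last=False)
```

Replaying the example sequence from above (`put(1,1)`, `put(2,2)`, `get(1)` → 1, `put(3,3)` evicts 2, and so on) produces the same outputs, but every operation is now O(1).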
| Case | How to Handle |
|---|---|
| Capacity is zero | The cache should not store any elements, so all 'put' operations should be ignored and 'get' operations should always return -1. |
| Inserting a key that already exists with a new value | The existing key's value should be updated, and the key should be moved to the most recently used position. |
| Retrieving a key that exists | The key should be moved to the most recently used position. |
| Cache is full and a new key needs to be inserted | The least recently used key should be evicted to make space for the new key. |
| Multiple 'put' operations with the same key in quick succession | The last 'put' operation should determine the key's value and position in the cache. |
| Large number of 'get' and 'put' operations exceeding available memory | The solution should be memory-efficient and avoid memory leaks to handle prolonged usage. |
| Negative or very large key values | The solution should handle any valid integer key value, considering possible integer overflow in calculations if any. |
| Cache contains only one element, and it is accessed frequently | The element should always be considered the most recently used and not be evicted. |
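Interviewers sometimes rule out `OrderedDict` and ask for the classic hash map plus doubly linked list design. The edge cases above can be exercised against a hand-rolled version; this is a sketch under that assumption (all class and method names are mine):

```python
class _Node:
    """Doubly linked list node holding one cache entry."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=0, value=0):
        self.key, self.value = key, value
        self.prev = self.next = None

class LRUCacheLinked:
    """O(1) LRU cache: dict for lookup, doubly linked list for recency.
    Sentinel head/tail nodes avoid edge-case checks; the most recently
    used node sits right after head, the least recently used right
    before tail."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.map = {}
        self.head, self.tail = _Node(), _Node()
        self.head.next, self.tail.prev = self.tail, self.head

    def _unlink(self, node):
        node.prev.next, node.next.prev = node.next, node.prev

    def _push_front(self, node):
        node.prev, node.next = self.head, self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key: int) -> int:
        if key not in self.map:
            return -1
        node = self.map[key]
        self._unlink(node)
        self._push_front(node)  # mark as most recently used
        return node.value

    def put(self, key: int, value: int) -> None:
        if self.capacity == 0:
            return  # zero-capacity cache stores nothing
        if key in self.map:
            node = self.map[key]
            node.value = value  # update in place, then refresh recency
            self._unlink(node)
            self._push_front(node)
            return
        if len(self.map) == self.capacity:
            lru = self.tail.prev  # least recently used node
            self._unlink(lru)
            del self.map[lru.key]
        node = _Node(key, value)
        self.map[key] = node
        self._push_front(node)
```

Each `get` and `put` touches only a constant number of pointers, so both are O(1) without relying on any ordered container from the standard library.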