An LRU (Least Recently Used) Cache is a data structure that keeps track of the most recently used items and efficiently removes the least recently used ones when it reaches capacity. It’s commonly used to optimize memory usage in systems that handle large data sets by retaining only frequently accessed items.
In this blog, we’ll walk through the key concepts of an LRU Cache and then build it in Swift using a Doubly Linked List and a Dictionary.
What is an LRU Cache?
An LRU Cache operates on a principle of maintaining the most frequently accessed items while discarding the least accessed ones. It functions with the following behaviors:
- Get: Retrieve a value based on a key. If the key exists, it’s marked as the most recently used.
- Put: Insert or update a key-value pair. If the cache exceeds capacity, it removes the least recently used item.
This allows for a cache where lookups, insertions, and deletions can all be done in constant time, O(1).
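To make the recency bookkeeping concrete before we build the real thing, here is a small stand-alone trace that models a capacity-2 cache’s usage order with a plain array (front = most recently used). This is purely illustrative, not the implementation we’ll write below:

```swift
// Model a capacity-2 LRU cache's recency order with an array.
// The front of the array is the most recently used key.
var order = ["B", "A"]          // state after put("A"), then put("B")

// get("A"): "A" is found, so it moves to the front.
order.removeAll { $0 == "A" }
order.insert("A", at: 0)        // order is now ["A", "B"]

// put("C") at full capacity: evict the last element ("B"),
// then insert "C" at the front.
let evicted = order.removeLast()
order.insert("C", at: 0)        // order is now ["C", "A"]

print(evicted)                  // B
print(order)                    // ["C", "A"]
```

The real implementation replaces the array with a doubly linked list, so the “move to front” step costs O(1) instead of O(n).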
Design of an LRU Cache
To implement an LRU Cache, we need two primary components:
- Doubly Linked List: Helps in constant time removal of the least recently used item.
- Dictionary (Hash Map): Maps keys to nodes in the doubly linked list, allowing quick access.
We’ll be implementing this in Swift, where:
- The dictionary stores key-node pairs.
- The doubly linked list maintains the order of usage, with the most recently used items at the head and the least recently used at the tail.
Step-by-Step Implementation of an LRU Cache in Swift
Step 1: Create the Doubly Linked List Node
Each node in our doubly linked list will hold a key, value, and pointers to the next and previous nodes.
class DoublyLinkedListNode<Key, Value> {
    var key: Key
    var value: Value
    var next: DoublyLinkedListNode?
    var prev: DoublyLinkedListNode?

    init(key: Key, value: Value) {
        self.key = key
        self.value = value
    }
}
Step 2: Implement the Doubly Linked List
The doubly linked list will maintain the order of items, with the most recently used items at the head and the least recently used items at the tail. We’ll add functions to add a node to the head and to remove a node from the list.
class DoublyLinkedList<Key, Value> {
    private var head: DoublyLinkedListNode<Key, Value>?
    private var tail: DoublyLinkedListNode<Key, Value>?

    // Inserts a node at the head (most recently used position).
    func addNodeToHead(_ node: DoublyLinkedListNode<Key, Value>) {
        node.prev = nil
        node.next = head
        head?.prev = node
        head = node
        if tail == nil {
            tail = node
        }
    }

    // Unlinks a node from anywhere in the list in O(1).
    func removeNode(_ node: DoublyLinkedListNode<Key, Value>) {
        node.prev?.next = node.next
        node.next?.prev = node.prev
        if node === head {
            head = node.next
        }
        if node === tail {
            tail = node.prev
        }
        node.prev = nil
        node.next = nil
    }

    // Removes and returns the tail (least recently used node), if any.
    func removeTail() -> DoublyLinkedListNode<Key, Value>? {
        guard let tailNode = tail else {
            return nil
        }
        removeNode(tailNode)
        return tailNode
    }
}
Step 3: Implement the LRU Cache
With the linked list structure in place, we can now implement the main LRUCache class. This class will contain methods for get and put, as well as a dictionary to store the cache entries.
class LRUCache<Key: Hashable, Value> {
    private var capacity: Int
    private var cache: [Key: DoublyLinkedListNode<Key, Value>]
    private var linkedList: DoublyLinkedList<Key, Value>

    init(capacity: Int) {
        self.capacity = capacity
        self.cache = [Key: DoublyLinkedListNode<Key, Value>]()
        self.linkedList = DoublyLinkedList<Key, Value>()
    }

    func get(_ key: Key) -> Value? {
        if let node = cache[key] {
            linkedList.removeNode(node)
            linkedList.addNodeToHead(node)
            return node.value
        }
        return nil
    }

    func put(_ key: Key, _ value: Value) {
        if let node = cache[key] {
            linkedList.removeNode(node)
            node.value = value
            linkedList.addNodeToHead(node)
        } else {
            if cache.count >= capacity, let tailNode = linkedList.removeTail() {
                cache.removeValue(forKey: tailNode.key)
            }
            let newNode = DoublyLinkedListNode(key: key, value: value)
            linkedList.addNodeToHead(newNode)
            cache[key] = newNode
        }
    }
}
Explanation of get and put Methods
get(key):
- Checks if the key exists in the cache.
- If it does, the node is moved to the head of the linked list (marking it as the most recently used).
- Returns the value if found, or nil if not.
put(key, value):
- If the key exists, it updates the node’s value and moves it to the head.
- If the key does not exist:
- If the cache has reached its capacity, it removes the least recently used item from the linked list’s tail and deletes it from the dictionary.
- Then, it creates a new node, adds it to the head of the linked list, and adds it to the dictionary.
Example Usage
Let’s create an LRU cache with a capacity of 2 and test it out.
let lruCache = LRUCache<String, Int>(capacity: 2)
// Adding items
lruCache.put("A", 1)
lruCache.put("B", 2)
// Accessing an item makes it the most recently used
print(lruCache.get("A")!) // Output: 1
// Adding another item evicts the least recently used ("B")
lruCache.put("C", 3)
print(lruCache.get("B") as Any) // Output: nil (as "B" is evicted)
print(lruCache.get("C")!) // Output: 3
print(lruCache.get("A")!) // Output: 1
// Adding another item when the cache is full evicts "C", the least
// recently used key (the get calls above moved "A" ahead of it)
lruCache.put("D", 4)
print(lruCache.get("C") as Any) // Output: nil (as "C" is evicted)
print(lruCache.get("A")!) // Output: 1
print(lruCache.get("D")!) // Output: 4
How It Works
- Cache Initialization: The cache starts with a capacity of 2.
- Item Access: Accessing or adding an item marks it as recently used by moving it to the head of the linked list.
- Capacity Limit: When the cache reaches capacity, adding a new item removes the least recently used item (from the tail of the list).
- Constant-Time Operations: With the combination of the doubly linked list and dictionary, lookups, insertions, and deletions are handled in constant time, O(1).
Advantages and Use Cases
The LRU Cache pattern is widely used in performance-critical systems to handle large data by retaining only the most relevant data. Here are some common use cases:
- Web Browsers: Caching pages that users frequently visit.
- Database Query Caches: Storing results of frequently run queries.
- Memory Management: Systems with limited memory that need to evict the least used data to make room for new data.
Conclusion
Implementing an LRU Cache in Swift using a combination of a Doubly Linked List and a Dictionary achieves efficient time complexity and effective memory usage. With this structure, your cache can operate in constant time and manage limited resources effectively, making it an excellent choice for applications that require high-performance data caching.
This implementation provides a solid foundation for understanding caching principles and can be further extended to support additional features like custom eviction policies and time-to-live (TTL) for items.
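As one example of such an extension, here is a minimal sketch of the TTL idea. For brevity it is written over a plain dictionary rather than the LRUCache above; the `TTLStore` name and API are hypothetical, and in a real extension the timestamp would live on each linked-list node alongside the value:

```swift
import Foundation

// Hypothetical sketch: each value is stored with its insertion time,
// and reads treat entries older than `ttl` seconds as missing.
struct TTLStore<Key: Hashable, Value> {
    private var storage: [Key: (value: Value, storedAt: Date)] = [:]
    let ttl: TimeInterval

    init(ttl: TimeInterval) {
        self.ttl = ttl
    }

    mutating func put(_ key: Key, _ value: Value) {
        storage[key] = (value, Date())
    }

    mutating func get(_ key: Key) -> Value? {
        guard let entry = storage[key] else { return nil }
        guard Date().timeIntervalSince(entry.storedAt) < ttl else {
            storage[key] = nil   // lazily drop the expired entry
            return nil
        }
        return entry.value
    }
}
```

Expiry here is checked lazily on read, which keeps put and get at O(1); combined with the LRU eviction policy, stale entries that are never read again are eventually pushed out of the tail anyway.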