
LRU Cache LeetCode Problem 146 [Python Solution]

In this blog post, we're going to dive deep into the LRU Cache problem, which is a popular interview question often asked by companies like Twitch.

This problem falls under the category of linked list data structures and is considered a medium-level challenge on LeetCode.

The goal of the LRU (Least Recently Used) cache problem is to design a data structure that adheres to certain constraints.

Specifically, we need to create a cache that can store key-value pairs, has a fixed capacity, and supports two main operations: get and put.

The get operation should return the value of a key if it exists in the cache, or -1 if it doesn't.

The put operation should update the value of an existing key or add a new key-value pair to the cache.

However, if the number of keys exceeds the cache's capacity, the least recently used key should be evicted.

Here's the problem in a nutshell:

Problem Statement:
Design a data structure that follows the constraints of an LRU cache.

Implement the LRUCache class:

  • LRUCache(int capacity): Initialize the LRU cache with a positive size capacity.
  • int get(int key): Return the value of the key if the key exists, otherwise return -1.
  • void put(int key, int value): Update the value of the key if it exists, or add the key-value pair to the cache.

If the number of keys exceeds the capacity due to this operation, evict the least recently used key.

The get and put functions must each run in O(1) average time complexity.

Let's explore this problem in detail and provide both a brute force and an efficient Python solution.

We'll also discuss the time and space complexities of our solutions and provide a clear reasoning behind our approach.

Problem Overview

To understand this problem better, let's break down the key components and requirements.

  1. LRU Cache Constraints: The LRU cache has a fixed capacity, which limits the number of key-value pairs it can store.

  2. get Operation: When using the get operation, the cache should return the value associated with the provided key.

If the key does not exist in the cache, it should return -1.

  3. put Operation: The put operation is used to update the value associated with an existing key or add a new key-value pair to the cache.

However, if the cache exceeds its capacity after the put operation, it should evict the least recently used key.

Example 1:

Let's consider an example to illustrate how the LRU cache works:

lRUCache = LRUCache(2)
lRUCache.put(1, 1)  # Cache is {1=1}
lRUCache.put(2, 2)  # Cache is {1=1, 2=2}
lRUCache.get(1)     # Returns 1
lRUCache.put(3, 3)  # LRU key was 2, evicts key 2, cache is {1=1, 3=3}
lRUCache.get(2)     # Returns -1 (not found)
lRUCache.put(4, 4)  # LRU key was 1, evicts key 1, cache is {4=4, 3=3}
lRUCache.get(1)     # Returns -1 (not found)
lRUCache.get(3)     # Returns 3
lRUCache.get(4)     # Returns 4

This example demonstrates how the LRU cache operates, maintaining a fixed capacity and evicting the least recently used key when necessary.

Understanding LRU Cache Constraints

Before diving into the implementation, let's discuss the key concepts related to LRU cache constraints:

  • Capacity: The capacity of the LRU cache is the maximum number of key-value pairs it can store.

This capacity is fixed and should not be exceeded.

In the example, the initial capacity is 2.

  • Key-Value Pairs: The cache stores data in the form of key-value pairs, where each key is associated with a value.

In the example, we use integers as both keys and values, but the data type can vary.

  • Least Recently Used (LRU): The term "Least Recently Used" implies that when the cache reaches its capacity and a new key-value pair needs to be inserted, the cache will evict the key that has been least recently accessed.

This eviction ensures that the cache always holds the most recently used data.

  • Get Operation: The get operation allows us to retrieve the value associated with a given key.

If the key is found in the cache, the associated value is returned.

If the key is not in the cache, the operation returns -1.

  • Put Operation: The put operation serves two purposes.

If the provided key already exists in the cache, it updates the associated value.

If the key is not in the cache, a new key-value pair is added.

However, if adding the new pair would exceed the cache's capacity, the least recently used key is evicted.

LRU Cache LeetCode Problem Solution

Now that we understand the problem requirements and constraints, let's explore two different approaches to solving the LRU cache problem.

We'll start with the brute-force approach and then introduce an efficient approach that meets the O(1) time complexity requirement for both get and put operations.

1. Brute-Force Approach

In the brute-force approach, we can use a combination of data structures to implement the LRU cache.

One common choice is to use a Python dictionary (hashmap) to store the key-value pairs and a list to maintain the order of keys based on their usage.

We'll need the following data structures:

  • A Python dictionary to store the key-value pairs.
  • A list to maintain the order of keys based on their usage.

Here's a step-by-step explanation of the brute-force approach:

  1. Initialize the LRU cache with a capacity.

This involves creating an empty dictionary (hashmap) and an empty list for tracking the order of keys.

  2. Implement the get operation:

    • Check if the key exists in the dictionary.
    • If it exists, move the key to the end of the list (indicating it was most recently used) and return the corresponding value.
    • If it doesn't exist, return -1.
  3. Implement the put operation:

    • Check if the key exists in the dictionary.
    • If it exists, update the value and move the key to the end of the list.
    • If it doesn't exist, add the key-value pair to the dictionary.
    • Check if the size of the dictionary exceeds the capacity. If it does, remove the key from the front of the list (indicating it was least recently used) and remove it from the dictionary.

Let's implement the brute-force approach in Python code:

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = {}
        self.order = []  # Maintain the order of keys

    def get(self, key: int) -> int:
        if key in self.cache:
            # Move the key to the end of the list to indicate it's most recently used
            self.order.remove(key)
            self.order.append(key)
            return self.cache[key]
        else:
            return -1

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            # Update the value and move the key to the end of the list
            self.cache[key] = value
            self.order.remove(key)
            self.order.append(key)
        else:
            # Add the key-value pair to the dictionary
            self.cache[key] = value
            self.order.append(key)
            # Check if the cache exceeds its capacity
            if len(self.cache) > self.capacity:
                # Remove the least recently used key from both the dictionary and the list
                lru_key = self.order.pop(0)
                del self.cache[lru_key]

This brute-force solution maintains the order of keys in the order list, and whenever a key is accessed (via get or put), it is moved to the end of the list.

When the cache exceeds its capacity, the least recently used key is evicted from both the dictionary and the list.

While this approach works, it does not meet the requirement for constant-time complexity (O(1)) for both get and put operations.

To achieve the desired efficiency, we'll introduce a more efficient approach that uses a combination of data structures.
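As an aside, Python's standard library already provides the ordering machinery we need: collections.OrderedDict supports move_to_end and popitem, both of which run in O(1). Here's a minimal sketch using it (the class name LRUCacheOD is ours, chosen to avoid clashing with the implementations in this post):

```python
from collections import OrderedDict

class LRUCacheOD:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cache = OrderedDict()  # keys kept in usage order, oldest first

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        self.cache.move_to_end(key)  # mark as most recently used
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict the least recently used key
```

This version passes the Example 1 sequence, but interviewers usually want the hashmap-plus-doubly-linked-list design built by hand, which is what we cover next.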

2. Efficient Approach with a Doubly Linked List

In the efficient approach, we'll use a combination of a Python dictionary (hashmap) to store key-value pairs and a doubly linked list to maintain the order of keys based on their usage.

This approach ensures that both get and put operations have an average time complexity of O(1).

We'll need the following data structures:

  • A Python dictionary to store the key-value pairs.
  • A doubly linked list to maintain the order of keys based on their usage.

The linked list will have two dummy nodes: a left node representing the least recently used key and a right node representing the most recently used key.

Let's walk through the efficient approach step by step:

  1. Initialize the LRU cache with a capacity and set up the doubly linked list.

Create an empty dictionary to store key-value pairs.

  2. Implement the get operation:
    • Check if the key exists in the dictionary.
    • If it exists, remove the node representing the key from the linked list, insert it at the right end (indicating it's most recently used), and return the corresponding value.
    • If the key doesn't exist, return -1.

  3. Implement the put operation:
    • Check if the key exists in the dictionary.
    • If it exists, update the value and move the node representing the key to the right end of the linked list.
    • If the key doesn't exist, add the key-value pair to the dictionary and create a new node for the key.
    • Check if the size of the dictionary exceeds the capacity. If it does, remove the least recently used key, which is the one pointed to by the left dummy node.

Here's the Python code for the efficient approach with a doubly linked list:

class Node:
    def __init__(self, key, val):
        self.key, self.val = key, val
        self.prev = self.next = None

class LRUCache:
    def __init__(self, capacity: int):
        self.cap = capacity
        self.cache = {}  # Map key to node
        self.left, self.right = Node(0, 0), Node(0, 0)
        self.left.next, self.right.prev = self.right, self.left

    # Remove a node from the linked list
    def remove(self, node):
        prev, nxt = node.prev, node.next
        prev.next, nxt.prev = nxt, prev

    # Insert a node at the right end of the linked list
    def insert(self, node):
        prev, nxt = self.right.prev, self.right
        prev.next = nxt.prev = node
        node.next, node.prev = nxt, prev

    def get(self, key: int) -> int:
        if key in self.cache:
            self.remove(self.cache[key])
            self.insert(self.cache[key])
            return self.cache[key].val
        return -1

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.remove(self.cache[key])
        self.cache[key] = Node(key, value)
        self.insert(self.cache[key])
        if len(self.cache) > self.cap:
            # Remove the least recently used node and delete it from the hashmap
            lru = self.left.next
            self.remove(lru)
            del self.cache[lru.key]

This efficient approach leverages a doubly linked list to maintain the order of keys and ensures that both get and put operations run in constant time complexity on average.

The left and right dummy nodes help us keep track of the least recently used and most recently used keys, making it easier to maintain the LRU cache.
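To sanity-check the implementation, we can replay the Example 1 sequence from earlier. The Node and LRUCache classes are repeated here so the snippet runs on its own:

```python
# Node and LRUCache repeated from above so this snippet is self-contained.
class Node:
    def __init__(self, key, val):
        self.key, self.val = key, val
        self.prev = self.next = None

class LRUCache:
    def __init__(self, capacity: int):
        self.cap = capacity
        self.cache = {}  # Map key to node
        self.left, self.right = Node(0, 0), Node(0, 0)
        self.left.next, self.right.prev = self.right, self.left

    def remove(self, node):
        prev, nxt = node.prev, node.next
        prev.next, nxt.prev = nxt, prev

    def insert(self, node):
        prev, nxt = self.right.prev, self.right
        prev.next = nxt.prev = node
        node.next, node.prev = nxt, prev

    def get(self, key: int) -> int:
        if key in self.cache:
            self.remove(self.cache[key])
            self.insert(self.cache[key])
            return self.cache[key].val
        return -1

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            self.remove(self.cache[key])
        self.cache[key] = Node(key, value)
        self.insert(self.cache[key])
        if len(self.cache) > self.cap:
            lru = self.left.next
            self.remove(lru)
            del self.cache[lru.key]

# Replay Example 1
cache = LRUCache(2)
cache.put(1, 1)
cache.put(2, 2)
print(cache.get(1))  # 1
cache.put(3, 3)      # evicts key 2
print(cache.get(2))  # -1
cache.put(4, 4)      # evicts key 1
print(cache.get(1))  # -1
print(cache.get(3))  # 3
print(cache.get(4))  # 4
```

The printed values match the expected outputs from the problem statement, confirming that the eviction order behaves as described.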

Time and Space Complexity

Now that we've explored the two approaches to solving the LRU cache problem, let's analyze their time and space complexities.

Brute-force Approach:

  • Time Complexity:
    • get(key): O(n) – In the worst case, searching for a key in the list may take linear time.
    • put(key, value): O(n) – In the worst case, searching for a key and performing list operations may take linear time.
  • Space Complexity:
    • O(n) – The space complexity depends on the number of key-value pairs stored in the cache, where n is the number of key-value pairs.

The brute-force approach is not optimized for constant-time operations: because list.remove and list.pop(0) must scan the list, both get and put can take O(n) time in the worst case.

Efficient Approach:

  • Time Complexity:
    • get(key): O(1) – Searching for a key in the dictionary and updating the linked list takes constant time.
    • put(key, value): O(1) – Adding or updating a key in the dictionary and performing linked list operations takes constant time.
  • Space Complexity:
    • O(capacity) – The space complexity is determined by the cache's capacity.

The efficient approach meets the requirement of O(1) average time complexity for both get and put operations, making it a more optimal solution.

Reasoning Behind Our Efficient Approach

The reasoning behind the efficient approach lies in the use of a doubly linked list and a dictionary.

The dictionary provides fast key lookup, and the doubly linked list maintains the order of keys based on their usage.

By using the left and right dummy nodes, we can easily track the least recently used and most recently used keys.

When we access a key, we move it to the right end of the list to indicate it's the most recently used.

When we need to evict a key, we remove it from the left end (least recently used) of the list.

This approach optimizes both the get and put operations for constant time complexity on average, meeting the requirements of the LRU cache problem efficiently.

Conclusion

In this guide, we explored the LRU Cache problem, a popular interview question often asked by companies like Twitch.

We discussed the problem's constraints and provided both a brute-force approach and an efficient approach to solving it.

The brute-force approach involves maintaining a list of key-value pairs and updating the list to reflect the most recently used keys.

While it works, it doesn't meet the requirement for constant-time complexity (O(1)) for both get and put operations.

The efficient approach leverages a combination of a Python dictionary and a doubly linked list.

The dictionary ensures fast key lookup, while the linked list maintains the order of keys based on their usage.

This approach meets the requirement of O(1) average time complexity for both get and put operations.

In the efficient approach, we introduced the concept of doubly linked lists, dummy nodes, and node manipulation.

This design allows us to efficiently track and manage the LRU cache, making it an optimal solution.

When tackling similar problems, it's crucial to consider both time and space complexity to arrive at efficient solutions.

The LRU cache problem is an excellent example of how data structures and algorithms can be applied to real-world scenarios.
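In fact, the same eviction policy ships in Python's standard library: functools.lru_cache memoizes function calls behind an LRU-bounded cache. A quick illustration (the fib function here is just an example, not part of the LeetCode problem):

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def fib(n: int) -> int:
    # Results are memoized; repeated calls hit the LRU cache instead of recursing.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))           # 832040, computed in linear rather than exponential time
print(fib.cache_info())  # hit/miss statistics for the underlying LRU cache
```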

If you found this guide helpful, please like and engage for more content.

Feel free to comment, ask questions, make suggestions, and share this valuable information with others.

Happy coding!
