Design

Cache

LRU

class Node:
    """Doubly linked list node holding one cached key/value pair."""
    def __init__(self, key=0, value=0, prev=None, next=None):
        self.key = key
        self.val = value
        self.prev = prev
        self.next = next

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.mapper = {}            # key -> node, for O(1) lookup
        self.head = Node()          # sentinel: least recently used end
        self.tail = Node()          # sentinel: most recently used end
        self.head.next = self.tail
        self.tail.prev = self.head
    
    def _insert(self, node):
        # Link node just before the tail sentinel (most recently used).
        prev = self.tail.prev
        prev.next = node
        node.prev = prev
        node.next = self.tail
        self.tail.prev = node
        self.mapper[node.key] = node
    
    def _remove(self, node):
        # Unlink node from the list and drop it from the map.
        prev, nxt = node.prev, node.next
        prev.next = nxt
        nxt.prev = prev
        node.prev = None
        node.next = None
        del self.mapper[node.key]
    
    def get(self, key: int) -> int:
        if key not in self.mapper:
            return -1
        node = self.mapper[key]
        self._remove(node)          # move key to the most recently used end
        self._insert(node)
        return node.val
    
    def put(self, key: int, value: int) -> None:
        if key in self.mapper:
            node = self.mapper[key]
            node.val = value
            self._remove(node)
            self._insert(node)
        else:
            if len(self.mapper) >= self.capacity:
                self._remove(self.head.next)  # evict the least recently used
            node = Node(key, value)
            self._insert(node)

LFU

HashTable

  1. Load Factor: Monitor the load factor (elements vs. slots), and expand when it exceeds a threshold (usually around 0.7 to 0.8) to avoid performance degradation.

  2. Expansion Strategy: Decide how much to increase the size when expanding, with doubling the size being a common choice.

  3. Rehashing: Rehash existing elements efficiently when expanding by recalculating hash codes and redistributing them.

  4. Timing: Choose when to trigger expansion—immediately or deferred—to balance collision prevention and computational cost.

  5. Concurrent Access: Handle expansion carefully in concurrent environments, considering synchronization.

  6. Memory: Be mindful of temporary memory usage increase during expansion.

  7. Performance Testing: Test and benchmark to ensure your chosen parameters align with your use case and performance requirements.
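Points 1–3 above can be sketched as a minimal chaining hash table that doubles its slot array once the load factor crosses 0.75 (the class and the `_rehash` helper are illustrative names, not from any standard library):

```python
class SimpleHashMap:
    """Minimal chaining hash map illustrating load-factor-driven resizing."""
    def __init__(self, initial_slots=8, max_load=0.75):
        self.slots = [[] for _ in range(initial_slots)]
        self.size = 0
        self.max_load = max_load

    def _bucket(self, key):
        # Map a key to its bucket for the current slot count.
        return self.slots[hash(key) % len(self.slots)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)
                return
        bucket.append((key, value))
        self.size += 1
        if self.size / len(self.slots) > self.max_load:  # point 1: load factor
            self._rehash(2 * len(self.slots))            # point 2: double

    def _rehash(self, new_slot_count):
        # Point 3: redistribute every entry against the new slot count.
        old_slots = self.slots
        self.slots = [[] for _ in range(new_slot_count)]
        for bucket in old_slots:
            for key, value in bucket:
                self._bucket(key).append((key, value))

    def get(self, key, default=None):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return default
```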

Queue

Heap

A linear-time heapify builds a heap in place: start at the last internal node (index n//2 - 1 in an array-backed heap) and move toward the root, sifting each node down until the subtree rooted at it satisfies the heap property.
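A minimal sketch of that bottom-up construction for a min-heap (the `_sift_down` helper name is my own):

```python
def heapify(arr):
    """Build a min-heap in place in O(n): sift down every internal node,
    starting from the last one and moving toward the root."""
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):  # n//2 - 1 is the last internal node
        _sift_down(arr, i, n)

def _sift_down(arr, i, n):
    # Push arr[i] down until both children are >= it (min-heap property).
    while True:
        smallest = i
        left, right = 2 * i + 1, 2 * i + 2
        if left < n and arr[left] < arr[smallest]:
            smallest = left
        if right < n and arr[right] < arr[smallest]:
            smallest = right
        if smallest == i:
            return
        arr[i], arr[smallest] = arr[smallest], arr[i]
        i = smallest
```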

Priority Queue

Stack

Min Stack

Max Stack

Union-Find

File System
