One Pager Cheat Sheet
- This tutorial gives an overview of LFU caching and its various implementations.
- The LFU Cache algorithm measures access frequency and replaces the block with the least frequency when the cache is full.
- An LFU Cache evicts blocks based on frequency, regardless of recent access, while an LRU Cache evicts blocks based on least recent access.
- The task is to design a data structure for an LFU Cache with operations to put and get values, and a constructor to set the initial capacity; when the cache is full, it replaces the least frequently used key, breaking ties by evicting the least recently used one.
- The LFU cache evicts the least frequently used memory block, regardless of when it was last accessed, by keeping an access counter for each key.
- An LFU Cache can be implemented using a HashMap with O(1) access time, but O(n) time to replace an entry when the cache is full.
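The HashMap approach above can be sketched as follows. This is a minimal illustration (not the tutorial's exact code), using a Python dict as the hash map, a per-key access counter, and a logical clock to break frequency ties by least recent use; get is O(1), while put must do an O(n) scan to find a victim when the cache is full.

```python
class LFUCache:
    """Minimal LFU sketch: O(1) get, O(n) eviction scan on put when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}  # key -> value
        self.freq = {}   # key -> access count
        self.last = {}   # key -> logical time of last access (LRU tie-break)
        self.tick = 0

    def get(self, key):
        if key not in self.store:
            return -1
        self.tick += 1
        self.freq[key] += 1
        self.last[key] = self.tick
        return self.store[key]

    def put(self, key, value):
        if self.capacity <= 0:
            return
        self.tick += 1
        if key in self.store:
            self.store[key] = value
            self.freq[key] += 1
            self.last[key] = self.tick
            return
        if len(self.store) >= self.capacity:
            # O(n) scan: evict the lowest-frequency key,
            # breaking ties by least recent access.
            victim = min(self.store, key=lambda k: (self.freq[k], self.last[k]))
            del self.store[victim], self.freq[victim], self.last[victim]
        self.store[key] = value
        self.freq[key] = 1
        self.last[key] = self.tick
```

For example, with capacity 2: after put(1, 1), put(2, 2), get(1), a put(3, 3) evicts key 2, since key 1 has been accessed more often.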
- The main advantage of a HashMap is that it provides constant-time access to items, regardless of the size of the map.
- We can implement an LFU Cache using a Singly Linked List, which allows us to access the least recently used element in O(1) time, but the worst-case complexity for updating is O(n).
- We can implement an LFU Cache using a Doubly Linked List, taking O(1) time for insertions/updates and lookups.
- It is much more efficient to use a Doubly Linked List, which locates the least recently accessed entry with the lowest frequency in O(1) time, as opposed to a Singly Linked List, which requires O(n) time.
- We implemented an LFU cache using a HashMap, a Singly Linked List, and a Doubly Linked List.
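One common way to realize the O(1) doubly-linked-list design is a hash map of per-frequency lists ordered from least to most recently used. The sketch below is an assumption about that design rather than the tutorial's exact code; it uses Python's OrderedDict (which is backed by a doubly linked list internally) as each frequency bucket, so every get and put runs in O(1).

```python
from collections import defaultdict, OrderedDict

class LFUCacheO1:
    """LFU sketch with O(1) get/put: one OrderedDict (a doubly linked
    list internally) per frequency, keys kept in LRU order within it."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.key_to_val = {}
        self.key_to_freq = {}
        self.freq_to_keys = defaultdict(OrderedDict)  # freq -> keys, LRU first
        self.min_freq = 0

    def _touch(self, key):
        # Move key from its current frequency bucket to the next one.
        freq = self.key_to_freq[key]
        del self.freq_to_keys[freq][key]
        if not self.freq_to_keys[freq]:
            del self.freq_to_keys[freq]
            if self.min_freq == freq:
                self.min_freq = freq + 1
        self.key_to_freq[key] = freq + 1
        self.freq_to_keys[freq + 1][key] = None

    def get(self, key):
        if key not in self.key_to_val:
            return -1
        self._touch(key)
        return self.key_to_val[key]

    def put(self, key, value):
        if self.capacity <= 0:
            return
        if key in self.key_to_val:
            self.key_to_val[key] = value
            self._touch(key)
            return
        if len(self.key_to_val) >= self.capacity:
            # Evict the least recently used key in the lowest-frequency bucket.
            victim, _ = self.freq_to_keys[self.min_freq].popitem(last=False)
            del self.key_to_val[victim], self.key_to_freq[victim]
        self.key_to_val[key] = value
        self.key_to_freq[key] = 1
        self.freq_to_keys[1][key] = None
        self.min_freq = 1
```

Because the bucket already keeps keys in access order, finding the eviction victim is a single popitem at the front of the lowest-frequency list, which is exactly the O(1) lookup a singly linked list cannot provide.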