Related Chapter: LRU Cache

#### Problem Statement:

Design and implement a data structure for a Least Frequently Used (LFU) cache. It should support two operations: get and put.

get(key) - Get the value (will always be positive) of the key if the key exists in the cache, otherwise return -1.

put(key, value) - Set or insert the value if the key is not already present. When the cache reaches its capacity, it should invalidate the least frequently used item before inserting a new item. For the purpose of this problem, when there is a tie (i.e., two or more keys that have the same frequency), the least recently used key would be evicted.

Both get and put must be O(1).

#### Solution:

A Few Things About the Data Structures Used:
We will use a DOUBLY LINKED LIST whose head holds the keys with the least frequency and whose tail holds the keys with the greatest frequency; the nodes are kept sorted by frequency. Each node of the doubly linked list contains a LinkedHashSet (an ArrayList also works) holding all the keys that share that node's frequency. Within a node, the first element (at index 0) is the LEAST RECENTLY USED key (convince yourself: keys are appended as they arrive at that frequency, so the oldest sits at the front). We also keep a map that tracks which doubly-linked-list node each key belongs to.

We keep another map just to map the key to the value.
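As a sketch, the bookkeeping described above might look like this in Java. The class and field names (FrequencyNode, keyToValue, keyToNode) are my own, not fixed by the text:

```java
import java.util.HashMap;
import java.util.LinkedHashSet;
import java.util.Map;

// One node of the doubly linked list: all keys in it share one frequency.
class FrequencyNode {
    int frequency;                                        // shared by every key in this node
    LinkedHashSet<Integer> keys = new LinkedHashSet<>();  // insertion order => first key is the LRU
    FrequencyNode prev, next;                             // head = lowest frequency, tail = highest

    FrequencyNode(int frequency) { this.frequency = frequency; }
}

// The two maps described in the text, plus the head of the frequency list.
class LfuStructures {
    Map<Integer, Integer> keyToValue = new HashMap<>();       // key -> cached value
    Map<Integer, FrequencyNode> keyToNode = new HashMap<>();  // key -> its frequency node
    FrequencyNode head;                                       // node with the least frequency
}
```

Because a LinkedHashSet iterates in insertion order, the first key returned by its iterator is the oldest one added at that frequency, which is what makes the "element at index 0 is the least recently used" claim work.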

#### GET Algorithm:

```
if (the key is not present) {
    return -1;
}
else {
    1) increase the frequency of this key by 1
    2) return the value for this key from the map
}
```
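The GET flow can be illustrated with a deliberately simplified sketch that replaces the linked-list bookkeeping with a plain per-key counter (the names here are hypothetical); the real structure would additionally move the key to the next frequency node:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified illustration of the two GET steps only.
class GetSketch {
    Map<Integer, Integer> keyToValue = new HashMap<>();
    Map<Integer, Integer> keyToFrequency = new HashMap<>(); // stands in for the node bookkeeping

    int get(int key) {
        if (!keyToValue.containsKey(key)) {
            return -1;                              // key absent
        }
        keyToFrequency.merge(key, 1, Integer::sum); // accessing the key raises its frequency by 1
        return keyToValue.get(key);                 // the value comes from the key -> value map
    }
}
```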

#### PUT Algorithm:

```
if (the cache already contains the key) {
    1) just update the value in the map
    2) increase the frequency of this key
       - updating an existing key also counts as accessing it,
         so increase its frequency by 1
}
else {
    if (the cache still has room for a new key, i.e., capacity not yet reached) {
        1) put the key and value in the map
        2) add the key to the node with frequency ZERO, since this
           (key, value) pair is newly inserted and unused so far
           - if the HEAD already has frequency ZERO (some other key is
             also still unused), just add this key to the nodeList of
             the HEAD. the list stays sorted by frequency, with the
             HEAD holding the lowest frequency and the tail the highest
           - if the HEAD has a frequency other than ZERO, then every
             previously inserted key has been accessed (by a get
             operation) at least once, so no key has frequency ZERO.
             create a new HEAD node with frequency ZERO and put this
             key in its nodeList
    }
    else (i.e., the cache does not have any more capacity) {
        1) remove the LEAST RECENTLY USED key with the LEAST FREQUENCY
           from the cache
           - basically what we do is: go to the HEAD and take the first
             element of its nodeList (the element at index 0) and
             remove it from the list
           - if the nodeList of the HEAD is now empty, the HEAD node
             itself needs to be removed, as it has zero elements left
           - remove the deleted key from the cache
             (keyToValue map and keyToNode map)

        2) add the key and value to the cache map

        3) add the key to the node with frequency ZERO, since this
           (key, value) pair is newly inserted and unused so far
           - if the HEAD already has frequency ZERO, just add this key
             to the nodeList of the HEAD, keeping the list in sorted
             order of frequency with the HEAD holding the lowest
             frequency and the tail the highest
           - if the HEAD has a frequency other than ZERO, then every
             previously inserted key has been accessed at least once,
             so no key has frequency ZERO. create a new HEAD node with
             frequency ZERO and put this key in its nodeList
    }
}
```
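Putting the GET and PUT steps together, here is one possible self-contained Java sketch of the whole design. All class and method names are my own; as in the text, new keys start at frequency ZERO, and ties are broken by evicting the least recently used key of the least-frequent node:

```java
import java.util.HashMap;
import java.util.LinkedHashSet;
import java.util.Map;

class LFUCache {
    // One node of the doubly linked frequency list.
    private static class Node {
        int frequency;
        LinkedHashSet<Integer> keys = new LinkedHashSet<>(); // first key = LRU at this frequency
        Node prev, next;
        Node(int frequency) { this.frequency = frequency; }
    }

    private final int capacity;
    private final Map<Integer, Integer> keyToValue = new HashMap<>();
    private final Map<Integer, Node> keyToNode = new HashMap<>();
    private Node head; // node with the lowest frequency

    LFUCache(int capacity) { this.capacity = capacity; }

    public int get(int key) {
        if (!keyToValue.containsKey(key)) return -1;
        increaseFrequency(key);          // accessing the key raises its frequency
        return keyToValue.get(key);
    }

    public void put(int key, int value) {
        if (capacity <= 0) return;
        if (keyToValue.containsKey(key)) {
            keyToValue.put(key, value);  // update the value
            increaseFrequency(key);      // updating also counts as an access
            return;
        }
        if (keyToValue.size() >= capacity) {
            evict();                     // drop the LRU key of the least-frequent node
        }
        keyToValue.put(key, value);
        addToFrequencyZero(key);
    }

    private void addToFrequencyZero(int key) {
        if (head == null || head.frequency != 0) {
            Node node = new Node(0);     // no frequency-ZERO bucket yet: create a new HEAD
            node.next = head;
            if (head != null) head.prev = node;
            head = node;
        }
        head.keys.add(key);
        keyToNode.put(key, head);
    }

    private void evict() {
        int lru = head.keys.iterator().next(); // first key in the HEAD bucket is the LRU
        head.keys.remove(lru);
        if (head.keys.isEmpty()) removeNode(head);
        keyToValue.remove(lru);
        keyToNode.remove(lru);
    }

    private void increaseFrequency(int key) {
        Node node = keyToNode.get(key);
        Node next = node.next;
        if (next == null || next.frequency != node.frequency + 1) {
            next = new Node(node.frequency + 1); // insert a bucket for frequency + 1
            next.prev = node;
            next.next = node.next;
            if (node.next != null) node.next.prev = next;
            node.next = next;
        }
        node.keys.remove(key);
        next.keys.add(key);              // appended last => most recently used in its bucket
        keyToNode.put(key, next);
        if (node.keys.isEmpty()) removeNode(node); // never keep empty buckets
    }

    private void removeNode(Node node) {
        if (node.prev != null) node.prev.next = node.next; else head = node.next;
        if (node.next != null) node.next.prev = node.prev;
    }
}
```

Every operation touches only hash maps and a constant number of neighboring list nodes, which is what gives the O(1) bounds claimed below.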

#### Time Complexity:

Get operation: O(1)
Put operation: O(1)

#### Drawbacks of LFU Cache in general:

While the LFU method may seem like an intuitive approach to memory management, it is not without faults. Consider an item in memory that is referenced repeatedly for a short period of time and then not accessed again for an extended period. Because it was accessed so rapidly, its counter has increased drastically, even though it will not be used again for quite some time. This leaves other blocks, which may actually be used more frequently overall, susceptible to purging simply because their accesses came later.

Moreover, items that have just entered the cache are liable to be removed again very soon, because they start with a low counter, even though they might be used very frequently afterwards. Due to major issues like these, a pure LFU system is fairly uncommon; instead, there are hybrids that combine LFU concepts with other policies.
