How to implement a time-based LRU cache in Python


When you are building applications, you may want to add a cache to parts of your application.

For example, consider a web directory that lists recipes. The directory has thousands of recipes available. You may want to add caching to database queries so that you don’t have to look up the database every time a user makes a request to the web application.

There are two caching strategies that are commonly used: time-based and Least Recently Used (LRU). You can combine them both to achieve a more desirable result.

Let’s talk about both time-based and LRU caches, how they work, what properties they have, and when you may want to implement a hybrid time-based LRU cache.

An introduction to time-based and LRU caches

A common type of cache is a time-based cache, which caches data until an expiry date. With a time-based cache, we could store results from database queries in memory. Querying data in memory is fast and avoids a round trip to the database, which may be ideal for reducing strain during peak periods of the day. Each item in a time-based cache has a time to live, or TTL: the period of time after which the item should be removed from the cache and the application should be asked to fetch fresh data.

With that said, a time-based cache alone has a limitation: the cache may grow to a large size.

A common mitigation for this limitation is to use a cache with a Least Recently Used (LRU) expiry capability. LRU caches have a maximum size. If the maximum size is exceeded, the least recently accessed item will be removed from the cache.

On its own, an LRU cache does not expire items after a period of time like a time-based cache does. Instead, when a new item is added to a full cache, the least recently accessed item is removed to make way for it.

Combining time-based and LRU caches

We can solve our problem with a time-based LRU cache. This approach combines the best of both worlds: we can add items that expire after a certain time to live (TTL), and remove the least recently used item. With this setup, items can expire if:

  1. The TTL for the item has passed, or;
  2. The item is at the end of the LRU queue. This scenario means every other item has been accessed more recently.

In this guide, we are going to implement a time-based LRU cache.

Note: Implementations of time-based LRU caches already exist in Python. This guide goes behind the scenes to show how you can implement the concept from scratch.
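For reference, the standard library's functools.lru_cache decorator provides the LRU half of this behaviour out of the box. In the sketch below, fetch_recipe is a hypothetical stand-in for a database lookup:

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def fetch_recipe(recipe_id):
    # Hypothetical stand-in for a database query.
    return {"id": recipe_id, "name": f"Recipe {recipe_id}"}

fetch_recipe(1)  # first call: a miss, so the function body runs
fetch_recipe(1)  # second call: a hit, served from the cache
print(fetch_recipe.cache_info())  # hits=1, misses=1
```

lru_cache has no TTL support, however, which is exactly the gap the hybrid cache in this guide fills.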

The implementation

Defining the structure of the code

To implement our time-based LRU cache, we will define a class. This class needs to store three values:

  1. A cache, which stores cached items;
  2. A maximum cache size, which will be used to enforce our LRU cache, and;
  3. A TTL that applies to all items.

This class will have three methods:

  1. A constructor that initializes the values for our class;
  2. A __getitem__ method that runs whenever you try to retrieve an item from the cache, and;
  3. A __setitem__ method that runs whenever you try to add an item into the cache.

Here is the structure of the class in code:

from collections import OrderedDict
from datetime import datetime  # used below to timestamp cache entries

class TimedLRUCache:
    def __init__(self, max_size = 2, ttl = 1):
        self.cache = OrderedDict()
        self.max_size = max_size
        self.ttl = ttl

    def __getitem__(self, key):
        print(key, "item retrieved")

    def __setitem__(self, key, value):
        print(key, "item added with value", value)

self.cache will store our cached values. We use an OrderedDict for the cache: a dictionary that remembers the order in which its keys were inserted and provides convenience methods for manipulating that order. This data structure is ideal because items in our cache can have keys and values, but also preserve the order in which they were inserted. This order can be used to implement the LRU part of our cache.

__getitem__ and __setitem__ are special Python methods that run whenever an instance of a class is indexed like a dictionary.

Let’s test the code above to see how they work:

cache = TimedLRUCache(ttl = 1, max_size = 3)

cache["exile"] = "folklore"
print(cache["exile"])

Our code returns:

exile item added with value folklore
exile item retrieved
None

The first line was printed when we assigned the value folklore to exile. The second line was printed when we tried to retrieve the exile value from our cache. The trailing None is printed because our placeholder __getitem__ does not return anything yet.

With this code, we can treat adding and removing items from our cache like we would a dictionary.

Defining cache states

A valid response from a cache can be in one of two states:

  • HIT, which indicates that the item was in the cache;
  • MISS, which indicates that the item was not in the cache.

If a MISS is returned, this tells us that the requested item has either expired from the cache or was never in the cache. If we get a MISS, it means we need to retrieve an item from scratch (i.e. from a database) then add it to the cache so it can be used later.

Let’s define these values formally as an enum so we can refer to them as Python objects rather than strings:

from enum import Enum

class CacheReturnState(Enum):
    HIT = "HIT"
    MISS = "MISS"

We will use these when we write the code to retrieve items from the cache.
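To see how the enum behaves, note that members can be compared directly and expose their underlying string through .value:

```python
from enum import Enum

class CacheReturnState(Enum):
    HIT = "HIT"
    MISS = "MISS"

state = CacheReturnState.MISS
print(state is CacheReturnState.MISS)  # True
print(state.value)                     # MISS
print(CacheReturnState("HIT"))         # CacheReturnState.HIT
```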

Defining the __setitem__ method

When we add an item to our cache, we want to:

  1. Set the value of the item;
  2. Record the current time as the time the item was added;
  3. Remove the least recently used item from the cache if the cache has exceeded its maximum size.

We can do so using the following code:

def __setitem__(self, key, value):
    self.cache[key] = {"value": value, "added": datetime.now()}

    if len(self.cache) > self.max_size:
        self.cache.popitem(last = False)

Every item in the cache is a dictionary. It has two values:

  1. Whatever the user provided (the value), and;
  2. The date the item was added to the dictionary.

We remove the least recently used item using self.cache.popitem(last = False). popitem is a method available on the OrderedDict class we use for our cache; with last = False, it removes the item at the front of the dictionary. Because retrieving an item will move it to the end of the dictionary, the item at the front is the one that was used least recently.
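The two OrderedDict features our cache relies on can be seen in isolation, using move_to_end and popitem(last = False):

```python
from collections import OrderedDict

d = OrderedDict()
d["exile"] = 1
d["evermore"] = 2
d["betty"] = 3

d.move_to_end("exile")              # "exile" is now the most recently used
key, value = d.popitem(last=False)  # removes the front item, the least recently used
print(key)      # evermore
print(list(d))  # ['betty', 'exile']
```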

We can test that our code lets us set a value like so:

cache = TimedLRUCache(ttl = 1, max_size = 3)

cache["exile"] = "folklore"
print(cache.cache["exile"])

Our code returns:

{'value': 'folklore', 'added': datetime.datetime(...)}

We successfully associate the key we provided (exile) with the value (folklore) and the date and time the item was added.

Defining the __getitem__ method

Now that we can set items in our dictionary, we can write the logic to retrieve items.

When we retrieve items, we need to do a few checks.

First, we need to check if an item is not in the cache. If the item is not in the cache, we are going to return a MISS. This indicates the item was not in the cache. Note that a MISS does not mean an item was never in the cache. It means that, at the time the item was requested, the item was not in the cache.

If a MISS value is returned, this indicates we need to retrieve a value from its source (i.e. from the database) then save it in the cache.

We can implement this check like so:

def __getitem__(self, key):
    if key not in self.cache:
        return None, CacheReturnState.MISS.value

Second, we need to check if an item has expired. We can do this by checking whether the difference in seconds between the time the item was requested and the time it was added is greater than the TTL. For example, if the cache has a TTL of 10 seconds and an item was added 11 seconds ago, that item has expired.
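The arithmetic for that check can be sketched on its own. Here timedelta simulates an item that was added 11 seconds ago:

```python
from datetime import datetime, timedelta

ttl = 10  # seconds
added = datetime.now() - timedelta(seconds=11)  # simulate an 11-second-old item

age = (datetime.now() - added).total_seconds()
print(age > ttl)  # True: the item is older than the TTL, so it has expired
```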

If an item has expired, we need to delete the item from the cache, then return a MISS. We are returning a MISS because the item has expired. If we returned a HIT with the expired value in our cache, we would be returning data that we know is out of date, which does not comply with the expiry date requirement of our cache.

We can remove out of date values from our cache like so:

def __getitem__(self, key):
    if key not in self.cache:
        return None, CacheReturnState.MISS.value

    if (datetime.now() - self.cache[key]["added"]).total_seconds() > self.ttl:
        del self.cache[key]
        return None, CacheReturnState.MISS.value

If neither of the above conditions is met, the item is in the cache and has not expired. At this stage, there are two more things to do:

  1. Move the item to the end of the OrderedDict in self.cache. By moving the item to the end of the dictionary, we record the fact that the item was recently used, and;
  2. Return the item to the user.

We can do so like this:

def __getitem__(self, key):
    if key not in self.cache:
        return None, CacheReturnState.MISS.value

    if (datetime.now() - self.cache[key]["added"]).total_seconds() > self.ttl:
        del self.cache[key]
        return None, CacheReturnState.MISS.value

    self.cache.move_to_end(key)

    return self.cache[key]["value"], CacheReturnState.HIT.value

We now have all of the code we need to add items to, and retrieve items from, our cache.
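For convenience, here is the complete class, with all of the pieces from this guide assembled in one place:

```python
from collections import OrderedDict
from datetime import datetime
from enum import Enum


class CacheReturnState(Enum):
    HIT = "HIT"
    MISS = "MISS"


class TimedLRUCache:
    def __init__(self, max_size = 2, ttl = 1):
        self.cache = OrderedDict()
        self.max_size = max_size
        self.ttl = ttl

    def __setitem__(self, key, value):
        # Store the value alongside the time it was added.
        self.cache[key] = {"value": value, "added": datetime.now()}

        # Enforce the LRU limit: evict the least recently used item.
        if len(self.cache) > self.max_size:
            self.cache.popitem(last = False)

    def __getitem__(self, key):
        if key not in self.cache:
            return None, CacheReturnState.MISS.value

        # Expire items that are older than the TTL.
        if (datetime.now() - self.cache[key]["added"]).total_seconds() > self.ttl:
            del self.cache[key]
            return None, CacheReturnState.MISS.value

        # Mark the item as the most recently used.
        self.cache.move_to_end(key)

        return self.cache[key]["value"], CacheReturnState.HIT.value
```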

Testing the cache

There are four possible conditions when we try to retrieve an item from our cache:

  1. The item is in the cache, so it is returned with a HIT.
  2. The item is not in the cache, so None is returned with a MISS.
  3. The item has expired from the cache because the difference between the retrieval time and the added time is greater than the TTL, so the cache returns a MISS.
  4. The item has been evicted from the cache because the max_size of the cache was exceeded at some point while the item was at the front of the LRU queue, so the cache returns a MISS.

Let’s test all of these conditions one by one.

Initialize a new cache

First, we need to initialize our cache with some items:

data = OrderedDict({
    "exile": "folklore",
    "evermore": "folklore",
    "say don't go": "1989"
})

cache = TimedLRUCache(ttl = 2, max_size = 3)

for key, value in data.items():
    cache[key] = value

We have initialized a cache with a TTL of 2 seconds and a max_size of three items.

Query a valid item

Let’s try to query an item:

print(cache["exile"])

Our code returns:

('folklore', 'HIT')

Because the item was in the cache, its TTL had not passed, and it had not been evicted as the least recently used item, our code returned the value with a HIT.

Query an item that doesn’t exist

Let’s try to query an item that doesn’t exist in our cache:

print(cache["peace"])

Our code returns:

(None, 'MISS')

This is because the item is not in our cache.

Query an item that has expired due to TTL

Now, let’s see what happens when we query the cache after a three second delay:

import time

time.sleep(3)

print(cache["exile"])

Our code returns:

(None, 'MISS')

Because the item was added more than ttl seconds ago, the value for exile had expired, so None was returned with a MISS.

Query an item that has expired due to LRU requirements

To test the LRU capabilities of our cache, let’s re-initialize the cache with the three items from earlier, then add a new item and retrieve the first item that was added:

cache = TimedLRUCache(ttl = 2, max_size = 3)

for key, value in data.items():
    cache[key] = value

cache["betty"] = "folklore"

print(cache["exile"])

This code returns:

(None, 'MISS')

Our cache started with three items. Our code above added a fourth item, betty, which pushed the cache past its max_size of three. When that happened, the least recently used item was removed from the cache: the value for the key exile, the first item added. Thus, our query for exile returned a cache MISS.
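Finally, in a real application a MISS would trigger a fallback to the original data source, as described earlier. In the sketch below, query_database is a hypothetical stand-in for a real lookup, and cache is assumed to return (value, state) tuples like our TimedLRUCache does:

```python
def query_database(key):
    # Hypothetical stand-in for a real database lookup.
    return f"{key} from the database"


def get_with_fallback(cache, key):
    value, state = cache[key]

    if state == "MISS":
        # Cache miss: fetch from the source, then repopulate the cache.
        value = query_database(key)
        cache[key] = value

    return value
```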

Conclusion

Time-based LRU caches are a useful caching mechanism when you want a cache with the following characteristics:

  • Items expire after a period of time.
  • The least recently used item is removed if the cache exceeds a pre-defined length.

With a time-based LRU cache, popular entries are likely to stay in the cache, since every time an entry is retrieved it is marked as the most recently used. But items will still expire over time, which means the values associated with commonly retrieved entries will be refreshed according to the time to live provided.

In this guide, we walked through how to implement a time-based LRU cache in Python. We defined a class that uses the special __getitem__ and __setitem__ methods to let us set and retrieve items in the cache with the same syntax one would use with a Python dictionary. We implemented these methods to enforce our time-based and least recently used requirements. We then tested the cache on several examples that show the valid behaviours of the cache.

The code for this project is available on GitHub.


