LRU Cache Using Go in under 100 lines!

Vague introduction to LRU Caches

Hello fellow internet user! I assume you know what caches are and how they work. Today let’s see how we can design an LRU cache in our favourite language, Go, with as little code as possible.

LRU stands for Least Recently Used.

So whichever entry has been used least recently gets kicked out to make room whenever a new entry arrives and the cache is already at capacity. In an ideal world where you have unlimited storage you don’t need this at all, so go back to enjoying life without performance issues.

I recommend reading up on how LRU caches work first, because here we’re going to implement one in Go rather than explain the mechanics of it.

Needed functions

Like any cache, ours needs two essential operations; they’re sketched as a Go interface right after this list.

  • Put(key, value) - Creates an entry at a key.
  • Get(key) - Retrieves the value stored at a key, returning a sentinel value (-1 in our implementation) when the key isn’t present.
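
Expressed as a Go interface purely for illustration (the post implements these as methods on a struct, with int keys and values):

type Cache interface {
    Put(key, value int) interface{} // insert an entry and return the stored value
    Get(key int) interface{}        // fetch an entry, or the -1 sentinel when absent
}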

A little theory

We will need a hashmap (a dictionary or key-value store, if you will) and a doubly linked list (DLL) to implement an LRU cache.

The DLL will store the actual values of the cache, and the map will store the address of each value’s node in the DLL against its key.

So the map looks something like this:

{
    "key": <node-address-of-DLL>
}

Are we going to implement a DLL from scratch? No. We will use the container/list package from Go’s standard library, which is a fully functional and tested implementation of a DLL.
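
For reference, this is roughly how container/list behaves on its own; note that PushFront hands back a *list.Element we can hold on to (a standalone snippet, not yet part of the cache):

package main

import (
    "container/list"
    "fmt"
)

func main() {
    dll := list.New()         // an empty doubly linked list
    elem := dll.PushFront(42) // PushFront returns a *list.Element for the new node
    dll.PushFront(7)          // 7 is now at the front

    fmt.Println(elem.Value)       // 42
    fmt.Println(dll.Back().Value) // 42, the oldest value sits at the back
}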

Put

We will work with integers for now, so let us define the LRUCache type first and then a hashmap:

type LRUCache struct {
    Cache *list.List // doubly linked list holding the cached values
}

// hashMap maps each key to the address of its node in the DLL.
var hashMap = map[int]*list.Element{}

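The snippets that follow also refer to a cacheLength constant and, later in the API handlers, a package-level cache variable, but neither declaration appears in the post. Here’s a minimal sketch of what they could look like, with a capacity of 3 picked purely for illustration:

const cacheLength = 3 // arbitrary capacity chosen for this example

var cache = LRUCache{Cache: list.New()} // an empty DLL, ready to use
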
We implement a receiver method on this type called Put().

The Put method creates an entry in the cache if there is still room for one. If the cache is at capacity, it removes the least recently used entry (the one at the back of the DLL) and then pushes the new entry to the front.

func (l *LRUCache) Put(key int, value int) interface{} {
    // If there is still room, just push the new value to the front.
    if l.Cache.Len() < cacheLength {
        elem := l.Cache.PushFront(value)
        hashMap[key] = elem
        return value
    }
    // Cache is full: evict the least recently used entry (the back of the DLL),
    // then push the new value to the front.
    l.Cache.Remove(l.Cache.Back())
    elem := l.Cache.PushFront(value)
    hashMap[key] = elem
    return value
}

When we push an entry to the DLL, PushFront returns a pointer to the new element, which we store in the hashmap.
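
One caveat the code above doesn’t handle: when the back element is evicted, its old key is left behind in hashMap. A possible refinement, sketched below under the hypothetical name putAndEvict, stores the key alongside the value in each node so the evicted key can be deleted from the map as well; Get and Print would need matching changes to unwrap the pair, so treat this as an illustration rather than a drop-in replacement.

type pair struct {
    key, value int
}

func (l *LRUCache) putAndEvict(key, value int) interface{} {
    if l.Cache.Len() >= cacheLength {
        back := l.Cache.Back()
        delete(hashMap, back.Value.(pair).key) // forget the evicted key too
        l.Cache.Remove(back)
    }
    hashMap[key] = l.Cache.PushFront(pair{key, value})
    return value
}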

Get

Get isn’t necessarily just a read operation in the case of an LRU cache. Since we’re keeping track of the least recently used entries, we make sure the entries retrieved by Get end up at the front of the DLL, ready for immediate future use. We do that by removing the node we just retrieved, pushing its value to the front again, and storing the new address in the hashmap.

func (l *LRUCache) Get(key int) interface{} {
    node := hashMap[key]
    if node == nil {
        // Key not present: return the sentinel value.
        return -1
    }
    // Move the accessed entry to the front of the DLL so it counts
    // as the most recently used.
    value := node.Value
    l.Cache.Remove(node)
    newPositionElem := l.Cache.PushFront(value)
    hashMap[key] = newPositionElem
    return value
}

We also implement a Print function that iterates over the DLL, so we can see the state of the cache at runtime.

func (l *LRUCache) Print() {
    // Walk the DLL from front (most recently used) to back (least recently used).
    for element := l.Cache.Front(); element != nil; element = element.Next() {
        fmt.Print(element.Value, "-->")
    }
    fmt.Println()
}
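
To see the cache in action, we can exercise it directly; the expected output assumes the capacity of 3 from the earlier sketch:

cache.Put(1, 10)
cache.Put(2, 20)
cache.Put(3, 30)
cache.Print() // 30-->20-->10-->

cache.Get(1)  // touching key 1 moves its value back to the front
cache.Print() // 10-->30-->20-->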

API

Let’s add a simple API around the LRU cache we just created, with two endpoints: GET and PUT. We’re not adding extra validation for HTTP methods or JSON serialization, and the integer passed in the URL serves as both the key and the value.

func GetRoute(w http.ResponseWriter, r *http.Request) {
    params := mux.Vars(r)

    // The cache is keyed by int, so convert the path parameter first.
    key, err := strconv.Atoi(params["string"])
    if err != nil {
        return
    }
    value := cache.Get(key)
    cache.Print()
    bytes, _ := json.Marshal(value)

    _, _ = w.Write(bytes)
}

func PutRoute(w http.ResponseWriter, r *http.Request) {
    params := mux.Vars(r)

    i, err := strconv.Atoi(params["integer"])
    if err != nil {
        return
    }
    // The value itself doubles as the key.
    insertedValue := cache.Put(i, i)
    cache.Print()
    bytes, _ := json.Marshal(insertedValue)

    _, _ = w.Write(bytes)
}
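
The post doesn’t show how these handlers are wired up, so here’s a minimal routing sketch using gorilla/mux; the /get and /put paths and the port are assumptions, and only the {string} and {integer} parameter names are dictated by the mux.Vars lookups above:

func main() {
    router := mux.NewRouter()
    router.HandleFunc("/get/{string}", GetRoute).Methods("GET")
    router.HandleFunc("/put/{integer}", PutRoute).Methods("PUT")

    _ = http.ListenAndServe(":8080", router)
}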

Here’s the gist of the whole implementation - LRU Cache

Make sure you check the logs to see how the DLL is being manipulated.

Conclusion

Let me know how I can improve, or point out any areas where I have gone wrong.

Please reach out to me via email (or any of the social media linked down below) if you think I haven’t covered something you consider important.

Thank you 😁

This post is licensed under CC BY 4.0 by the author.