Fundamentals of Caching

The Thundering Herd Problem

I have a client who implemented the following caching strategy:

In this strategy, each server maintains its own local, in-memory cache, preventing repeated requests to the original data source.
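A minimal sketch of such a per-server cache might look like this (the `loader` callback, standing in for the call to the original data source, is a hypothetical name used for illustration):

```python
import time


class LocalCache:
    """A minimal per-server, in-memory cache with a fixed TTL."""

    def __init__(self, ttl_seconds, loader):
        self.ttl = ttl_seconds
        self.loader = loader      # fetches from the original data source
        self.entries = {}         # key -> (value, expires_at)

    def get(self, key):
        now = time.monotonic()
        entry = self.entries.get(key)
        if entry is not None and entry[1] > now:
            return entry[0]       # cache hit: no trip to the source
        value = self.loader(key)  # cache miss or expired: hit the source
        self.entries[key] = (value, now + self.ttl)
        return value
```

With a 60-second TTL, repeated `get("a")` calls within a minute invoke the loader only once; each server holds its own `LocalCache` instance, so nothing is shared between servers.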

The beauty of this strategy is that it is simple to understand and to implement. Of course, it has drawbacks. For example, because the caches do not share entries, requests for the very same data are sent to the source once per cache at the edge. Still, in the long run, this strategy is better than no cache at all.

One interesting consequence: if all caches use the same expiration time, all servers will likely ask the source for the same data at the same moment, stressing it. That is a simple version of the thundering herd problem.

A simple and effective solution is to introduce jitter: randomizing (a little) the expiration time for each cache, so the expirations spread out instead of landing together. This solution was described by YouTube (the video does not talk only about the thundering herd problem, and it is a little outdated, but it is still relevant).
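The jitter idea can be sketched in a few lines; the base TTL and jitter window below are hypothetical values chosen for illustration:

```python
import random
import time

BASE_TTL = 300   # base expiration: five minutes (illustrative value)
JITTER = 30      # up to 30 extra seconds of random spread


def expires_at(now=None):
    """Compute an expiration timestamp with jitter, so caches
    populated at the same moment do not all expire together."""
    if now is None:
        now = time.monotonic()
    return now + BASE_TTL + random.uniform(0, JITTER)
```

Each cache entry now expires somewhere in a 30-second window rather than at one exact instant, so the refresh requests to the source are spread out instead of arriving as a herd.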

As another excellent resource for learning how to solve the thundering herd problem in a much more complex scenario, I recommend this short video from the Facebook Engineering team.

Elemar Júnior

Microsoft Regional Director and Microsoft MVP. I have been developing world-class business software for more than two decades and had the privilege of helping change the way Brazil sells, designs, and produces furniture. Today, my technical interests are scalable architectures, database engines, and integration tools. I am also passionate about exponential organizations and business strategy.
