Make your app faster - Use Caching 💨⚡️

Speed is currency. Fast, responsive applications not only retain users but also boost revenue. One way to improve performance is through caching. Let's see what caching is and how it enhances performance. We'll also see how caching goes beyond performance and has other benefits.

If you prefer the video version, here is the link 😉:

What's caching?

Any interaction between two entities involves computation time and data transport time. To increase the interaction speed, we need to reduce those two factors. In a network interaction, we might want to reduce the transport time, while in the case of an application querying a database, we might want to optimise the query computation.

Caching is a technique that aims to reduce response time by taking a shortcut instead of going through the whole interaction. The idea is to store the result of an expensive computation or the result of a previous interaction and reuse it when the same computation or interaction is needed again.
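Here is a minimal sketch of that idea in TypeScript: an in-memory map stores the result of an expensive function so repeated calls with the same input take the shortcut. The function and its simulated cost are purely illustrative.

```typescript
// A minimal sketch of caching an expensive computation in memory.
// `expensiveComputation` is a hypothetical stand-in for any slow function.
const cache = new Map<string, number>();

function expensiveComputation(input: string): number {
  // Simulate heavy work (e.g. parsing, aggregation, a remote call).
  let result = 0;
  for (let i = 0; i < 1_000_000; i++) {
    result += input.length * i;
  }
  return result;
}

function cachedComputation(input: string): number {
  const hit = cache.get(input);
  if (hit !== undefined) {
    return hit; // shortcut: reuse the stored result
  }
  const result = expensiveComputation(input);
  cache.set(input, result); // store it for the next identical request
  return result;
}

cachedComputation("report-2024"); // computed
cachedComputation("report-2024"); // served from the cache
```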

Caching overview animation

Caching everywhere

Let's take your browser as an example. When you visit a website, your browser stores the files that make up the website on your hard drive. The next time you visit the website, your browser will load the files from your hard drive instead of downloading them again from the server.
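As a rough illustration, a server can opt into this behaviour with HTTP caching headers. The sketch below uses Node's built-in http module (no particular framework is implied by the article) and tells the browser it may reuse the file for up to a day.

```typescript
// A minimal sketch of asking the browser to cache a response,
// using Node's built-in http module.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  if (req.url === "/logo.svg") {
    res.writeHead(200, {
      "Content-Type": "image/svg+xml",
      // The browser may keep this file for up to one day and reuse it
      // on the next visit instead of downloading it again.
      "Cache-Control": "public, max-age=86400",
    });
    res.end("<svg xmlns='http://www.w3.org/2000/svg'></svg>");
    return;
  }
  res.writeHead(404);
  res.end();
});

server.listen(3000);
```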

Caching is not limited to the browser. A CDN server could be sending you a cached version of the website you are visiting. A server could send a cached response instead of generating a new one for each request or querying fresh data from a database. Databases themselves have caches: they might keep the results of frequent queries and save on lookup time. Caching goes all the way down to the hardware. CPUs have a sophisticated caching mechanism that speeds up computation by reducing access to the much slower main memory.

Caching of a network call at different stages

As you can see, caching happens at different levels and places. No matter when and where it happens, the principle is the same: we want to move the data closer to the consumer and potentially use more performant storage.

Dealing with stale data

If the data at the source changes, the cached data becomes stale. A common approach to dealing with stale data is to set a time to live (TTL) for the cached data. Once the TTL has passed, the data is considered stale, and the next time it is requested, the cache fetches it from the source and stores it again. Other strategies invalidate the cache when the source data changes, either by pushing the change to the cache or by having the cache check the source for changes.
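A TTL cache can be sketched in a few lines. The `fetchFromSource` callback and the 30-second TTL below are illustrative assumptions, not a specific library API.

```typescript
// A minimal sketch of a TTL-based cache: entries expire after a fixed
// time-to-live and are re-fetched from the source on the next request.
type Entry<V> = { value: V; expiresAt: number };

class TtlCache<K, V> {
  private store = new Map<K, Entry<V>>();

  constructor(private ttlMs: number) {}

  async get(key: K, fetchFromSource: (key: K) => Promise<V>): Promise<V> {
    const entry = this.store.get(key);
    if (entry && entry.expiresAt > Date.now()) {
      return entry.value; // still fresh
    }
    // Stale or missing: go back to the source and cache the new value.
    const value = await fetchFromSource(key);
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
    return value;
  }
}

// Usage: cache user records for 30 seconds (fetchUser is hypothetical).
const users = new TtlCache<string, { name: string }>(30_000);
// await users.get("42", fetchUser);
```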

Cache storage is limited in size and requires cache eviction when the capacity is reached. Strategies for cache eviction include removing the least recently used, least frequently used, or most expensive data. The chosen strategy depends on the use case and caching implementation.
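For example, a least-recently-used (LRU) policy can be sketched on top of a JavaScript Map, whose insertion order makes the oldest entry easy to find. This is an illustrative sketch, not a production implementation.

```typescript
// A minimal sketch of least-recently-used (LRU) eviction, relying on
// Map's insertion order: the first key in the Map is the oldest one.
class LruCache<K, V> {
  private store = new Map<K, V>();

  constructor(private capacity: number) {}

  get(key: K): V | undefined {
    const value = this.store.get(key);
    if (value === undefined) return undefined;
    // Re-insert to mark the entry as most recently used.
    this.store.delete(key);
    this.store.set(key, value);
    return value;
  }

  set(key: K, value: V): void {
    if (this.store.has(key)) this.store.delete(key);
    this.store.set(key, value);
    if (this.store.size > this.capacity) {
      // Evict the least recently used entry (the oldest key in the Map).
      const oldestKey = this.store.keys().next().value as K;
      this.store.delete(oldestKey);
    }
  }
}

const cache = new LruCache<string, string>(2);
cache.set("a", "1");
cache.set("b", "2");
cache.get("a");      // "a" is now the most recently used
cache.set("c", "3"); // evicts "b"
```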

Caching adds complexity to your system. You should only add caching when you have identified a performance problem and measured that caching will solve it.

Even if performance is not a primary concern, there are other reasons you might want to use caching.

Hidden benefits

You could have a server that receives a high volume of requests. For each request, your server sends a query to your database. Your server could handle the load, but your database might not. If you place a cache in front of the database, you can reduce the load on it and improve the overall system performance.

Every query sent to your database requires computation resources. Additionally, if your database is not part of your server, each query may require a network call. Remember that network traffic and computation time are not free. You could make substantial savings on your cloud or infrastructure bill if your server caches query results.
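One common way to get that saving is the cache-aside pattern: the server checks the cache first and only queries the database on a miss. In the sketch below, `queryDatabase` and the `Product` shape are hypothetical placeholders.

```typescript
// A minimal sketch of the cache-aside pattern: check the cache first,
// hit the database only on a miss, then store the result for next time.
type Product = { id: string; name: string; price: number };

const productCache = new Map<string, Product>();

async function queryDatabase(id: string): Promise<Product> {
  // Stand-in for a real (and relatively expensive) database query.
  return { id, name: "placeholder", price: 0 };
}

async function getProduct(id: string): Promise<Product> {
  const cached = productCache.get(id);
  if (cached) {
    return cached; // no database round trip, no extra network call
  }
  const product = await queryDatabase(id); // only on a cache miss
  productCache.set(id, product);
  return product;
}
```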


Caching is a powerful technique to improve performance. It's not only about reducing response time; it's also about reducing the load on the source and the network. Caching adds complexity to your system, so use it when you have identified a performance issue worth solving.

Caching is a fundamental concept, and mastering it will help you make sense of related ones such as CDNs, load balancers, and proxy servers.
