When building a web application, we often need to perform actions that are expensive: they may be computationally heavy, take a long time to complete, or require a call to an external API that costs money per request. There are many examples of this, but these are some of the most common ones.
In many cases, a simple solution is to use caching: storing the result of an action so we don't have to repeat the work when the same data is requested again.
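As a baseline, here's what caching looks like with nothing but a plain Map (the helper names here are illustrative, not from any library):

```typescript
// A minimal caching sketch using a plain Map.
const cache = new Map<string, number>();

function expensiveSquare(n: number): number {
  const key = `square:${n}`;
  const hit = cache.get(key);
  if (hit !== undefined) return hit; // cached result, no recomputation
  const result = n * n;              // stand-in for the expensive work
  cache.set(key, result);
  return result;
}

expensiveSquare(4); // computes and stores 16
expensiveSquare(4); // returns 16 from the Map without recomputing
```

The catch is that a plain Map grows without bound. An LRU (least recently used) cache caps its size by evicting the entry that hasn't been touched for the longest time, which is exactly what the lru-cache package gives us.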
There are many different approaches to caching, but in this post I'll show you how to use the lru-cache package to implement an LRU (least recently used) cache in Node.js with TypeScript.
Setting Up the LRU Cache
To start, we'll need to install the lru-cache package:
npm install lru-cache
Then, we'll set up an LRU Cache to store user data. This cache will have a maximum size of 5, meaning it can hold up to 5 user objects at a time. Here's how we initialize it:
import { LRUCache } from 'lru-cache';

// The shape of the objects we'll be caching.
interface User {
  id: number;
  name: string;
  email: string;
}

// Holds at most 5 users; the least recently used entry is evicted first.
const userCache = new LRUCache<number, User>({ max: 5 });
Fetching Data from an API
Next, we need to simulate fetching data from an external API. We'll create a function called fetchUserFromAPI that takes a user ID and returns the matching user object, or null if no user exists. This function includes an artificial delay to mimic the time it takes to fetch data over the network.
async function fetchUserFromAPI(userId: number): Promise<User | null> {
  console.log(`Fetching user data for ID: ${userId} from API...`);
  // Simulate network latency with a 500ms delay.
  await new Promise((resolve) => setTimeout(resolve, 500));
  // Mock data standing in for a real user service.
  const users: User[] = [
    { id: 1, name: 'Alice', email: 'alice@example.com' },
    { id: 2, name: 'Bob', email: 'bob@example.com' },
    { id: 3, name: 'Charlie', email: 'charlie@example.com' },
    { id: 4, name: 'Diana', email: 'diana@example.com' },
    { id: 5, name: 'Eve', email: 'eve@example.com' },
    { id: 6, name: 'Frank', email: 'frank@example.com' },
  ];
  const user = users.find((user) => user.id === userId);
  return user || null;
}
Using the LRU Cache
Now, let's create a function called getUser that uses our LRU cache. This function will first check whether the user data is already in the cache. If it is, we'll return the cached data. If not, we'll fetch the data from the API and add it to the cache.
async function getUser(userId: number): Promise<User | null> {
  // 1. Check the cache first.
  const cachedUser = userCache.get(userId);
  if (cachedUser) {
    console.log(`User data for ID: ${userId} found in cache.`);
    return cachedUser;
  }
  // 2. On a miss, fetch from the API and cache the result.
  const user = await fetchUserFromAPI(userId);
  if (user) {
    userCache.set(userId, user);
  }
  return user;
}
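One subtlety of this check-then-fetch pattern: if two concurrent callers ask for the same uncached user, both will miss the cache and both will hit the API. A common refinement, sketched here with hypothetical names rather than anything from the lru-cache API, is to also memoize the in-flight Promise:

```typescript
// Sketch: dedupe concurrent fetches by caching the in-flight Promise.
// `loadUser` is a hypothetical stand-in for a slow API call.
const inFlight = new Map<number, Promise<string>>();
let apiCalls = 0;

async function loadUser(userId: number): Promise<string> {
  apiCalls++; // count real "API" calls for demonstration
  await new Promise((resolve) => setTimeout(resolve, 50));
  return `user-${userId}`;
}

function getUserDeduped(userId: number): Promise<string> {
  let pending = inFlight.get(userId);
  if (!pending) {
    // Store the Promise itself, and clear it once it settles.
    pending = loadUser(userId).finally(() => inFlight.delete(userId));
    inFlight.set(userId, pending);
  }
  return pending;
}

// Two simultaneous requests for the same ID share one API call.
Promise.all([getUserDeduped(7), getUserDeduped(7)]).then(([a, b]) => {
  console.log(a === b, apiCalls); // prints: true 1
});
```

Once the fetch settles, the entry is removed from the in-flight map, so the result cache (like our userCache) remains the source of truth for completed lookups.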
Testing the LRU Cache
To see our LRU cache in action, we'll create a main function that makes several requests for user data. This will demonstrate how the cache works and how it evicts the least recently used item when it's full.
async function main() {
  // First request: not cached yet, fetched from the API
  console.log('First Request');
  let user1 = await getUser(1);
  console.log('User 1:', user1);
  console.log('-------------------');
  // Second request for the same user: served from the cache
  console.log('Second Request');
  user1 = await getUser(1);
  console.log('User 1:', user1);
  console.log('-------------------');
  // A different user: fetched from the API
  console.log('Third Request');
  const user2 = await getUser(2);
  console.log('User 2:', user2);
  console.log('-------------------');
  // Another new user: fetched from the API
  console.log('Fourth Request');
  const user3 = await getUser(3);
  console.log('User 3:', user3);
  console.log('-------------------');
  // The first user again: served from the cache, which also marks it
  // as most recently used
  console.log('Fifth Request');
  const user1Again = await getUser(1);
  console.log('User 1 Again:', user1Again);
  console.log('-------------------');
  // A user we haven't fetched yet: fetched from the API
  console.log('Sixth Request');
  const user4 = await getUser(4);
  console.log('User 4:', user4);
  console.log('-------------------');
  // The second user again: served from the cache
  console.log('Seventh Request');
  const user2Again = await getUser(2);
  console.log('User 2 Again:', user2Again);
  console.log('-------------------');
  // A fifth distinct user: fetched from the API; the cache is now full
  console.log('Eighth Request');
  const user5 = await getUser(5);
  console.log('User 5:', user5);
  console.log('-------------------');
  // A sixth distinct user: fetched from the API; the least recently used
  // entry (user 3, untouched since the fourth request) is evicted
  console.log('Ninth Request');
  const user6 = await getUser(6);
  console.log('User 6:', user6);
  console.log('-------------------');
  // User 3 again: fetched from the API because it was evicted
  console.log('Tenth Request');
  const user3Again = await getUser(3);
  console.log('User 3 Again:', user3Again);
  console.log('-------------------');
}
main();
How LRU Cache Works
When we first request user data, it comes from the API. But when we request the same user again, the data is pulled from the cache, making the request much faster. This reduces the load on the API and improves our application's performance.
The LRU cache has a maximum size of 5. When we request a sixth distinct user, the least recently used entry is removed from the cache to make room for the new data. Note that this is user 3, not user 1: reading user 1 from the cache in the fifth request marked it as recently used, while user 3 hasn't been touched since the fourth request, making it the oldest entry. If we then request user 3 again, it has to be fetched from the API because it's no longer in the cache.
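To make the eviction mechanics concrete, here's a toy LRU cache built on a Map's insertion order. This is a sketch for illustration only, not how the lru-cache package is implemented internally:

```typescript
// A toy LRU cache: the first key in the Map's insertion order is always
// the least recently used, because get() re-inserts the key it touches.
class TinyLRU<K, V> {
  private map = new Map<K, V>();
  constructor(private max: number) {}

  get(key: K): V | undefined {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key)!;
    // Re-insert to mark this key as most recently used.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  set(key: K, value: V): void {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.max) {
      // Evict the least recently used entry (first in insertion order).
      const lru = this.map.keys().next().value as K;
      this.map.delete(lru);
    }
  }
}

const lru = new TinyLRU<number, string>(2);
lru.set(1, 'a');
lru.set(2, 'b');
lru.get(1);      // touch key 1, so key 2 is now least recently used
lru.set(3, 'c'); // cache is full: evicts key 2, not key 1
// lru.get(2) === undefined; lru.get(1) === 'a'
```

This is the same behavior we saw in the demo: reading an entry refreshes its position, so eviction falls on whichever entry has gone longest without being read or written.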
Benefits of Using LRU Cache
As you can see, when we request the same user data multiple times, it's served from the cache, making the requests much faster. This reduces the load on the API, improves application performance, and in many cases, it can save us a lot of resources and money.
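Beyond a size limit, the lru-cache package also supports time-based expiry, which is useful when cached data can go stale. A configuration sketch, reusing the LRUCache import and User type from above (option names as of lru-cache v7+; check the documentation for the version you install):

```typescript
const cacheWithTTL = new LRUCache<number, User>({
  max: 100,           // at most 100 entries
  ttl: 1000 * 60 * 5, // entries expire after 5 minutes
  allowStale: false,  // never return expired entries
});
```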
I hope that you have found this post useful. If you have any questions or comments, please feel free to leave them below.