Rate limiting plays a crucial role in preventing abuse by controlling how many requests a user can make within a given period. To track requests per IP, we want to store them in memory for fast access and quick responses. ⛔✋
Running your Next.js app on Vercel’s serverless or edge runtimes offers incredible speed and scalability. These environments handle requests in isolation, spinning up lightweight instances as needed to deliver lightning-fast responses. However, this architecture has a key limitation: data stored in an instance’s memory isn’t accessible to others, as each instance is stateless.
To address this, we often need external solutions like Redis, a high-performance in-memory database. Services like Upstash Redis complement Vercel perfectly by enabling shared, low-latency data access across instances.
With that foundation set, let’s dive into the implementation!
Method 1: Upstash Redis 🙌
In your Vercel project page, go to the Storage tab.
Click the Create Database button, then select Upstash for Redis.
After creating the database, you'll see a list of secrets.
Keep in mind that these secret variables are already bound to your Vercel project, so you don't need to redefine them; for local development, however, you will need to copy them into your .env.local file.
Don't forget to put this line in your .gitignore:
.env*.local
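Your .env.local will then look something like this (the variable names are the ones Redis.fromEnv() expects; the values here are placeholders for your own credentials):

```
UPSTASH_REDIS_REST_URL="https://your-db.upstash.io"
UPSTASH_REDIS_REST_TOKEN="your-token-here"
```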
Install the following modules:
$ npm i @upstash/redis @upstash/ratelimit
Create a Ratelimit instance and use it:
// route.ts
import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";

// 1 request per 60-second fixed window
const rateLimiter = new Ratelimit({
  limiter: Ratelimit.fixedWindow(1, "60 s"),
  redis: Redis.fromEnv(),
  analytics: true,
  prefix: "@upstash/ratelimit",
});

// Extract the client IP from common proxy headers
const getIP = (req: Request) => {
  return (
    req.headers.get("cf-connecting-ip") ??
    // x-forwarded-for may hold a comma-separated list; the first entry is the client
    req.headers.get("x-forwarded-for")?.split(",")[0]?.trim() ??
    undefined
  );
};

export async function POST(req: Request) {
  // ...
  // Here we check if the limit is hit! 👇
  const ip = getIP(req);
  const { success } = await rateLimiter.limit(ip ?? "unknown");
  if (!success) {
    return Response.json({ error: "Too many requests" }, { status: 429 });
  }
  // ...
}
And done!
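For context, the fixed-window algorithm itself is simple. Here is a minimal in-memory sketch of the same idea, illustrative only: as discussed above, per-instance state like this is exactly what does not survive across serverless instances, which is why the real counter lives in Redis.

```typescript
// Minimal in-memory fixed-window limiter (illustration of the algorithm;
// NOT suitable for serverless, where each instance has its own memory).
type Window = { start: number; count: number };

const windows = new Map<string, Window>();

function limit(key: string, max: number, windowMs: number, now = Date.now()): boolean {
  const w = windows.get(key);
  if (!w || now - w.start >= windowMs) {
    // First request in a fresh window: reset the counter
    windows.set(key, { start: now, count: 1 });
    return true;
  }
  w.count += 1;
  return w.count <= max;
}

// 1 request per 60 s window, mirroring the Ratelimit config above
console.log(limit("1.2.3.4", 1, 60_000, 0));      // true  (first request)
console.log(limit("1.2.3.4", 1, 60_000, 1_000));  // false (same window)
console.log(limit("1.2.3.4", 1, 60_000, 61_000)); // true  (new window)
```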
Method 2: Tile38 self-hosted
This method doesn't rely on third-party services and their limitations, but it requires you to have your own server.
Install Docker on your server if it isn't installed already.
Run Tile38 docker with custom configs
$ docker run --name tile38 -d -p 9851:9851 tile38/tile38
Configure ufw-docker to open the related port:
$ sudo ufw-docker allow tile38
Or you can set up a reverse proxy on nginx if you want to keep it behind a domain.
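If you go the nginx route, a minimal reverse-proxy sketch might look like the following. Note that tile38.example.com is a hypothetical domain and TLS setup is omitted:

```nginx
server {
    listen 80;
    server_name tile38.example.com;  # hypothetical domain

    location / {
        # Forward to the Tile38 container published on localhost:9851
        proxy_pass http://127.0.0.1:9851;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```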
Now that everything is set, it's time for usage.
Upstash's Ratelimit expects a Redis-compatible client, so to get around that limitation we can create our own adapter as below:
// route.ts
import { Tile38 } from "@iwpnd/tile38-ts";
import { Ratelimit } from "@upstash/ratelimit";

// Tile38 HTTP API URL
const tile38 = new Tile38("http://your_server_ip:9851");

// Custom adapter for Tile38, stubbing only the calls the limiter makes
const customTile38Adapter = {
  // Custom function to simulate Lua script execution via the HTTP API
  evalsha: async (script: any, keys: any[], args: any[]) => {
    try {
      const key = keys[0]; // Assuming a single key is used for rate limiting
      const value = args[0]; // The value used for rate limiting
      const { ok } = await tile38.jSet("ratelimit", key, "root", value);
      return { success: ok };
    } catch (error) {
      return { success: false };
    }
  },
  get: async () => null,
  set: async () => null,
};

// Initialize the rate limiter with Tile38 as the adapter
const ratelimit = new Ratelimit({
  // Cast needed: the adapter implements only part of the Redis interface
  redis: customTile38Adapter as any,
  limiter: Ratelimit.fixedWindow(1, "10 s"), // 1 request per 10 seconds
});
...
Congratulations! 🎉 You’ve unlocked the power of combining Vercel’s serverless capabilities with a shared rate-limit store, whether that’s Upstash Ratelimit or a self-hosted Tile38. This setup will not only protect your app from abuse but also open the door to building highly scalable, fast, and reliable applications. With this powerful foundation, your Next.js app is ready to handle whatever comes next—smoothly and efficiently. 🚀