Build a Custom Rate Limiter in Next.js and Keep Your APIs Rock‑Solid
Before we dive in, why not take a quick peek at my portfolio: https://priyalraj.com, and while you’re at it, check out my SaaS venture: https://shavelinks.com, where innovation meets simplicity.

📋 TL;DR — What You’ll Learn Here

In this guide, you’ll learn what a rate limiter is and why your APIs need one, how to build a reusable Fixed Window Counter rate limiter in Next.js (lib/rateLimiter.ts), how to apply it to your API routes with default and custom time windows, why in-memory rate limiting breaks on serverless platforms like Vercel (and what to use instead), and how to test, log, and extend your limiter.

🔒 What Is a Rate Limiter (and Why You Can’t Skip It)

Imagine a nightclub on a Friday night — people are lining up, music’s pumping, and everyone wants in. But there’s a bouncer at the door. Their job? Let in a manageable number of guests, keep troublemakers out, and make sure the club doesn’t get overcrowded.

Now, think of your API as that nightclub. Without a rate limiter — your digital bouncer — anyone can bombard your server with endless requests. This can crash your backend, drain your resources, or open the door to abuse and attacks.

A rate limiter enforces a rule like “Only 100 requests allowed per minute.” It watches each incoming request and decides whether to let it through or block it. It’s a simple idea, but it has a huge impact.
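To make that concrete, here’s the rule boiled down to a few lines of TypeScript (ignoring the window reset for now; we’ll build the full version shortly):

// Allow at most 100 requests per client per 1-minute window (sketch only)
const MAX_REQUESTS = 100;
const requestCounts = new Map<string, number>(); // requests seen in the current window, per client

function allowRequest(clientIp: string): boolean {
  const count = (requestCounts.get(clientIp) ?? 0) + 1;
  requestCounts.set(clientIp, count);
  return count <= MAX_REQUESTS; // false means: respond with 429 Too Many Requests
}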

Here’s why you need a rate limiter in your APIs:

🛡️ Blocks brute‑force login attacks

Stop attackers from hammering your login endpoint 1,000 times per second. Services like DataDome rely on this to protect against bots and credential stuffing.

📈 Smooths out traffic spikes

When traffic surges, rate limiting absorbs the shock. Platforms like Medium use it to keep their APIs stable, even when an article goes viral.

🚀 Improves UX by keeping servers fast

Slow APIs = angry users. Cloudflare uses rate limiting to keep its infrastructure speedy and error-free during traffic storms.

💸 Reduces cost on serverless platforms

Every request costs something — rate limiting cuts waste, especially on platforms like Vercel or AWS Lambda, where every millisecond matters.

Visual Flow Overview

[User Request] --> [Rate Limiter Check]
                           |
                           |-- Allowed --> [Process Request] --> [Response]
                           |
                           |-- Blocked --> [429 Too Many Requests Response]

🛠️ How to Build Your Custom Rate Limiter in Next.js

We’re going to build a Fixed Window Counter style rate limiter in Next.js, designed specifically to run inside API routes.

Instead of wiring the logic directly into each route, we’ll create a reusable utility file: 📁 lib/rateLimiter.ts

This approach keeps your code modular, clean, and scalable, making it easy to apply rate limiting across multiple endpoints without repetition.

✅ Step 1: Create rateLimiter.ts Inside lib/

// lib/rateLimiter.ts
import { NextRequest, NextResponse } from 'next/server';

interface RateLimitConfig {
    maxRequests: number;
    windowMs: number;
    message?: string;
}

interface RateLimitRecord {
    count: number;
    resetTime: number;
}

// Global in-memory store for rate limit records, keyed by client IP
const rateLimitStore = new Map<string, RateLimitRecord>();

const createRateLimiter = (config: RateLimitConfig) => {
    const finalConfig = {
        message: 'Too many requests from this IP, please try again later.',
        ...config
    };

    const getClientIp = (request: NextRequest): string => {
        const forwardedFor = request.headers.get('x-forwarded-for');
        if (forwardedFor) {
            return forwardedFor.split(',')[0].trim();
        }
        // request.ip is only populated on some platforms (e.g. Vercel)
        return request.ip || 'unknown';
    };

    const isRateLimited = (ip: string): boolean => {
        const now = Date.now();
        const record = rateLimitStore.get(ip);

        // Clean up expired records so the store doesn't grow unbounded
        rateLimitStore.forEach((value, key) => {
            if (now > value.resetTime) {
                rateLimitStore.delete(key);
            }
        });

        if (!record || now > record.resetTime) {
            // Start a fresh window for this IP
            rateLimitStore.set(ip, {
                count: 1,
                resetTime: now + finalConfig.windowMs
            });
            return false;
        }

        // Increment the count for the current window
        record.count++;
        rateLimitStore.set(ip, record);
        return record.count > finalConfig.maxRequests;
    };

    const check = (request: NextRequest): { isLimited: boolean; response?: NextResponse } => {
        const ip = getClientIp(request);
        if (isRateLimited(ip)) {
            return {
                isLimited: true,
                response: NextResponse.json(
                    { message: finalConfig.message, status: 429 },
                    { status: 429 }
                )
            };
        }
        return { isLimited: false };
    };

    const clear = (): void => {
        rateLimitStore.clear();
    };

    return {
        check,
        clear
    };
};

// Create rate limiters with a specific request count per window (default window: 1 minute)
export const createRequestsMinuteLimiter = (requestsPerMinute: number, windowMs: number = 60 * 1000) =>
    createRateLimiter({
        maxRequests: requestsPerMinute,
        windowMs,
        message: `Too many requests. Please try again in ${windowMs / 1000} seconds.`
    });

// Export createRateLimiter for fully custom configurations
export { createRateLimiter };

✅ Step 2: Use It Inside Your API Route

A. Basic Setup (Default Configuration):

Let’s say you want to limit requests to 10 per minute on a simple API route. Here you only pass the request count and keep the default time window of 1 minute (60,000 ms). Example: limit to 10 requests per minute.

// app/api/my-api/route.ts
import { NextRequest, NextResponse } from 'next/server';
import { createRequestsMinuteLimiter } from '@/lib/rateLimiter';

// Limit: 10 requests per minute (default window: 60,000ms)
const limiter = createRequestsMinuteLimiter(10);
export async function GET(request: NextRequest) {
  const { isLimited, response } = limiter.check(request);
  if (isLimited) return response!;
  return NextResponse.json({ message: 'Success ✅' });
}

Explanation for the above code:

This setup is ideal for simple, low-traffic API routes where you want a basic rate limit without any extra configuration.
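On the caller’s side, a rate-limited request simply comes back with status 429. Here’s a minimal sketch (assuming the route above lives at /api/my-api) of how client code might handle it:

// Hypothetical client-side call to the rate-limited route (run inside an async function)
const res = await fetch('/api/my-api');

if (res.status === 429) {
  const { message } = await res.json();
  console.warn(`Rate limited: ${message}`); // e.g. show a toast and retry later
} else {
  const data = await res.json();
  console.log(data.message); // "Success ✅"
}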

B. Customising the Time Window (Custom Window):

If you want more flexibility, you can customise the time window for your rate limiter. Instead of using the default 1-minute window, you can set a custom time window that suits your needs. Example: Limit to 10 Requests Per 30 Seconds.

// app/api/my-api/route.ts
import { NextRequest, NextResponse } from 'next/server';
import { createRequestsMinuteLimiter } from '@/lib/rateLimiter';

// Limit: 10 requests per 30 seconds
const limiter = createRequestsMinuteLimiter(10, 30000);
export async function GET(request: NextRequest) {
  const { isLimited, response } = limiter.check(request);
  if (isLimited) return response!;
  return NextResponse.json({ message: 'Success ✅' });
}

Explanation for the above code:

This custom window configuration is helpful when you need a shorter or longer window to better match the behaviour of your API or the expected traffic patterns.
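If the preset helper doesn’t fit, you can also import createRateLimiter directly and pass a fully custom configuration. Here’s a sketch using a hypothetical login route, app/api/login/route.ts:

// app/api/login/route.ts (hypothetical example route)
import { NextRequest, NextResponse } from 'next/server';
import { createRateLimiter } from '@/lib/rateLimiter';

// Limit: 5 login attempts per 10 minutes, with a custom message
const loginLimiter = createRateLimiter({
  maxRequests: 5,
  windowMs: 10 * 60 * 1000,
  message: 'Too many login attempts. Please wait 10 minutes and try again.'
});

export async function POST(request: NextRequest) {
  const { isLimited, response } = loginLimiter.check(request);
  if (isLimited) return response!;

  // ...validate credentials here...
  return NextResponse.json({ message: 'Logged in ✅' });
}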

⚠️ Important Note: In-Memory Rate Limiting Fails on Serverless (Like Vercel)

Why It Fails:

Serverless platforms like Vercel spin up stateless functions to handle each request — they don’t share memory between requests. Since our rate limiter uses an in-memory Map, every new request may hit a fresh instance with a fresh Map. So, the count resets every time, and the limiter gets bypassed.

Pros of In-Memory Rate Limiting:

It’s fast (no network round-trips), needs zero external dependencies, and is trivial to set up on a single, long-running server.

Cons in Serverless:

Each function instance keeps its own Map, so counts aren’t shared between instances, they vanish whenever an instance is recycled, and the limit can be bypassed entirely.

What’s the Solution?

If you’re deploying to serverless, replace the Map with a shared store such as Redis (for example Upstash Redis, which pairs well with Vercel), Memcached, or your existing database.

These stores maintain request state across invocations, making your rate limiter production-safe.
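For illustration, here’s a minimal sketch of what the counting logic could look like backed by Upstash Redis. It assumes the @upstash/redis package and the UPSTASH_REDIS_REST_URL / UPSTASH_REDIS_REST_TOKEN environment variables; adapt it to whichever shared store you pick:

// lib/redisRateLimiter.ts (hypothetical, serverless-safe variant)
import { Redis } from '@upstash/redis';

// Reads UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN from the environment
const redis = Redis.fromEnv();

export const isRateLimited = async (
  ip: string,
  maxRequests: number,
  windowMs: number
): Promise<boolean> => {
  const key = `rate-limit:${ip}`;

  // INCR is atomic, so concurrent invocations can't race each other
  const count = await redis.incr(key);

  // First request in this window: start the expiry timer
  if (count === 1) {
    await redis.expire(key, Math.ceil(windowMs / 1000));
  }

  return count > maxRequests;
};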

Real‑World Use Cases for Your Custom Rate Limiter

When you’re building your custom rate limiter in Next.js, there are plenty of real-world situations where it can be a lifesaver: login and password-reset endpoints (to block brute force), public or unauthenticated API routes, contact and feedback forms (to curb spam), and webhook or integration endpoints you don’t want hammered.

Pros and Cons of Your Custom Rate Limiter

Pros:

You keep full control over the logic, there are no third-party dependencies or extra costs, and you can tune limits per route or per use case.

Cons:

The in-memory store doesn’t survive serverless or multi-instance deployments, there’s no built-in distributed coordination, and you have to maintain and test the logic yourself.

🧪 Testing and Logging Your Custom Rate Limiter

After implementing your custom rate limiter, it’s critical to test and log its behaviour to ensure it performs as expected:

✅ Testing Tips:

Fire more requests than your limit allows (for example, 15 requests against a 10-per-minute limit) and confirm the extras return 429, then wait for the window to expire and confirm requests succeed again. Also test from different IPs (or spoof the x-forwarded-for header locally) to make sure counts are tracked per client.

📝 Logging Suggestions:

Whenever a request is blocked, log the client IP, the current count, and the timestamp. Over time, these logs help you spot abusive clients and tune your limits.
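As a quick way to try this out, here’s a small sketch of a Node script (Node 18+, using the built-in fetch) that sends 15 requests to the example route and prints each status code; the URL is just an assumption for local development:

// scripts/test-rate-limit.ts (hypothetical helper, run with tsx or ts-node while `next dev` is running)
const endpoint = 'http://localhost:3000/api/my-api';

const run = async () => {
  for (let i = 1; i <= 15; i++) {
    const res = await fetch(endpoint);
    console.log(`Request ${i}: ${res.status}`); // expect 200 for the first 10, then 429
  }
};

run();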

Optional Enhancements for Your Custom Rate Limiter

To further enhance your custom rate limiter, consider adding rate-limit headers (such as Retry-After and X-RateLimit-Remaining) to your responses, per-user or per-API-key limits instead of per-IP limits, a sliding window or token bucket algorithm for smoother limiting, and a Redis-backed store as described above.
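For instance, here’s a small sketch of a helper that builds the 429 response with a Retry-After header; you could call it from check() in place of the inline NextResponse.json. The back-off value here is simply the full window length, not the exact time remaining:

import { NextResponse } from 'next/server';

// Sketch: build the 429 response with a Retry-After header
// (message and windowMs come from the rate limiter's configuration)
const buildLimitResponse = (message: string, windowMs: number) =>
  NextResponse.json(
    { message, status: 429 },
    {
      status: 429,
      headers: {
        // Tell well-behaved clients how many seconds to back off
        'Retry-After': String(Math.ceil(windowMs / 1000))
      }
    }
  );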

Conclusion

Building your custom rate limiter in Next.js is an excellent way to gain full control over how traffic interacts with your APIs. With a modular and configurable setup, you can scale your solution to handle different traffic patterns and user behaviours.

As you continue to refine your rate limiter, always remember that the goal is to balance protecting your resources and providing a smooth experience for legitimate users. By applying the techniques and configurations discussed here, you’ll ensure your APIs stay rock-solid, even under heavy traffic.

If you enjoyed this article, please make sure to Like, Comment and follow me on Twitter.
Made with ❤️ by Priyal Raj