Rate Limiting
Learn how to set up Redis for efficient rate limiting with automatic fallback
Dirstarter uses rate-limiter-flexible with Redis for efficient rate limiting. This provides a flexible solution for protecting your APIs and routes from abuse, with automatic in-memory fallback when Redis is unavailable.
This integration is optional. If Redis is not configured, rate limiting will automatically fall back to in-memory storage.
Redis Setup
Redis is used for distributed rate limiting across multiple server instances. You can use any standard Redis provider:
- Railway - Simple Redis hosting with generous free tier
- Upstash - Serverless Redis with pay-per-use pricing
- AWS ElastiCache - Managed Redis for AWS infrastructure
- DigitalOcean Redis - Managed Redis clusters
- Self-hosted Redis - Run your own Redis instance
Configuration
Add the following environment variable to connect to your Redis instance:
```bash
# Format: redis://[username:password@]host:port
REDIS_URL="redis://localhost:6379"
```

If REDIS_URL is not set, rate limiting will automatically use in-memory storage. This works fine for single-server deployments but won't share state across multiple instances.
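The env module imported as ~/env later in this guide should expose REDIS_URL as an optional value so the app can still boot without Redis. A minimal sketch, assuming a zod-based schema (your actual ~/env may use a different validator or include many more variables):

```ts
// ~/env — sketch only; the real module likely validates additional variables.
import { z } from "zod"

const serverSchema = z.object({
  // Optional so the app still starts (and falls back to in-memory limits)
  // when no Redis instance is configured.
  REDIS_URL: z.string().url().optional(),
})

export const env = serverSchema.parse(process.env)
```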
Implementation
Redis Service
The Redis client is created with lazy connection and automatic error handling, returning null when Redis isn't configured:
```ts
import Redis from "ioredis"
import { env } from "~/env"

const createRedisClient = () => {
  if (!env.REDIS_URL) {
    return null
  }

  try {
    return new Redis(env.REDIS_URL, {
      maxRetriesPerRequest: 3,
      lazyConnect: true,
    })
  } catch (error) {
    console.error("Failed to create Redis client:", error)
    return null
  }
}

export const redis = createRedisClient()
```
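Because the exported client is null when REDIS_URL is unset, and lazyConnect: true defers the connection until the first command, callers should guard against the null case. The helper below is a hypothetical illustration, not part of Dirstarter:

```ts
import { redis } from "~/services/redis"

// Hypothetical health check: false when Redis is not configured or unreachable.
export const isRedisHealthy = async () => {
  if (!redis) return false

  try {
    // With lazyConnect: true, this first command opens the connection.
    return (await redis.ping()) === "PONG"
  } catch {
    return false
  }
}
```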
Rate Limit Configuration
Rate limits are centralized in a configuration file for easy customization:
```ts
export const rateLimitConfig = {
  actions: {
    // 3 submissions per 24 hours
    submission: { points: 3, duration: 24 * 60 * 60 },

    // 3 newsletter signups per 24 hours
    newsletter: { points: 3, duration: 24 * 60 * 60 },

    // 3 reports per hour
    report: { points: 3, duration: 60 * 60 },

    // 5 claim attempts per hour
    claim: { points: 5, duration: 60 * 60 },
  },
}
```
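Each entry maps directly to rate-limiter-flexible options: points is the number of allowed hits and duration is the window length in seconds. Adding a new limit is just another entry; the login action below is a hypothetical example:

```ts
export const rateLimitConfig = {
  actions: {
    // ...existing actions

    // Hypothetical: 5 login attempts per 15 minutes
    login: { points: 5, duration: 15 * 60 },
  },
}
```

Any new action also needs a matching limiter in the limiters map shown in the next section.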
Rate Limiter
The rate limiter automatically uses Redis when available, with an in-memory fallback:
```ts
import { RateLimiterMemory, RateLimiterRedis } from "rate-limiter-flexible"
import { rateLimitConfig } from "~/config/rate-limit"
import { redis } from "~/services/redis"

type RateLimitAction = keyof typeof rateLimitConfig.actions

const createLimiter = (action: RateLimitAction) => {
  const config = { keyPrefix: `rl:${action}`, ...rateLimitConfig.actions[action] }
  const limiter = new RateLimiterMemory(config)

  if (redis) {
    return new RateLimiterRedis({
      storeClient: redis,
      insuranceLimiter: limiter, // Fallback if Redis fails
      ...config,
    })
  }

  return limiter
}

const limiters = {
  submission: createLimiter("submission"),
  report: createLimiter("report"),
  newsletter: createLimiter("newsletter"),
  claim: createLimiter("claim"),
}
```
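The isRateLimited helper used in the next section lives in this same module, but its implementation isn't reproduced here. A minimal sketch of how it could look, assuming a Next.js server context where the client IP is read from the x-forwarded-for header and a custom key prefix is folded into the consumed key:

```ts
// Sketch only — the real helper may differ.
// (The next/headers import belongs at the top of the file.)
import { headers } from "next/headers"

export const isRateLimited = async (
  action: RateLimitAction,
  keyPrefix?: string,
  identifier?: string,
) => {
  try {
    // Prefer an explicit identifier (e.g. a user ID), otherwise fall back to the IP.
    const headerList = await headers()
    const ip = headerList.get("x-forwarded-for")?.split(",")[0]?.trim() ?? "unknown"
    const key = [keyPrefix, identifier ?? ip].filter(Boolean).join(":")

    // consume() rejects with a RateLimiterRes once the limit is exceeded.
    await limiters[action].consume(key)
    return false
  } catch (error) {
    if (error instanceof Error) {
      // A genuine error (e.g. Redis outage without insurance): fail open.
      console.error("Rate limit check failed:", error)
      return false
    }

    // The rejection was a RateLimiterRes: the caller exceeded the limit.
    return true
  }
}
```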
Usage
Use the isRateLimited helper in your server actions:
```ts
import { isRateLimited } from "~/lib/rate-limiter"

export async function submitAction() {
  if (await isRateLimited("submission")) {
    throw new Error("Too many submissions. Please try again later.")
  }

  // Your submission logic here
}
```

The rate limiter automatically:
- Identifies users by their IP address
- Returns true if the user has exceeded the limit
- Fails open (returns false) if there's an error, to avoid blocking legitimate users
Custom Identifiers
You can optionally provide a custom key prefix or identifier:
```ts
// Use a custom key prefix
await isRateLimited("submission", "premium-submit")

// Use a custom identifier (e.g., user ID instead of IP)
await isRateLimited("submission", undefined, userId)
```