Your API is a front door. Without rate limiting, you've left it propped open with a sign that says "come on in, take whatever you want."
Why Bother
Rate limiting feels like infrastructure plumbing. Not the sexy part of building an API. But consider what happens without it:
A single user (or bot, or competitor, or angry teenager) hits your login endpoint 10,000 times in a minute. Your database melts. Your legitimate users see 502 errors. Your AWS bill triples overnight. If you're running a SaaS, your Slack lights up with customer complaints while you scramble to add a sleep() call in production.
That's the mild scenario. Credential stuffing attacks hit unthrottled login endpoints thousands of times per second. Scraper bots vacuum your product data and reprice against you on Amazon. Denial-of-wallet attacks rack up compute costs on serverless endpoints. API abuse is a business risk, not just a technical one.
Laravel gives you rate limiting out of the box. Most teams use maybe 10% of what's available, and configure it wrong.
The Default: throttle Middleware
Laravel ships with a throttle middleware that does the basics. In bootstrap/app.php (Laravel 11+) or RouteServiceProvider (older versions), you probably have something like this:
->withMiddleware(function (Middleware $middleware) {
    $middleware->api(prepend: [
        \Laravel\Sanctum\Http\Middleware\EnsureFrontendRequestsAreStateful::class,
    ]);

    $middleware->throttleApi('60,1'); // 60 requests per minute
})
That gives you 60 requests per minute per user (or per IP for unauthenticated requests). It works. It's also a blunt instrument applied uniformly to endpoints with wildly different risk profiles.
Your /api/products listing endpoint and your /api/auth/login endpoint have the same rate limit. One returns cached data. The other validates credentials against bcrypt hashes. They should not have the same throttle.
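As a stop-gap before you set up named limiters, you can at least tighten the inline throttle on sensitive routes. The numbers here are illustrative, not a recommendation:

```php
// Route-level override: 5 attempts per minute on login,
// while the rest of the API keeps the group-level 60/minute.
Route::post('/api/auth/login', [AuthController::class, 'login'])
    ->middleware('throttle:5,1'); // maxAttempts,decayMinutes
```

This is still a hardcoded limit with all the drawbacks discussed later; named limiters are the durable fix.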
Named Rate Limiters
Laravel's RateLimiter facade lets you define named limiters with real logic behind them. Put these in AppServiceProvider::boot():
use Illuminate\Cache\RateLimiting\Limit;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\RateLimiter;

public function boot(): void
{
    // Authentication: tight limits, keyed by IP + email
    RateLimiter::for('login', function (Request $request) {
        $email = strtolower($request->input('email', ''));

        return [
            // Per IP: 5 attempts per minute
            Limit::perMinute(5)->by($request->ip()),
            // Per email: 10 attempts per minute (across all IPs)
            Limit::perMinute(10)->by($email),
        ];
    });

    // Password reset: even tighter
    RateLimiter::for('password-reset', function (Request $request) {
        return Limit::perMinute(3)->by($request->ip());
    });

    // General API: scale by plan
    RateLimiter::for('api', function (Request $request) {
        $user = $request->user();

        if (! $user) {
            return Limit::perMinute(10)->by($request->ip());
        }

        return match ($user->plan) {
            'enterprise' => Limit::perMinute(1000)->by($user->id),
            'business' => Limit::perMinute(300)->by($user->id),
            'pro' => Limit::perMinute(100)->by($user->id),
            default => Limit::perMinute(30)->by($user->id),
        };
    });

    // Expensive operations: strict
    RateLimiter::for('export', function (Request $request) {
        return Limit::perHour(5)->by($request->user()?->id ?: $request->ip());
    });
}
Then apply them to routes:
Route::post('/auth/login', [AuthController::class, 'login'])
    ->middleware('throttle:login');

Route::post('/auth/forgot-password', [AuthController::class, 'forgotPassword'])
    ->middleware('throttle:password-reset');

Route::middleware(['auth:sanctum', 'throttle:api'])->group(function () {
    Route::get('/products', [ProductController::class, 'index']);

    Route::post('/reports/export', [ReportController::class, 'export'])
        ->middleware('throttle:export'); // stacks with 'api'
});
The key insight: different endpoints have different threat models and should have different limits.
Multi-Dimensional Rate Limiting
The login limiter above shows the most important pattern: limiting on multiple dimensions simultaneously. A single key is easy to circumvent.
If you only limit by IP, an attacker uses a botnet or rotating proxies. If you only limit by email, an attacker tries one password against thousands of emails (credential spraying). Limiting by both IP and email makes both attacks harder.
RateLimiter::for('login', function (Request $request) {
    $email = strtolower($request->input('email', ''));

    return [
        Limit::perMinute(5)->by('login-ip:' . $request->ip()),
        Limit::perMinute(10)->by('login-email:' . $email),
        // Global: detect distributed attacks
        Limit::perMinute(100)->by('login-global'),
    ];
});
That third limiter — login-global — catches distributed credential stuffing where thousands of IPs each try one or two passwords. If your login endpoint gets 100 attempts in a minute from any combination of sources, something's wrong.
Response Headers: Tell Clients What's Happening
Laravel automatically sends rate limit headers on throttled routes:

X-RateLimit-Limit: 60
X-RateLimit-Remaining: 42

When a client exhausts the limit, they get a 429 Too Many Requests response, and Laravel adds a Retry-After header (here, 28 seconds) telling them when to try again:

Retry-After: 28

That's correct behavior. What matters is what happens next.
Don't leak information in your 429 response. The default message is fine:
{
    "message": "Too Many Attempts."
}
Don't do this:
{
    "message": "Rate limit exceeded for user cooper@example.com. Try again in 28 seconds."
}
That confirms the email exists, tells the attacker exactly how long to wait, and gives them information they can use to optimize their attack cadence.
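If you need a custom 429 body, Laravel's Limit::response() hook lets you shape it without leaking detail. A minimal sketch, keeping the message deliberately generic:

```php
RateLimiter::for('login', function (Request $request) {
    return Limit::perMinute(5)->by($request->ip())
        ->response(function (Request $request, array $headers) {
            // Generic message; Retry-After still goes out via $headers
            return response()->json([
                'message' => 'Too many attempts.',
            ], 429, $headers);
        });
});
```

The Retry-After header is fine for legitimate clients with backoff logic; it's the human-readable confirmation of a specific account that you want to avoid.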
The Login Endpoint Problem
Login endpoints deserve their own section because they're the most abused endpoint in any API. Here's a thorough approach:
use Illuminate\Http\JsonResponse;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Auth;
use Illuminate\Support\Facades\RateLimiter;
use Illuminate\Validation\ValidationException;

class LoginController extends Controller
{
    public function login(Request $request): JsonResponse
    {
        $request->validate([
            'email' => 'required|email',
            'password' => 'required|string',
        ]);

        $email = strtolower($request->email);
        $throttleKey = 'login:' . $email . '|' . $request->ip();

        // Check if already locked out
        if (RateLimiter::tooManyAttempts($throttleKey, 5)) {
            $seconds = RateLimiter::availableIn($throttleKey);

            throw ValidationException::withMessages([
                'email' => [
                    "Too many login attempts. Try again in {$seconds} seconds."
                ],
            ])->status(429);
        }

        if (! Auth::attempt(['email' => $email, 'password' => $request->password])) {
            // Increment on failure only — successful logins don't count
            RateLimiter::hit($throttleKey, 300); // 5-minute decay

            throw ValidationException::withMessages([
                'email' => ['The provided credentials are incorrect.'],
            ]);
        }

        // Clear the limiter on success
        RateLimiter::clear($throttleKey);

        $token = $request->user()->createToken('api');

        return response()->json([
            'token' => $token->plainTextToken,
        ]);
    }
}
Key points: only count failures (successful logins clear the counter), use a 5-minute decay window instead of 1-minute (makes brute force painful), and give a generic error message that doesn't distinguish between "wrong password" and "user doesn't exist."
Sliding Windows vs Fixed Windows
Laravel's default rate limiter uses a fixed window. If your limit is 60/minute, the window resets at the top of each minute. An attacker can send 60 requests at 0:59 and 60 more at 1:00 — 120 requests in two seconds.
For most endpoints, fixed windows are fine. For sensitive ones, you want a sliding window. Laravel doesn't include one natively, but you can build it with Redis:
use Illuminate\Support\Facades\Redis;

class SlidingWindowRateLimiter
{
    public function attempt(string $key, int $maxAttempts, int $windowSeconds): bool
    {
        $now = microtime(true);
        $windowStart = $now - $windowSeconds;

        return (bool) Redis::eval(<<<'LUA'
            local key = KEYS[1]
            local now = tonumber(ARGV[1])
            local window_start = tonumber(ARGV[2])
            local max_attempts = tonumber(ARGV[3])
            local window_seconds = tonumber(ARGV[4])

            -- Remove expired entries
            redis.call('ZREMRANGEBYSCORE', key, '-inf', window_start)

            -- Count current entries
            local current = redis.call('ZCARD', key)
            if current >= max_attempts then
                return 0
            end

            -- Add this attempt
            redis.call('ZADD', key, now, now .. ':' .. math.random(1000000))
            redis.call('EXPIRE', key, window_seconds)

            return 1
        LUA, 1, $key, $now, $windowStart, $maxAttempts, $windowSeconds);
    }
}
Using a sorted set with timestamps as scores gives you a true sliding window. Old entries get pruned on each check. The Redis EVAL command makes the whole operation atomic.
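Wiring that class into a route could look like this hypothetical middleware; the class name, key prefix, and limits are illustrative choices, not Laravel conventions:

```php
class SlidingWindowThrottle
{
    public function __construct(private SlidingWindowRateLimiter $limiter) {}

    public function handle(Request $request, Closure $next): mixed
    {
        // 5 attempts per rolling 60-second window, keyed by IP
        $allowed = $this->limiter->attempt(
            'sliding:login:' . $request->ip(),
            maxAttempts: 5,
            windowSeconds: 60,
        );

        if (! $allowed) {
            return response()->json(['message' => 'Too Many Attempts.'], 429);
        }

        return $next($request);
    }
}
```

Laravel's container injects the limiter automatically via the constructor, so the middleware stays testable: swap in a fake limiter in tests.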
Plan-Based Throttling
If you're building a SaaS, your free tier and enterprise tier shouldn't have the same rate limits. Here's a middleware that reads limits from the user's plan:
namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\RateLimiter;

class ThrottleByPlan
{
    private const PLAN_LIMITS = [
        'free' => 30,
        'starter' => 100,
        'pro' => 500,
        'business' => 2000,
        'enterprise' => 10000,
    ];

    public function handle(Request $request, Closure $next): mixed
    {
        $user = $request->user();
        $plan = $user?->plan ?? 'free';
        $limit = self::PLAN_LIMITS[$plan] ?? self::PLAN_LIMITS['free'];
        $key = 'api-plan:' . ($user?->id ?? $request->ip());

        if (RateLimiter::tooManyAttempts($key, $limit)) {
            $retryAfter = RateLimiter::availableIn($key);

            return response()->json([
                'message' => 'Rate limit exceeded.',
                'retry_after' => $retryAfter,
            ], 429)->withHeaders([
                'Retry-After' => $retryAfter,
                'X-RateLimit-Limit' => $limit,
                'X-RateLimit-Remaining' => 0,
            ]);
        }

        RateLimiter::hit($key, 60);

        $response = $next($request);

        return $response->withHeaders([
            'X-RateLimit-Limit' => $limit,
            'X-RateLimit-Remaining' => max(0, $limit - RateLimiter::attempts($key)),
        ]);
    }
}
Register it as a route middleware alias (in bootstrap/app.php on Laravel 11+, or app/Http/Kernel.php on older versions) and apply it to your API groups. Customers can see their limits in the response headers, and your billing page can reference the same constants.
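On Laravel 11+, that registration is a few lines in bootstrap/app.php; the alias name 'throttle.plan' is just a suggestion:

```php
->withMiddleware(function (Middleware $middleware) {
    $middleware->alias([
        'throttle.plan' => \App\Http\Middleware\ThrottleByPlan::class,
    ]);
})
```

Then apply it alongside authentication:

```php
Route::middleware(['auth:sanctum', 'throttle.plan'])->group(function () {
    // plan-throttled API routes
});
```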
Throttling Expensive Operations
Some endpoints are expensive — PDF generation, report exports, AI calls, file uploads. These need separate, tighter limits even for paid users:
RateLimiter::for('ai-analysis', function (Request $request) {
    $user = $request->user();

    // Daily limit based on plan (not per-minute)
    $dailyLimit = match ($user?->plan) {
        'enterprise' => 500,
        'business' => 100,
        'pro' => 25,
        default => 5,
    };

    return Limit::perDay($dailyLimit)->by($user?->id ?: $request->ip());
});
Stack this on top of your general API throttle:
Route::post('/scan/analyze', [ScanController::class, 'analyze'])
    ->middleware(['throttle:api', 'throttle:ai-analysis']);
Both limiters apply independently. A user who's within their general API limit can still be blocked from AI analysis if they've burned through their daily allocation.
What HAVOC Catches
HAVOC's static analysis flags several rate-limiting gaps that are common in Laravel codebases:
Unthrottled authentication endpoints. Any route that accepts credentials — login, registration, password reset, API token creation — without rate limiting middleware gets flagged as high severity. This is the number one thing to fix.
Hardcoded rate limits. When limits are buried in middleware strings like throttle:60,1 instead of named limiters, HAVOC flags it as a maintainability issue. Hardcoded limits are easy to forget and hard to coordinate across endpoints.
Missing rate limits on write operations. POST, PUT, PATCH, and DELETE endpoints without throttling are flagged. Read endpoints are lower risk (though still worth throttling), but write endpoints can cause data corruption or resource exhaustion.
IP-only throttle keys on authenticated routes. If you're throttling authenticated users by IP instead of user ID, multiple users behind the same corporate NAT or VPN share a rate limit pool. HAVOC flags this because it creates a denial-of-service vector where one aggressive user blocks their entire office.
Beyond Rate Limiting: Defense in Depth
Rate limiting is one layer. A real API security posture includes:
Request validation. Reject malformed input before it reaches your business logic. Laravel's form requests and validation rules are your first line of defense, not your throttle middleware.
API authentication. Sanctum tokens, Passport OAuth, or API keys — pick one and enforce it. Unauthenticated API access should be the exception, not the rule.
Logging and alerting. When someone hits a rate limit, log it. When the same IP hits limits on five different endpoints in a minute, alert on it. Rate limiting stops the immediate damage; logging helps you understand the attack pattern.
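One way to get that logging is to report Laravel's ThrottleRequestsException centrally. A sketch for Laravel 11's bootstrap/app.php; note that HTTP exceptions are ignored by the reporter by default, so you have to opt back in:

```php
use Illuminate\Http\Exceptions\ThrottleRequestsException;
use Illuminate\Support\Facades\Log;

->withExceptions(function (Exceptions $exceptions) {
    // Throttle exceptions are in the default ignore list; re-enable reporting
    $exceptions->stopIgnoring(ThrottleRequestsException::class);

    $exceptions->report(function (ThrottleRequestsException $e) {
        Log::warning('Rate limit hit', [
            'ip' => request()->ip(),
            'path' => request()->path(),
            'user_id' => request()->user()?->id,
        ]);
    });
})
```

Feed those log entries to whatever alerting you already have; the pattern you're looking for is one source tripping limits across multiple endpoints.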
IP blocking. For persistent abusers, escalate from throttling to blocking. fail2ban at the infrastructure level, or a middleware that checks a blocklist:
public function handle(Request $request, Closure $next): mixed
{
    if (Cache::has('blocked-ip:' . $request->ip())) {
        abort(403, 'Forbidden');
    }

    return $next($request);
}
WAF and CDN. Cloudflare, AWS WAF, or similar services handle volumetric attacks that your application-level rate limiting can't absorb. If someone throws 100,000 requests per second at you, your Laravel app won't even get the chance to return a 429.
Common Mistakes
Setting limits too high. A 1000/minute limit on login is barely a limit at all. An attacker can try 1000 passwords per minute — that's 1.4 million per day. For authentication endpoints, single digits per minute is appropriate.
Not varying limits by endpoint. A flat throttle:60,1 on everything means your search endpoint (cached, fast) and your export endpoint (database-heavy, slow) have the same limit. The export endpoint will kill your database long before you hit 60 requests.
Forgetting the decay window. Laravel's default decay is 1 minute. For login endpoints, that's too short. A 5-minute or 15-minute decay window makes brute force genuinely painful instead of mildly inconvenient.
Rate limiting in application code only. If your rate limiter runs in PHP, every request still hits your web server, boots Laravel, and runs middleware before getting rejected. For serious abuse, use nginx or your load balancer:
# nginx.conf
limit_req_zone $binary_remote_addr zone=login:10m rate=5r/s;
server {
location /api/auth/login {
limit_req zone=login burst=10 nodelay;
proxy_pass http://laravel;
}
}
This rejects excess requests before they even reach PHP.
Not testing your limits. Write a test that verifies your rate limiting works:
public function test_login_rate_limiting(): void
{
    $email = 'test@example.com';

    // Exhaust the limit
    for ($i = 0; $i < 5; $i++) {
        $this->postJson('/api/auth/login', [
            'email' => $email,
            'password' => 'wrong-password',
        ])->assertUnprocessable();
    }

    // Next attempt should be throttled
    $this->postJson('/api/auth/login', [
        'email' => $email,
        'password' => 'wrong-password',
    ])->assertStatus(429);
}
The Bottom Line
Rate limiting isn't glamorous. It doesn't show up in feature demos or investor pitches. But it's the difference between an API that survives contact with the real internet and one that goes down the first time someone points a script at it.
Start with your authentication endpoints — they're the most targeted and the most dangerous to leave open. Then work outward: expensive operations, write endpoints, and finally read endpoints. Use named limiters so you can reason about your limits in one place. Test them. Monitor them.
Your database will thank you. Your ops team will thank you. Your customers will never notice, which is exactly the point.
Find your unthrottled endpoints automatically
HAVOC scans your Laravel codebase for unprotected authentication routes, missing rate limiting on write operations, and IP-keyed throttles on authenticated routes — plus dozens of other security gaps. The CLI is free and open-source.
npm install -g @havoc/cli
cd your-laravel-project
havoc scan
Start Free — No Credit Card →
or
View on GitHub