RATE-001 critical general

Rate limiting configured

Rate limiting exists at infrastructure or application level with proper client isolation

Question to ask

"Could one angry script hit your API until it falls over?"

Verification guide

Severity: Critical

Rate limiting must exist at either infrastructure level (Cloudflare, nginx, API Gateway) or application level. Limits should isolate clients so one user can't exhaust limits for others.
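The client-isolation requirement can be illustrated with a minimal in-memory sketch: a token bucket kept per client key, so one abusive client drains only its own bucket. This is illustrative only (the class name and in-memory store are assumptions, not a reference to any library above); production deployments typically back the buckets with a shared store such as Redis.

```python
import time
from collections import defaultdict


class PerClientRateLimiter:
    """Token bucket per client key (IP, user ID, or API key).

    Each client gets its own bucket, so exhausting one bucket does not
    affect other clients. In-memory only -- a sketch, not a production
    implementation (no shared store, no eviction of stale keys).
    """

    def __init__(self, rate_per_sec: float, burst: int):
        self.rate = rate_per_sec  # tokens refilled per second
        self.burst = burst        # bucket capacity (max burst size)
        self.buckets = defaultdict(
            lambda: {"tokens": float(burst), "ts": time.monotonic()}
        )

    def allow(self, client_key: str) -> bool:
        b = self.buckets[client_key]
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at the burst size.
        b["tokens"] = min(self.burst, b["tokens"] + (now - b["ts"]) * self.rate)
        b["ts"] = now
        if b["tokens"] >= 1:
            b["tokens"] -= 1
            return True
        return False
```

With `burst=2` and no refill, a client's third request is denied while a different client key is still allowed — the isolation property the pass criteria below look for.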

Check automatically:

  1. Check for infrastructure-level rate limiting:
# Cloudflare - check for rate limiting rules (requires API access)
# Look for Cloudflare config in repo
grep -rE "rate_limit|ratelimit" cloudflare/ wrangler.toml 2>/dev/null

# nginx rate limiting
grep -rE "limit_req|limit_conn" nginx/ conf/ *.conf 2>/dev/null

# AWS API Gateway
grep -rE "throttle|rateLimit|quotaSettings" serverless.yml sam.yaml cloudformation/ terraform/ 2>/dev/null

# Traefik rate limiting
grep -rE "rateLimit|averageRate" traefik/ 2>/dev/null
  2. Check for application-level rate limiting libraries:
# Node.js
grep -E "express-rate-limit|rate-limiter-flexible|bottleneck|p-limit|@fastify/rate-limit" package.json 2>/dev/null

# Python
grep -E "django-ratelimit|Flask-Limiter|slowapi|limits" requirements*.txt pyproject.toml 2>/dev/null

# Go
grep -E "golang.org/x/time/rate|go-redis/redis_rate|ulule/limiter" go.mod 2>/dev/null

# Ruby
grep -E "rack-attack|redis-throttle" Gemfile 2>/dev/null
  3. Check rate limit configuration in code:
# Express rate limit setup
grep -rE "rateLimit\(|RateLimit\(|createRateLimiter" src/ lib/ app/ --include="*.ts" --include="*.js" 2>/dev/null

# Rate limit middleware registration
grep -rE "app\.use.*rate|rateLimiter|throttle" src/ lib/ app/ --include="*.ts" --include="*.js" 2>/dev/null

# Redis-based rate limiting (common pattern)
grep -rE "redis.*rate|rate.*redis|limiter.*redis" src/ lib/ app/ 2>/dev/null
  4. Check rate limit keying strategy (client isolation):
# Key generators - should key on IP, user ID, or API key
grep -rE "keyGenerator|key.*req\.ip|key.*user|key.*apiKey|getKey" src/ lib/ app/ 2>/dev/null

# Check for user-based limits
grep -rE "req\.user|request\.user|userId|user_id" src/ lib/ app/ 2>/dev/null | grep -i rate
  5. Check for auth-differentiated limits (recommended):
# Different limits for authenticated vs anonymous
grep -rE "authenticated.*limit|anon.*limit|public.*limit|skip.*auth" src/ lib/ app/ 2>/dev/null | grep -i rate

# Conditional rate limiting
grep -rE "skip:|skip.*=>|skipIf|skipFailedRequests" src/ lib/ app/ 2>/dev/null
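Steps 4 and 5 can be sketched as two small functions: a key generator that keys on user ID when authenticated and falls back to source IP, and a limit selector that gives authenticated clients a higher budget. The `user_id` and `remote_ip` attributes and the specific limit numbers are hypothetical stand-ins; adapt them to your framework's request object and your real limits.

```python
def client_key(request) -> str:
    """Key on user ID when authenticated, else on source IP.

    `request` is a stand-in with hypothetical `user_id` / `remote_ip`
    attributes -- substitute your framework's request type.
    """
    if getattr(request, "user_id", None):
        return f"user:{request.user_id}"
    return f"ip:{request.remote_ip}"


def limit_for(request) -> int:
    """Requests-per-minute budget; illustrative numbers only."""
    return 600 if getattr(request, "user_id", None) else 60
```

Keying authenticated traffic on user ID (rather than IP) keeps users behind a shared NAT from exhausting each other's limits, which is exactly the isolation failure mode the fail criteria below describe.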

Ask user:

  • "Where is rate limiting configured?" (Cloudflare, nginx, app code, etc.)
  • "What do you rate limit on?" (IP, user ID, API key)
  • "Do authenticated users have different limits than anonymous?"

Cross-reference with:

  • RATE-002 (graceful handling when limits hit)
  • RATE-003 (documentation of limits)
  • AUTH-001 (authentication system)
  • INFRA-001 (Cloudflare configuration)

Pass criteria:

  • Rate limiting exists at infrastructure OR application level
  • Clients are isolated (keyed per IP, user, or API key; not a single global counter)
  • Authenticated users have different limits than anonymous (recommended, not required)

Fail criteria:

  • No rate limiting found at any level
  • Single global counter that one client can exhaust for everyone
  • Public endpoints completely unprotected

Evidence to capture:

  • Rate limiting mechanism (Cloudflare, nginx, app library, etc.)
  • Keying strategy (IP, user ID, API key)
  • Sample limits configured (requests per minute/hour)
  • Whether auth differentiation exists

Section

30. Rate Limiting

API & Security