A centralized, distributed rate limiting service that protects shared APIs from abuse and traffic spikes, enforcing consistent rules (per user / IP / API key) across multiple application instances.
Built as an independent service, it allows backend teams to apply consistent traffic control rules without embedding rate-limiting logic into each application.
Use this service when:
- Multiple services share the same downstream API and need consistent throttling rules
- You want centralized rule management instead of app-by-app configuration drift
- You need low-latency enforcement that behaves safely under partial failures (explicit fail-closed behavior)
- Java 17 (Spring Boot 4.x)
- Redis (Spring Data Redis / `StringRedisTemplate`)
- Token Bucket rate limiting using atomic Redis Lua scripts
- Observability: Micrometer + Prometheus + Grafana
- Containerization: Docker + Docker Compose
- Testing: Spring Boot Test + integration tests for fail-closed behavior
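The Redis connection and service port can be supplied through standard Spring Boot configuration. A minimal sketch (property names follow Spring Boot 3+ `spring.data.redis.*` conventions; the repo's actual `application.yml` may differ):

```yaml
# Illustrative connection settings only -- not the repo's exact file
spring:
  data:
    redis:
      host: localhost
      port: 6379
server:
  port: 8081   # the port exposed in the Docker examples below
```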
- Distributed Token Bucket rate limiting with atomic Redis + Lua execution
- Central rule management (create, update, fetch rules)
- Two enforcement styles:
  - `/v1/check` — decision API (allow / deny + remaining tokens + `retryAfter`)
  - `/v1/enforce` — gateway-friendly API (204 / 429 + `RateLimit` headers)
- Explicit fail-closed behavior when Redis is unavailable
- Metrics for allowed vs blocked requests and fail-closed reasons (Micrometer → Prometheus)
Rules are stored centrally in Redis and referenced by ruleId.
Each request is evaluated against a Token Bucket stored in Redis and keyed by the rule (`ruleId`) plus the client identity (user / IP / API key).
Enforcement executes a Redis Lua script to:

- Refill tokens
- Consume the request cost

in a single atomic operation, returning:

- `allowed`
- `retryAfterMs`
- `remaining`
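The refill-then-consume math can be sketched in plain Java. This is a single-node, in-memory illustration with made-up class names; the service runs the equivalent logic inside Redis as a Lua script so the read-modify-write stays atomic across all instances:

```java
// In-memory sketch (illustrative names, not the service's actual classes) of the
// refill-then-consume step the Lua script performs atomically inside Redis.
final class TokenBucket {
    private final long capacity;      // maximum tokens the bucket can hold
    private final double refillPerMs; // tokens regained per millisecond
    private double tokens;            // current token count
    private long lastRefillMs;        // timestamp of the last refill

    TokenBucket(long capacity, double refillPerSecond, long nowMs) {
        this.capacity = capacity;
        this.refillPerMs = refillPerSecond / 1000.0;
        this.tokens = capacity;       // start full
        this.lastRefillMs = nowMs;
    }

    record Decision(boolean allowed, long retryAfterMs, long remaining) {}

    /** Refill based on elapsed time, then try to consume {@code cost} tokens. */
    synchronized Decision tryConsume(long cost, long nowMs) {
        // Refill proportionally to elapsed time, capped at capacity
        tokens = Math.min(capacity, tokens + (nowMs - lastRefillMs) * refillPerMs);
        lastRefillMs = nowMs;
        if (tokens >= cost) {
            tokens -= cost;
            return new Decision(true, 0, (long) tokens);
        }
        // Not enough tokens: report how long until the deficit refills
        long retryAfterMs = (long) Math.ceil((cost - tokens) / refillPerMs);
        return new Decision(false, retryAfterMs, (long) tokens);
    }
}
```

Doing this in Lua rather than in application code is what makes the bucket safe under concurrency: two instances can never interleave their refill and consume steps.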
If Redis (or the limiter execution path) is unavailable, the service intentionally fails closed to protect downstream systems.
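The fail-closed guard amounts to a thin wrapper around the limiter call. A sketch (names and the default back-off value are hypothetical, not the service's actual API):

```java
import java.util.function.Supplier;

// Hypothetical fail-closed guard: if the Redis-backed limiter throws
// (connection refused, timeout, script error), the request is DENIED
// rather than waved through unmetered.
final class FailClosedLimiter {
    record Decision(boolean allowed, long retryAfterMs, long remaining) {}

    private static final long DEFAULT_RETRY_MS = 1000; // assumed back-off hint

    static Decision enforce(Supplier<Decision> limiterCall) {
        try {
            return limiterCall.get();
        } catch (RuntimeException redisUnavailable) {
            // Fail closed: protect the downstream API when we cannot count
            return new Decision(false, DEFAULT_RETRY_MS, 0);
        }
    }
}
```

The design trade-off: failing open would preserve availability of the protected API's callers, but failing closed preserves the downstream system, which is this service's stated purpose.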
Stores or updates a rule definition.
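A rule payload might look roughly like the following (field names are assumptions for illustration; the actual schema is defined in the repo):

```json
{
  "ruleId": "orders-api-per-user",
  "capacity": 100,
  "refillTokensPerSecond": 10
}
```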
Returns the current rule configuration.
Returns allow / deny decision plus retryAfterMs and remaining tokens.
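A denied decision might look like this (shape assumed from the fields above):

```json
{
  "allowed": false,
  "retryAfterMs": 250,
  "remaining": 0
}
```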
Returns:

- `204 No Content` when allowed (with `RateLimit` headers)
- `429 Too Many Requests` when blocked (with `RateLimit` headers + `Retry-After`)
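A blocked response might look like this (header names follow the IETF RateLimit-header draft; the exact names emitted by this service may differ):

```http
HTTP/1.1 429 Too Many Requests
RateLimit-Limit: 100
RateLimit-Remaining: 0
RateLimit-Reset: 1
Retry-After: 1
```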
You can run this service using Docker (recommended), or locally via Maven.
This repo includes a docker-compose.yml that runs:
- Redis
- `traffic-service` (built from this repo)
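The rough shape of such a compose file, for orientation only (illustrative, not the repo's exact file):

```yaml
services:
  redis:
    image: redis:7
  traffic-service:
    build: .
    ports:
      - "8081:8081"
    environment:
      SPRING_DATA_REDIS_HOST: redis   # service name doubles as hostname
      SPRING_DATA_REDIS_PORT: "6379"
    depends_on:
      - redis
```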
```bash
docker compose up --build
```

If you have a Dockerfile in the repo root:
```bash
docker build -t traffic-service:local .
docker run --rm -p 8081:8081 \
  -e SPRING_DATA_REDIS_HOST=host.docker.internal \
  -e SPRING_DATA_REDIS_PORT=6379 \
  traffic-service:local
```