<!-- File: docs/codedocs/algorithms.md -->
---
title: "Algorithms"
description: "How fixed window, sliding window, token bucket, and cached fixed window algorithms work internally."
---

Algorithms define how tokens are counted and when requests are allowed. In this library, algorithms are factory functions that return an object with `limit`, `getRemaining`, and `resetTokens` methods. The factories live in `src/single.ts` (single region) and `src/multi.ts` (multi region), and their Redis logic lives in `src/lua-scripts/`.

Each algorithm receives a `Context` with a Redis client, a key prefix, and optional cache. The algorithm then calls `safeEval` from `src/hash.ts`, which uses `EVALSHA` with a precomputed hash and falls back to `EVAL` if the script isn’t loaded.
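The `EVALSHA`-with-fallback behavior described above can be sketched as follows. This is an illustrative reimplementation, not the library's exact code: the `RedisLike` interface, the `safeEvalSketch` name, and the `NOSCRIPT` string check are assumptions for the example.

```typescript
// Hedged sketch of the EVALSHA-with-fallback pattern: try the cached script
// first, and resend the full source only if the server reports it is missing.
interface RedisLike {
  evalsha<T>(sha: string, keys: string[], args: unknown[]): Promise<T>;
  eval<T>(script: string, keys: string[], args: unknown[]): Promise<T>;
}

async function safeEvalSketch<T>(
  redis: RedisLike,
  script: { source: string; sha1: string },
  keys: string[],
  args: unknown[]
): Promise<T> {
  try {
    // Fast path: the script is usually already cached on the server.
    return await redis.evalsha<T>(script.sha1, keys, args);
  } catch (error) {
    // Fallback: Redis answers NOSCRIPT when the script is not loaded yet.
    // Sending the full body via EVAL also caches it for subsequent calls.
    if (`${error}`.includes("NOSCRIPT")) {
      return await redis.eval<T>(script.source, keys, args);
    }
    throw error;
  }
}
```

The fallback only ever costs one extra round trip per script per Redis server, after which `EVALSHA` succeeds again.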

```mermaid
flowchart TD
A["limit(identifier)"] --> B{Algorithm}
B -->|fixedWindow| C[INCRBY + PEXPIRE]
B -->|slidingWindow| D[GET current + GET previous]
B -->|tokenBucket| E[HMGET refilledAt/tokens]
C --> F[Compare against limit]
D --> F
E --> F
F --> G[Return success/remaining/reset]
```

## Fixed window
Fixed window is implemented in `src/single.ts` with `SCRIPTS.singleRegion.fixedWindow.*` in `src/lua-scripts/single.ts`. It increments a counter for the current window and rejects when the counter exceeds the limit. The Lua script sets the key’s expiration the first time it is created so each bucket is self‑cleaning.
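Conceptually, every request inside the same window maps to the same counter key, so `INCRBY` on that key counts the whole window. A minimal sketch of the bucketing, assuming a `prefix:identifier:windowIndex` key scheme for illustration (the library's exact key format may differ):

```typescript
// Sketch of fixed-window bucketing: the window index is the number of whole
// windows elapsed since the epoch, so all requests in a window share one key.
function fixedWindowKey(
  prefix: string,
  identifier: string,
  windowMs: number,
  nowMs: number
): string {
  const bucket = Math.floor(nowMs / windowMs); // window index since epoch
  return `${prefix}:${identifier}:${bucket}`;
}
```

Because each key is only touched during its own window, a `PEXPIRE` set on first creation is enough to garbage-collect old buckets.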

**Basic usage**
```ts title="app/ratelimit.ts"
import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";

const ratelimit = new Ratelimit({
redis: Redis.fromEnv(),
limiter: Ratelimit.fixedWindow(100, "1 m")
});

const res = await ratelimit.limit("api_key_123");
```

**Advanced usage (dynamic limits)**
```ts title="app/dynamic.ts"
import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";

const ratelimit = new Ratelimit({
redis: Redis.fromEnv(),
limiter: Ratelimit.fixedWindow(60, "1 m"),
dynamicLimits: true
});

await ratelimit.setDynamicLimit({ limit: 120 });
const res = await ratelimit.limit("user_42");
```

## Sliding window
Sliding window blends current and previous windows to reduce boundary bursts. The Lua script reads both buckets, weights the previous window by how far into the current window you are, and then calculates remaining tokens. See `SCRIPTS.singleRegion.slidingWindow.*` in `src/lua-scripts/single.ts`.
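The weighting step can be sketched as a pure function. This is an illustration of the math described above under assumed names (`slidingWindowCount` is not a library export), assuming windows are aligned to multiples of the window size:

```typescript
// Sketch of the sliding window weighting: the previous window's count is
// scaled by how much of it still falls inside the sliding range, then added
// to the current window's count.
function slidingWindowCount(
  previousCount: number,
  currentCount: number,
  windowMs: number,
  nowMs: number
): number {
  const elapsedInCurrent = nowMs % windowMs; // how far into the current window we are
  const previousWeight = 1 - elapsedInCurrent / windowMs; // share of the previous window still counted
  return Math.floor(previousCount * previousWeight) + currentCount;
}
```

Early in a window the previous count dominates, and its influence decays linearly to zero by the window's end, which is what smooths out boundary bursts.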

**Basic usage**
```ts title="app/ratelimit.ts"
const ratelimit = new Ratelimit({
redis: Redis.fromEnv(),
limiter: Ratelimit.slidingWindow(10, "10 s")
});
```

**Edge case (refunds)**
If you pass a negative `rate`, the algorithm treats it as a refund and skips cache blocking. This is handled in `src/single.ts` by checking `incrementBy > 0` before consulting the cache.

```ts title="app/refund.ts"
const res = await ratelimit.limit("order_77", { rate: -1 });
```

## Token bucket
Token bucket in `src/single.ts` uses a Redis hash to store `refilledAt` and `tokens`. The Lua script refills tokens based on elapsed time, then decrements by the request rate. See `tokenBucketLimitScript` in `src/lua-scripts/single.ts`.
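The refill step can be sketched as follows. The function name and signature are assumptions for illustration; the real logic lives in the Lua script, but the arithmetic is the same idea:

```typescript
// Sketch of token bucket refill: credit whole refill intervals elapsed since
// the last refill, capped at maxTokens, and advance refilledAt accordingly.
function refillTokens(
  tokens: number,
  refilledAt: number,
  nowMs: number,
  refillRate: number,
  intervalMs: number,
  maxTokens: number
): { tokens: number; refilledAt: number } {
  const intervalsElapsed = Math.floor((nowMs - refilledAt) / intervalMs);
  return {
    tokens: Math.min(maxTokens, tokens + intervalsElapsed * refillRate),
    refilledAt: refilledAt + intervalsElapsed * intervalMs,
  };
}
```

After refilling, the request's rate is subtracted from `tokens`; the request succeeds only if enough tokens remain.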

**Basic usage**
```ts title="app/ratelimit.ts"
const ratelimit = new Ratelimit({
redis: Redis.fromEnv(),
limiter: Ratelimit.tokenBucket(5, "10 s", 20)
});
```

**Advanced usage (higher burst)**
```ts title="app/burst.ts"
const ratelimit = new Ratelimit({
redis: Redis.fromEnv(),
limiter: Ratelimit.tokenBucket(2, "1 s", 10)
});
```

## Cached fixed window
`cachedFixedWindow` is a special case that requires an ephemeral cache. It checks the local cache first, increments it optimistically, and updates Redis in the background. This is implemented in `src/single.ts` and uses `cachedFixedWindow*` scripts in `src/lua-scripts/single.ts`.
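The optimistic local step can be sketched as a small pure function over a `Map`. This is a hedged illustration of the idea, not the library's code; in the real implementation the Redis update runs in the background after the local decision:

```typescript
// Sketch of the optimistic cache step: decide locally from the cached counter,
// incrementing before the Redis write completes.
function cachedCheck(
  cache: Map<string, number>,
  key: string,
  limit: number
): { success: boolean; localCount: number } {
  const count = (cache.get(key) ?? 0) + 1; // optimistic increment
  cache.set(key, count);
  return { success: count <= limit, localCount: count };
}
```

Because the decision is made before Redis confirms, other isolates with their own caches can temporarily allow more requests than the limit; see the caveats below.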

**Basic usage**
```ts title="app/worker.ts"
const cache = new Map<string, number>();
const ratelimit = new Ratelimit({
redis: Redis.fromEnv(),
limiter: Ratelimit.cachedFixedWindow(5, "5 s"),
ephemeralCache: cache
});
```

**Advanced usage (fail fast)**
```ts title="app/worker.ts"
try {
const res = await ratelimit.limit("ip:10.0.0.1");
if (!res.success) return new Response("blocked", { status: 429 });
} catch (error) {
// cachedFixedWindow throws if no cache is provided
}
```

<Callout type="warn">`cachedFixedWindow` requires a cache (`ephemeralCache`). If you create the `Ratelimit` instance inside a request handler, the cache resets on every request and you lose the speed benefits. Create the instance outside your handler in serverless or edge functions.</Callout>

<Accordions>
<Accordion title="Fixed Window vs Sliding Window">
Fixed window is cheaper in Redis because it touches a single key per identifier, while sliding window reads two keys and applies a weighting step. That extra read means slightly higher latency, but it produces smoother limiting at window boundaries. If you expect burst traffic aligned to boundaries (cron jobs, marketing campaigns), sliding window reduces spikes. If cost and simplicity matter more than boundary behavior, fixed window is the pragmatic choice.
</Accordion>
<Accordion title="Token Bucket Trade-offs">
Token bucket provides steady throughput and allows bursts by setting `maxTokens` larger than the refill rate, which is excellent for user‑driven traffic. Internally it stores a hash per identifier and updates both `refilledAt` and `tokens`, so it is more stateful than fixed or sliding windows. If you refund tokens with a negative `rate`, the bucket can exceed the refill rate temporarily, which is useful for compensating failures. The trade-off is extra logic and a more complex reset behavior compared to time-bucketed counters.
</Accordion>
<Accordion title="Cached Fixed Window Caveats">
Cached fixed window removes Redis from the critical path on cache hits, which is ideal for hot identifiers in edge environments. However, because the cache is local, consistency is best‑effort and depends on the lifecycle of the runtime. If you run multiple isolates or regions, each has its own cache and can allow more requests than expected until Redis updates converge. Use it only when you can tolerate soft limits and you run in a single isolate or a small number of replicas.
</Accordion>
</Accordions>
<!-- File: docs/codedocs/api-reference/analytics.md -->
---
title: "Analytics"
description: "Analytics helper for recording and aggregating rate limit events."
---

`Analytics` in `src/analytics.ts` wraps `@upstash/core-analytics` and provides a higher‑level interface tailored to rate limit events. It is created automatically by `Ratelimit` when `analytics: true`, but you can also instantiate it directly for custom reporting.

## Constructor
```ts title="src/analytics.ts"
new Analytics(config: AnalyticsConfig)
```

**Parameters**
| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `redis` | `@upstash/redis` client | — | Redis REST client used for analytics storage. |
| `prefix` | `string` | `@upstash/ratelimit` | Namespace for analytics keys. |

## Methods
### `extractGeo`
```ts title="src/analytics.ts"
extractGeo(req: { geo?: Geo; cf?: Geo }): Geo
```
Extracts geo metadata from either `req.geo` (Vercel) or `req.cf` (Cloudflare). If neither is present, returns an empty object.

**Example**
```ts title="app/geo.ts"
const geo = analytics.extractGeo({ cf: { country: "US", city: "NYC" } });
```

### `record`
```ts title="src/analytics.ts"
record(event: Event): Promise<void>
```
Records a single event into the `events` table with identifier, time, success state, and optional geo data.

**Example**
```ts title="app/record.ts"
await analytics.record({
identifier: "user_123",
time: Date.now(),
success: true,
country: "US"
});
```

### `series`
```ts title="src/analytics.ts"
series(filter: TFilter, cutoff: number): Promise<Aggregate[]>
```
Aggregates counts over time for a given field (e.g. identifier, country).

### `getUsage`
```ts title="src/analytics.ts"
getUsage(cutoff?: number): Promise<Record<string, { success: number; blocked: number }>>
```
Returns allowed vs blocked counts grouped by identifier.

### `getUsageOverTime`
```ts title="src/analytics.ts"
getUsageOverTime(timestampCount: number, groupby: TFilter): Promise<Aggregate[]>
```
Aggregates usage over time for a given field.

### `getMostAllowedBlocked`
```ts title="src/analytics.ts"
getMostAllowedBlocked(timestampCount: number, getTop?: number, checkAtMost?: number): Promise<Aggregate[]>
```
Returns top identifiers by allowed/blocked counts.

## Usage with Ratelimit
If you enable analytics in `Ratelimit`, the library calls `analytics.record` after each request and attaches the work to the `pending` promise in the response.

```ts title="app/ratelimit.ts"
const ratelimit = new Ratelimit({
redis: Redis.fromEnv(),
limiter: Ratelimit.slidingWindow(10, "10 s"),
analytics: true
});
```

**Related**
- [Request Lifecycle](../lifecycle)
- [Ratelimit](./ratelimit)
<!-- File: docs/codedocs/api-reference/ip-deny-list.md -->
---
title: "IpDenyList"
description: "Helpers for managing the IP deny list and its refresh lifecycle."
---

`IpDenyList` is exported as a module (`export * as IpDenyList`) from `src/deny-list/ip-deny-list.ts`. It provides functions for refreshing and disabling the IP deny list stored in Redis. These helpers are primarily used internally when protection is enabled but can be called directly in operational workflows.

## Functions
### `updateIpDenyList`
```ts title="src/deny-list/ip-deny-list.ts"
updateIpDenyList(redis: Redis, prefix: string, threshold: number, ttl?: number): Promise<unknown>
```

**Parameters**
| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `redis` | `Redis` | — | Redis REST client. |
| `prefix` | `string` | — | Ratelimit key prefix (default is `@upstash/ratelimit`). |
| `threshold` | `number` | — | Allowed range 1–8. Higher means stricter IP inclusion. |
| `ttl` | `number` | computed | Optional TTL for the status key, otherwise time until next 2 AM UTC. |

**Behavior**
- Fetches a public IP list based on the `threshold` level.
- Removes the old IP list from the combined deny list set.
- Replaces the IP list set and makes it disjoint from custom deny list entries.
- Updates a status key with TTL for future refresh checks.

**Example**
```ts title="app/ops.ts"
import { IpDenyList } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";

const redis = Redis.fromEnv();
await IpDenyList.updateIpDenyList(redis, "@upstash/ratelimit", 6);
```

### `disableIpDenyList`
```ts title="src/deny-list/ip-deny-list.ts"
disableIpDenyList(redis: Redis, prefix: string): Promise<unknown>
```

**Parameters**
| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `redis` | `Redis` | — | Redis REST client. |
| `prefix` | `string` | — | Ratelimit key prefix. |

**Behavior**
- Removes the IP list set from the combined deny list.
- Deletes the IP list set.
- Sets the status key to `disabled` with no TTL.

**Example**
```ts title="app/ops.ts"
await IpDenyList.disableIpDenyList(redis, "@upstash/ratelimit");
```

## Errors
### `ThresholdError`
```ts title="src/deny-list/ip-deny-list.ts"
class ThresholdError extends Error
```
Thrown when `threshold` is outside the allowed range of 1–8.

**Related**
- [Protection and Deny Lists](../protection-denylist)
- [Ratelimit](./ratelimit)
<!-- File: docs/codedocs/api-reference/multi-region-ratelimit.md -->
---
title: "MultiRegionRatelimit"
description: "Multi-region rate limiter with background synchronization and low-latency reads."
---

`MultiRegionRatelimit` in `src/multi.ts` extends the base `Ratelimit` class but uses an array of Redis REST clients (one per region). Each request is issued to every region, the first response wins, and synchronization runs asynchronously.
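The first-response-wins pattern can be sketched with `Promise.race`. This is a simplified illustration of the idea, not the library's implementation; `firstRegionResponse` is an assumed name for the example:

```typescript
// Sketch of first-response-wins: race the per-region requests, return the
// fastest result, and expose the remaining work as a background promise.
async function firstRegionResponse<T>(
  requests: Array<Promise<T>>
): Promise<{ result: T; pending: Promise<unknown> }> {
  const result = await Promise.race(requests);
  // The slower regions keep resolving in the background; callers can await
  // `pending` (e.g. via context.waitUntil) so synchronization can finish.
  const pending = Promise.allSettled(requests);
  return { result, pending };
}
```

This is why latency tracks the nearest region while all regions eventually converge on the same counters.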

## Constructor
```ts title="src/multi.ts"
new MultiRegionRatelimit(config: MultiRegionRatelimitConfig)
```

**Parameters**
| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `redis` | `Redis[]` | — | Array of `@upstash/redis` clients, one per region. |
| `limiter` | `Algorithm<MultiRegionContext>` | — | Algorithm factory, typically `MultiRegionRatelimit.fixedWindow` or `slidingWindow`. |
| `prefix` | `string` | `@upstash/ratelimit` | Key prefix for Redis. |
| `ephemeralCache` | `Map<string, number> \| false` | `new Map()` | Optional local cache to short‑circuit blocked identifiers. |
| `timeout` | `number` | `5000` | Milliseconds to wait before returning a timeout response. |
| `analytics` | `boolean` | `false` | Enable analytics submission. |
| `dynamicLimits` | `boolean` | `false` | Not supported for multi‑region; ignored with a warning. |

## Methods
### `limit`
```ts title="src/ratelimit.ts"
limit(identifier: string, req?: LimitOptions): Promise<RatelimitResponse>
```
Behaves like the single‑region `limit`, but the returned `pending` promise includes the cross‑region synchronization work.

**Example**
```ts title="app/edge.ts"
const res = await ratelimit.limit("api_key_123");
context.waitUntil(res.pending);
```

### `blockUntilReady`
```ts title="src/ratelimit.ts"
blockUntilReady(identifier: string, timeout: number): Promise<RatelimitResponse>
```

### `getRemaining`
```ts title="src/ratelimit.ts"
getRemaining(identifier: string): Promise<{ remaining: number; reset: number; limit: number }>
```

### `resetUsedTokens`
```ts title="src/ratelimit.ts"
resetUsedTokens(identifier: string): Promise<void>
```

### `setDynamicLimit` and `getDynamicLimit`
These methods are inherited but not supported by multi‑region algorithms. If you enable `dynamicLimits` in the constructor you will receive a warning and the algorithms will ignore the dynamic limit key.

## Static algorithm factories
### `fixedWindow`
```ts title="src/multi.ts"
MultiRegionRatelimit.fixedWindow(tokens: number, window: Duration): Algorithm<MultiRegionContext>
```

### `slidingWindow`
```ts title="src/multi.ts"
MultiRegionRatelimit.slidingWindow(tokens: number, window: Duration): Algorithm<MultiRegionContext>
```

**Related**
- [Ratelimit](./ratelimit)
- [Multi-Region Consistency](../multi-region)