Commit ac292d9

rewrite the rate limiting guide
1 parent 945a7f1

1 file changed: +147 -18 lines


docs/guides/rate-limiting.md

---
title: Rate Limiting Guide
id: rate-limiting
---

Rate Limiting, Throttling, and Debouncing are three distinct approaches to controlling function execution frequency. Each technique blocks executions differently, making them "lossy": some function calls will not execute when they are requested to run too frequently. Understanding when to use each approach is crucial for building performant and reliable applications. This guide covers the rate limiting concepts of TanStack Pacer.

## Rate Limiting Concept

Rate Limiting is a technique that limits the rate at which a function can execute over a specific time window. It is particularly useful for scenarios where you want to prevent a function from being called too frequently, such as when handling API requests or other external service calls. It is the most *naive* approach, as it allows executions to happen in bursts until the quota is met.

### Rate Limiting Visualization

```text
Rate Limiting (limit: 3 calls per window)
Timeline: [1 second per tick]
              Window 1                               |   Window 2
Calls:    ⬇️  ⬇️  ⬇️  ⬇️  ⬇️  ⬇️                        ⬇️
Executed: ✅  ✅  ✅  ❌  ❌  ❌                        ✅
[=== 3 allowed ===][=== blocked until window ends ===][=== new window =======]
```
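The burst-then-block pattern in the diagram can be sketched with a minimal fixed-window limiter. This is an illustrative toy, not TanStack Pacer's implementation; the injectable `now` clock is an assumption added so the window reset can be shown deterministically.

```typescript
// Minimal fixed-window rate limiter sketch (illustrative only, NOT the
// TanStack Pacer implementation). `now` is injectable so tests can
// advance time deterministically.
function simpleRateLimit<Args extends unknown[]>(
  fn: (...args: Args) => void,
  options: { limit: number; window: number; now?: () => number },
): (...args: Args) => boolean {
  const now = options.now ?? Date.now
  let windowStart = 0
  let count = 0

  return (...args: Args): boolean => {
    const t = now()
    // Start a fresh window once the previous one has fully elapsed
    if (t - windowStart >= options.window) {
      windowStart = t
      count = 0
    }
    if (count >= options.limit) return false // quota exhausted: call is lost
    count++
    fn(...args)
    return true
  }
}

// Burst behavior: the whole quota can be spent at the start of the window
let clock = 0
const calls: number[] = []
const limited = simpleRateLimit((n: number) => calls.push(n), {
  limit: 3,
  window: 1000,
  now: () => clock,
})

const results = [1, 2, 3, 4, 5].map((n) => limited(n)) // [true, true, true, false, false]
clock = 1000 // the window resets...
limited(6) // ...so this executes again
```

Because the whole quota is available at the start of each window, all allowed calls can land in a burst, which is exactly the behavior the diagram above shows.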

### When to Use Rate Limiting

Rate Limiting is particularly important when dealing with external services, API quotas, or resource-intensive operations where you need to prevent overload.

Common use cases include:

- Enforcing hard API rate limits (e.g., limiting users to 100 requests per hour)
- Managing resource constraints (e.g., database connections or external service calls)
- Scenarios where bursty behavior is acceptable
- Protection against DoS attacks or abuse
- Implementing fair usage policies in multi-tenant systems

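The multi-tenant and per-user cases above usually mean keeping an independent counter per key. A minimal per-key sketch (again illustrative, not Pacer's API; the `keyedRateLimit` helper and injectable clock are invented for this example):

```typescript
// Per-key fixed-window counters: each tenant/user gets an independent quota.
// Illustrative sketch only (NOT Pacer's API); `now` is injectable so tests
// can control time deterministically.
function keyedRateLimit(options: {
  limit: number
  window: number
  now?: () => number
}): (key: string) => boolean {
  const now = options.now ?? Date.now
  const windows = new Map<string, { start: number; count: number }>()

  return (key: string): boolean => {
    const t = now()
    let w = windows.get(key)
    if (!w || t - w.start >= options.window) {
      w = { start: t, count: 0 } // first call for this key, or window expired
      windows.set(key, w)
    }
    if (w.count >= options.limit) return false // this key's quota is spent
    w.count++
    return true
  }
}

let clock = 0
const allow = keyedRateLimit({ limit: 2, window: 1000, now: () => clock })
const tenantA = [allow('tenant-a'), allow('tenant-a'), allow('tenant-a')]
const tenantB = allow('tenant-b') // unaffected by tenant-a's usage
```

One tenant exhausting its quota leaves every other tenant's window untouched, which is the "fair usage" property the list above refers to.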
### When Not to Use Rate Limiting

Rate Limiting is the most naive approach to controlling function execution frequency. It is the least flexible and most restrictive of the three techniques. Consider using [throttling](../guides/throttling) or [debouncing](../guides/debouncing) instead for more evenly spaced executions.

> [!TIP]
> You most likely don't want to use "rate limiting" for most use cases. Consider using [throttling](../guides/throttling) or [debouncing](../guides/debouncing) instead.

Rate Limiting's "lossy" nature also means that some executions will be rejected and lost. This is a problem if you need every requested execution to actually run. Consider using [queueing](../guides/queueing) if you need all executions to be queued up and executed with a throttled delay that slows down the rate of execution.

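To make the lossy-versus-queued distinction concrete, here is a small sketch that parks over-quota calls instead of dropping them, then drains them once the window resets. This is a toy illustration of the idea only; for real use, see Pacer's queueing guide linked above. The `queuedRateLimit` helper and injectable clock are invented for this example.

```typescript
// Sketch contrasting "lossy" rate limiting with a queued alternative:
// over-quota calls are parked, not lost, and drained when the window resets.
// Illustrative only; real queueing lives in TanStack Pacer's queuer.
function queuedRateLimit<T>(
  fn: (arg: T) => void,
  options: { limit: number; window: number; now: () => number },
) {
  let windowStart = 0
  let count = 0
  const pending: T[] = []

  const tryRun = (arg: T): boolean => {
    const t = options.now()
    if (t - windowStart >= options.window) {
      windowStart = t
      count = 0
    }
    if (count >= options.limit) return false
    count++
    fn(arg)
    return true
  }

  return {
    call(arg: T) {
      if (!tryRun(arg)) pending.push(arg) // queued instead of dropped
    },
    // Drain as many queued calls as the current window allows
    flush() {
      while (pending.length > 0 && tryRun(pending[0])) pending.shift()
    },
  }
}

let clock = 0
const ran: string[] = []
const q = queuedRateLimit((s: string) => ran.push(s), {
  limit: 2,
  window: 1000,
  now: () => clock,
})
for (const s of ['a', 'b', 'c', 'd']) q.call(s) // 'c' and 'd' are queued
clock = 1000
q.flush() // window reset: the queued calls now execute
```

With a plain rate limiter, `'c'` and `'d'` would simply be lost; the queued variant trades immediacy for the guarantee that every call eventually runs.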
## Rate Limiting in TanStack Pacer

TanStack Pacer provides a few ways to implement rate limiting: the simple `rateLimit` function for basic usage, the `RateLimiter` class for more advanced control, and framework adapters that build convenient hooks and functions around the `RateLimiter` class.

### Basic Usage with `rateLimit`

The `rateLimit` function is the simplest way to add rate limiting to any function. It's perfect for most use cases where you just need to enforce a simple limit. For example, with a limit of 5 calls per minute, the first 5 calls execute immediately, any further calls in that window are rejected, and a fresh quota becomes available once the window resets.

```ts
import { rateLimit } from '@tanstack/pacer'

// Limit calls to 5 per minute (`fetchUserData` stands in for any function
// you want to rate limit)
const rateLimitedApi = rateLimit(
  (id: string) => fetchUserData(id),
  {
    limit: 5,
    window: 60 * 1000, // 1 minute
  },
)

// The first 5 calls within the window execute immediately
rateLimitedApi('user-1') // ✅ Executes
rateLimitedApi('user-2') // ✅ Executes
rateLimitedApi('user-3') // ✅ Executes
rateLimitedApi('user-4') // ✅ Executes
rateLimitedApi('user-5') // ✅ Executes
rateLimitedApi('user-6') // ❌ Rejected until window resets
```

### Advanced Usage with `RateLimiter` Class

For more complex scenarios where you need additional control over the rate limiting behavior, you can use the `RateLimiter` class directly. This gives you access to additional methods and state information.

```ts
import { RateLimiter } from '@tanstack/pacer'

// Create a rate limiter instance
const limiter = new RateLimiter(
  (id: string) => fetchUserData(id),
  {
    limit: 5,
    window: 60 * 1000,
    onReject: ({ msUntilNextWindow }) => {
      console.log(`Rate limit exceeded. Try again in ${msUntilNextWindow}ms`)
    },
  },
)

// Get information about current state
console.log(limiter.getRemainingInWindow()) // Number of calls remaining in current window
console.log(limiter.getExecutionCount()) // Total number of successful executions
console.log(limiter.getRejectionCount()) // Total number of rejected executions

// Attempt to execute (returns boolean indicating success)
limiter.maybeExecute('user-1')

// Update options dynamically
limiter.setOptions({ limit: 10 }) // Increase the limit

// Reset all counters and state
limiter.reset()
```

### Framework Adapters

Each framework adapter builds convenient hooks and functions around the `RateLimiter` class. Hooks like `useRateLimitedCallback`, `useRateLimitedState`, or `useRateLimitedValue` are small wrappers that can cut down on the boilerplate needed in your own code for some common use cases.

## Synchronous vs Asynchronous Rate Limiting

TanStack Pacer provides both synchronous and asynchronous rate limiting through the `RateLimiter` and `AsyncRateLimiter` classes respectively (and their corresponding `rateLimit` and `asyncRateLimit` functions). Understanding when to use each is important for proper rate limiting behavior.

### Synchronous Rate Limiting

Use the synchronous `RateLimiter` when:

- Your rate-limited function is synchronous (doesn't return a Promise)
- You don't need to wait for the function to complete before counting it as executed
- You want immediate feedback on whether the execution was allowed or rejected

```ts
import { rateLimit } from '@tanstack/pacer'

const rateLimited = rateLimit(
  (data: string) => processData(data),
  {
    limit: 5,
    window: 1000, // 1 second
  },
)

// Returns true if executed, false if rejected
const wasExecuted = rateLimited('some data')
```

### Asynchronous Rate Limiting

Use the `AsyncRateLimiter` when:

- Your rate-limited function returns a Promise
- You need to handle errors from the async function
- You want to ensure proper rate limiting even if the async function takes time to complete

```ts
import { asyncRateLimit } from '@tanstack/pacer'

const rateLimited = asyncRateLimit(
  async (id: string) => {
    const response = await fetch(`/api/data/${id}`)
    return response.json()
  },
  {
    limit: 5,
    window: 1000,
    onError: (error) => {
      console.error('API call failed:', error)
    },
  },
)

// Returns a Promise<boolean> - resolves to true if executed, false if rejected
const wasExecuted = await rateLimited('123')
```

The `AsyncRateLimiter` provides additional features specific to async functions:

- Error handling through the `onError` callback
- Proper async execution tracking
- Returns Promises that resolve to boolean values indicating execution success

### Key Differences

1. **Return Type**:
   - Sync: Returns `boolean` immediately
   - Async: Returns `Promise<boolean>`

2. **Error Handling**:
   - Sync: No built-in error handling
   - Async: Supports `onError` callback for handling rejected promises

3. **Execution Timing**:
   - Sync: Counted as executed immediately
   - Async: Counted as executed when the Promise resolves

4. **Usage Pattern**:
   - Sync: Good for CPU-bound operations or synchronous API calls
   - Async: Better for I/O operations, network requests, or any Promise-based operations

For most use cases, the normal synchronous `RateLimiter` is sufficient. But when you need built-in error handling, or you want to make sure that each execution finishes before the next one starts, the `AsyncRateLimiter` is for you.
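
The differences above can be sketched with a minimal async wrapper. This is an illustrative stand-in, not the real `AsyncRateLimiter`: like `asyncRateLimit` it resolves to a boolean and routes thrown errors to an `onError` callback; the injectable `now` clock is an assumption added for testability.

```typescript
// Illustrative async rate-limit sketch (NOT the real AsyncRateLimiter).
// Resolves to true if the call ran, false if the limiter rejected it.
function simpleAsyncRateLimit<T, R>(
  fn: (arg: T) => Promise<R>,
  options: {
    limit: number
    window: number
    now?: () => number
    onError?: (error: unknown) => void
  },
): (arg: T) => Promise<boolean> {
  const now = options.now ?? Date.now
  let windowStart = 0
  let count = 0

  return async (arg: T): Promise<boolean> => {
    const t = now()
    if (t - windowStart >= options.window) {
      windowStart = t
      count = 0
    }
    if (count >= options.limit) return false // rejected by the limiter
    count++
    try {
      await fn(arg) // awaiting lets callers sequence executions
      return true
    } catch (error) {
      options.onError?.(error) // errors are handled, not rethrown
      return true // the call ran (and failed); the limiter did not reject it
    }
  }
}

const errors: unknown[] = []
const limited = simpleAsyncRateLimit(
  async (n: number) => {
    if (n < 0) throw new Error('bad input')
    return n * 2
  },
  { limit: 2, window: 1000, now: () => 0, onError: (e) => errors.push(e) },
)
```

Awaiting the returned promise gives the `Promise<boolean>` shape from the table: `await limited(1)` resolves to `true`, a third call within the window resolves to `false`, and a thrown error lands in `errors` instead of propagating to the caller.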
