#26 Implemented leaky bucket rate limiter #102
kevkevy3000 wants to merge 1 commit into roma-glushko:main from
Conversation
kevkevy3000
commented
Dec 10, 2023
- Added new leaky bucket rate limiting functionality
- Added test cases for the leaky bucket rate limiting component
```python
class LeakyBucket:
    """
    Leaky Bucket Logic
    Leak tokens as time passes on. If there is space in the bucket, executions can be allowed.
    """
```
@kevkevy3000 according to my understanding, this implementation looks like a token bucket rather than a leaky bucket.
In a leaky bucket, consumers are collected in a bucket as long as the bucket size allows (and get rejected if not). Meanwhile, as time goes on, we let some of these waiting consumers execute. So in some ways, it's a more complicated flow than a token bucket.
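To show what I mean by the distinction: the current code's "take a token if there is space" behavior matches the token bucket pattern, which in its minimal form looks roughly like this (hypothetical names, just an illustration, not the PR's actual code):

```python
import time


class TokenBucket:
    """Minimal token bucket sketch: tokens refill over time, and a
    request succeeds immediately if a token is available, otherwise
    it is rejected. Bursts are allowed up to the bucket capacity."""

    def __init__(self, capacity: float, refill_rate: float) -> None:
        self.capacity = capacity        # max tokens the bucket can hold
        self.refill_rate = refill_rate  # tokens added per second
        self.tokens = capacity
        self.last = time.monotonic()

    def try_take(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(
            self.capacity,
            self.tokens + (now - self.last) * self.refill_rate,
        )
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

Note there is no queue here: a caller either gets through right away or is turned away, so bursts pass through unsmoothed, which is exactly what a leaky bucket is supposed to prevent.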
I would imagine this being implemented in a similar way to asyncio.Queue.
Concretely, we maintain a queue of waiters and add a consumer to the queue on each attempt to "take" a token. It seems like we also need a separate task that wakes up waiters as time goes on.
This way we could really smooth incoming requests.
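A rough sketch of the shape I have in mind (names like `acquire` and `leak_interval` are placeholders, not a final API):

```python
import asyncio


class LeakyBucketSketch:
    """Queue-of-waiters leaky bucket sketch: consumers wait in a
    bounded queue (rejected when it is full), and a background drain
    task wakes one waiter per leak interval, smoothing the rate."""

    def __init__(self, capacity: int, leak_interval: float) -> None:
        self._capacity = capacity
        self._leak_interval = leak_interval
        self._waiters: asyncio.Queue = asyncio.Queue()
        self._drainer = None  # lazily started drain task

    async def acquire(self) -> None:
        if self._waiters.qsize() >= self._capacity:
            raise RuntimeError("bucket full, request rejected")
        fut = asyncio.get_running_loop().create_future()
        await self._waiters.put(fut)
        if self._drainer is None:
            self._drainer = asyncio.create_task(self._drain())
        await fut  # resolved by the drain task at the leak rate

    async def _drain(self) -> None:
        while True:
            fut = await self._waiters.get()
            if not fut.done():
                fut.set_result(None)  # wake the oldest waiter
            await asyncio.sleep(self._leak_interval)
```

The key difference from the current PR: callers that do not fit the instantaneous rate are parked in the queue and released on a steady cadence by `_drain`, rather than admitted or rejected purely on available tokens.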
Does this make sense?