Description
Hi, has anyone solved, or thought about solving, 'dogpiling' with fastapi-cache? The dogpile.cache lib describes the problem nicely: https://dogpilecache.sqlalchemy.org/en/latest/
Briefly, with an initially empty cache, multiple requests can arrive for the same resource while the first one is still being processed. Since nothing is in the cache yet, each later request starts fetching the same resource in parallel. It would be better for the later requests to wait for the first operation to complete and then return the cached result.
The dogpile.cache lib is not async, but the aiocache lib is, and it has an OptimisticLock mechanism that would support this use case: https://aiocache.aio-libs.org/en/latest/locking.html#aiocache.lock.OptimisticLock
Their tests include test_locking_dogpile, using RedLock, https://github.com/aio-libs/aiocache/blob/master/tests/acceptance/test_lock.py#L55
I'm thinking of trying that out in a fastapi-cache cache decorator, so I'm curious whether people have considered this or already have solutions out there.