Throttling for parallel queries #4121
Unanswered
williamjulianvicary
asked this question in Ideas
Replies: 2 comments 2 replies
-
No, I don't think it's possible. You would need to do this at the actual API layer that makes the requests.
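Throttling "at the API layer" can be sketched as a small wrapper that caps in-flight calls. A minimal sketch, assuming a hand-rolled `createLimiter` helper (hypothetical, not a TanStack Query API):

```typescript
// A minimal concurrency limiter: wrap any async function so that at most
// `limit` calls are in flight at the same time.
function createLimiter(limit: number) {
  let active = 0;
  const waiting: Array<() => void> = [];

  const acquire = (): Promise<void> =>
    new Promise(resolve => {
      if (active < limit) {
        active++;
        resolve();
      } else {
        // Park the caller until a slot frees up.
        waiting.push(() => {
          active++;
          resolve();
        });
      }
    });

  const release = (): void => {
    active--;
    const next = waiting.shift();
    if (next) next();
  };

  return async function limited<T>(fn: () => Promise<T>): Promise<T> {
    await acquire();
    try {
      return await fn();
    } finally {
      release();
    }
  };
}
```

You could then do `const limitedFetch = createLimiter(3)` once at module level and call `queryFn: () => limitedFetch(() => hydrateRow(id))` in each query, so no more than three hydration requests hit the endpoint at once (`hydrateRow` is a placeholder for your own fetch function).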
2 replies
-
TanStack now has a new Pacer library which provides async queues. Here's an example of using the Pacer queue with Query: https://tanstack.com/pacer/latest/docs/framework/react/examples/react-query-queued-prefetch. However, I couldn't work out how that really integrates with Query, so I gave it some thought and I'm really happy with what I came up with:

```ts
function useTokenDetails(addresses: string[]) {
  return useQueries({
    queries: addresses.map((address, i) => ({
      queryKey: ['token-details', address],
      queryFn: async () => {
        // Simulate batched requests: release queries in groups
        // of `concurrency`, one group per second.
        const concurrency = 2;
        const delay = Math.floor(i / concurrency) * 1000;
        console.debug('Queued', { address, delay });
        await new Promise(resolve => setTimeout(resolve, delay));
        // Perform the query
        console.debug('Fetching token details for:', address);
        return await getTokenDetails(address);
      },
    })),
  });
}
```
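The delay formula in the comment above groups queries into batches; a tiny helper (a sketch, extracted from that snippet for illustration) shows the schedule it produces:

```typescript
// Index i is held back by floor(i / concurrency) * intervalMs,
// so queries start in batches of `concurrency`.
function batchedDelay(index: number, concurrency: number, intervalMs = 1000): number {
  return Math.floor(index / concurrency) * intervalMs;
}
```

With `concurrency = 2`, indexes 0 and 1 start immediately, 2 and 3 after 1s, 4 and 5 after 2s, and so on. Note that this staggers start times rather than capping in-flight requests, so a slow batch can still overlap the next one.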
0 replies
-
I have a dashboard that loads up to many hundreds of rows of data; once loaded, these rows then need to be hydrated via an API request.
I'd like to throttle the requests to that endpoint to N concurrent requests (say, 3 at a time), because the endpoint it hits actually makes backend requests to other endpoints which impose rate limits.
I don't believe this is currently possible, right?