Issue with Concurrent Requests Handling in Flowise with Azure OpenAI #2632
Hello,
I am currently using Azure OpenAI with the Chat Azure OpenAI node. However, Flowise does not seem to handle concurrent requests from clients: when multiple requests are sent at the same time, they do not appear to be processed in parallel, and the traces in Langfuse suggest the requests are not being handled concurrently.
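For context, this is roughly how the clients call Flowise. The sketch below is only a reproduction of the scenario, assuming the standard Flowise prediction REST endpoint, a local instance on port 3000, and a placeholder chatflow ID:

```ts
// Reproduction sketch (not production code): fire several requests at the
// Flowise prediction endpoint at once and log when each one finishes.
// Assumptions: Flowise runs locally on port 3000, "<chatflow-id>" stands in
// for an actual chatflow ID, and Node 18+ provides the global fetch.
const FLOWISE_URL = "http://localhost:3000/api/v1/prediction/<chatflow-id>";

async function ask(question: string): Promise<void> {
  const start = Date.now();
  const res = await fetch(FLOWISE_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ question }),
  });
  await res.json();
  console.log(`"${question}" finished after ${Date.now() - start} ms`);
}

// Send five questions simultaneously. If Flowise processed them in parallel,
// the completion times would overlap instead of adding up one after another.
await Promise.all([1, 2, 3, 4, 5].map((n) => ask(`Test question ${n}`)));
```

With a test like this, the completions show up one after another in the Langfuse traces rather than overlapping.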
Is there a recommended configuration or method to resolve this issue? How can I ensure that Flowise can handle concurrent requests effectively?
Thank you.