Socket IO - Connection Error (when I make more than 100 concurrent connections) #1942
Replies: 2 comments 8 replies
-
100 seems a little low. I can't remember exactly what limits I hit in my load testing from over a year ago (I was testing my whole application, not just connections), but I think it was more like 200 for me. In any case, I always hit some other bottleneck after increasing the max open file limit. I got egg on my face once for suggesting that just raising the max files limit would fix it; after testing, I realized that alone was never going to work. My approach is load balancing with session affinity across multiple smaller machines instead of one large one. I don't need to communicate between all the connections, though: I can either group them smartly at the load balancer (so the small set of connections that need to talk to each other all land on the same instance), or they just do their own thing and don't communicate with each other. If you do need cross-instance communication, Flask-SocketIO can use a message queue such as Redis.
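The "group them smartly" idea above can be sketched as plain hash-based affinity: derive the backend instance from a group key (a room id, tenant id, etc.) so that every connection in the same group lands on the same machine. The instance names and the key below are illustrative, not from this thread.

```python
import hashlib

# Hypothetical backend pool behind the load balancer.
INSTANCES = ["ws-1.internal", "ws-2.internal", "ws-3.internal"]

def instance_for(group_key: str) -> str:
    """Map a group key (room id, tenant id, ...) to one backend, so all
    connections that must talk to each other share the same host."""
    digest = hashlib.sha256(group_key.encode()).digest()
    return INSTANCES[int.from_bytes(digest[:8], "big") % len(INSTANCES)]

# Every member of the same group resolves to the same instance.
print(instance_for("room-42"), instance_for("room-42"))
```

If cross-instance communication is unavoidable, Flask-SocketIO's `message_queue` option (pointing at, e.g., a Redis URL) lets separate server instances broadcast events to each other's clients instead.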
-
Flask-SocketIO does not have any connection limits of its own. I suggest you review your system logs to help you determine what's causing these failures.
-
I am currently load testing my Flask-SocketIO server, which is deployed on EC2 with Gunicorn, gevent, and Nginx.
Here are the configurations:
gunicorn --threads 1 --worker-class geventwebsocket.gunicorn.workers.GeventWebSocketWorker -w 1 application:app -b 0.0.0.0:5000 --worker-connections 8000
My Nginx config is this
I am using gevent to handle requests asynchronously -> application.py
Here are some of the events defined in the code that I am testing:
connect(): -> checks user authorization
punchEvent(): -> calls the PostgreSQL database and does some action -> basically adds data into AWS SQS
Now the client-side script
I have an AWS test Lambda function with a concurrency of 1000 attached to an SQS queue. When I insert 800 objects into the queue and the Lambda function processes them, it is able to successfully create x connections, but the rest fail with Connection Error.
I also tried increasing the max open file limit.
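For context, the open-file limit mentioned here can be inspected and exercised from Python itself. This stdlib-only sketch (Unix-only; the value 64 is just an illustrative cap) shows why the soft limit bounds concurrent connections: every socket holds at least one file descriptor, and once the limit is reached the OS refuses new ones with EMFILE.

```python
import resource
import socket

# Current per-process limits on open file descriptors (soft, hard).
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"open-file limits: soft={soft}, hard={hard}")

# Demonstration: shrink the soft limit, then open socket pairs until
# the OS refuses with EMFILE ("Too many open files").
low = 64 if hard == resource.RLIM_INFINITY else min(64, hard)
resource.setrlimit(resource.RLIMIT_NOFILE, (low, hard))

pairs = []
try:
    while True:
        pairs.append(socket.socketpair())  # consumes two descriptors
except OSError as exc:
    print(f"ran out after {len(pairs)} pairs: {exc}")
finally:
    for a, b in pairs:
        a.close()
        b.close()
    # Restore the original soft limit. An unprivileged process may
    # raise its soft limit up to (but not beyond) the hard limit.
    resource.setrlimit(resource.RLIMIT_NOFILE, (soft, hard))
```

Raising the soft limit (via `ulimit -n` or `LimitNOFILE` in a systemd unit) is necessary for thousands of connections, but as the reply above notes, it is usually not sufficient on its own.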
The above scripts work fine when I make a single connection at a time in a loop.
I thought Nginx was the bottleneck, but I tried the same test directly against port 5000 without Nginx and the issue was the same.
Please suggest anything so I can reach at least 5k concurrent connections in parallel.