Replies: 8 comments
-
Hi, yesterday I load-tested the two WSGI servers; unfortunately, Waitress defeated FastWSGI :)
-
FastWSGI could not withstand the high load.
-
I don't know where else to write, so I wrote it here :)
-
How many threads (processes) were used for the test?
-
Apparently I was running in 1 thread. How can I run in multiple threads?
-
Under Windows, FastWSGI can currently only be used in single-process mode (1 thread). Therefore, for a fair comparison, it is worth limiting the other servers to a single thread as well.
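For reference, a minimal sketch of what "limiting to a single thread" could look like, assuming the documented entry points of each server (this is not code from the thread):

# Minimal sketch: pin Waitress and Werkzeug to a single worker thread
# so they match FastWSGI's single-process mode on Windows.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "hello"

def serve_fastwsgi(host, port):
    import fastwsgi
    # Basic run form from the FastWSGI README; single process/thread on Windows.
    fastwsgi.run(wsgi_app=app, host=host, port=port)

def serve_waitress(host, port):
    from waitress import serve
    serve(app, host=host, port=port, threads=1)  # Waitress defaults to 4 worker threads

def serve_werkzeug(host, port):
    from werkzeug.serving import run_simple
    run_simple(host, port, app, threaded=False, processes=1)  # single-threaded dev server

if __name__ == "__main__":
    serve_waitress("0.0.0.0", 5000)

Waitress's threads argument and Werkzeug's threaded/processes arguments are documented options of those servers; adjusting them is how the three configurations can be put on an equal footing.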
-
Client: i7-980X @ 2.8GHz, Debian 12, Python 3.10, NIC Intel x550-t2 10Gbps
Server: i7-980X @ 2.8GHz, Windows 7, Python 3.8, NIC Intel x550-t2 10Gbps
Payload for testing: https://github.com/MiloszKrajewski/SilesiaCorpus/blob/master/xml.zip (651 KiB)
Server test app: https://gist.github.com/remittor/1f2bc834852009631d437cd96822afa4
FastWSGI + Flask
python.exe server.py -h 172.16.220.205 -g fw -f xml.zip -b
nginx.exe
Werkzeug + Flask
python.exe server.py -h 172.16.220.205 -g wz -f xml.zip -b
Waitress + Flask
python.exe server.py -h 172.16.220.205 -g wr -f xml.zip -b
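For readers who don't want to open the gist, here is a hypothetical reconstruction of what such a server.py could look like. The flag meanings (-h host, -g gateway fw/wz/wr, -f payload file) are inferred from the commands above, the -b switch is omitted because its purpose isn't stated here, and none of this is the actual gist code:

# Hypothetical sketch, NOT the linked gist: a Flask app that returns a fixed
# payload, with the serving WSGI server selected by a command-line flag.
import argparse
from flask import Flask, Response

def make_app(payload_path):
    app = Flask(__name__)
    with open(payload_path, "rb") as f:
        payload = f.read()

    @app.route("/")
    def index():
        # Return the same static payload on every request.
        return Response(payload, mimetype="application/zip")

    return app

def main():
    parser = argparse.ArgumentParser(add_help=False)  # free up -h for the host flag
    parser.add_argument("-h", dest="host", default="127.0.0.1")
    parser.add_argument("-g", dest="gateway", choices=["fw", "wz", "wr"], default="fw")
    parser.add_argument("-f", dest="payload", required=True)
    args = parser.parse_args()

    app = make_app(args.payload)
    port = 5000  # port is an assumption; the gist may differ

    if args.gateway == "fw":
        import fastwsgi
        fastwsgi.run(wsgi_app=app, host=args.host, port=port)
    elif args.gateway == "wz":
        from werkzeug.serving import run_simple
        run_simple(args.host, port, app)
    else:  # "wr"
        from waitress import serve
        serve(app, host=args.host, port=port)

if __name__ == "__main__":
    main()

Usage would mirror the commands above, e.g. python server.py -h 172.16.220.205 -g fw -f xml.zip.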