Benchmark: Socketify vs FastWSGI #110

@remittor

Machine: AMD EPYC 7543 @ 3.7GHz, Debian 11, Python 3.9

Payload for testing: https://github.com/MiloszKrajewski/SilesiaCorpus/blob/master/xml.zip (651 KiB)

Server test app: https://gist.github.com/remittor/c9411e62b5ea4776200bee288a331016

FastWSGI project: https://github.com/jamesroberts/fastwsgi
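
For context, the linked gist holds the actual server test app; judging from the transfer figures, the `-f xml.zip -b` runs serve the 651 KiB payload on every request, while the plain runs return a short body. A minimal sketch of such a WSGI app, with hypothetical names (`PAYLOAD`, `load_payload`) and assumed flag handling, not the gist itself:

```python
# Illustrative WSGI test app; names and flag handling are assumptions,
# the real app is in the linked gist.
import sys

def load_payload(path):
    # Read the benchmark payload (e.g. xml.zip) into memory once.
    with open(path, "rb") as f:
        return f.read()

PAYLOAD = b"Hello, World!"
if "-f" in sys.argv:
    PAYLOAD = load_payload(sys.argv[sys.argv.index("-f") + 1])

def app(environ, start_response):
    # Plain WSGI callable: fixed status and body, so wrk measures the server, not the app.
    start_response("200 OK", [
        ("Content-Type", "application/octet-stream"),
        ("Content-Length", str(len(PAYLOAD))),
    ])
    return [PAYLOAD]
```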


Socketify (multi-threaded)

> python3 server.py -g si -t8

> wrk -t1 -c1 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  1 threads and 1 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    49.60us   61.19us   4.78ms   99.30%
    Req/Sec    20.79k     2.32k   26.18k    78.15%
  312288 requests in 15.10s, 46.76MB read
Requests/sec:  20681.11
Transfer/sec:      3.10MB
> wrk -t8 -c8 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  8 threads and 8 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    61.00us   64.76us   5.04ms   98.97%
    Req/Sec    16.78k     3.24k   20.77k    70.45%
  2016690 requests in 15.10s, 301.95MB read
Requests/sec: 133554.59
Transfer/sec:     20.00MB
> wrk -t8 -c128 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  8 threads and 128 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   454.64us  109.56us   3.29ms   72.81%
    Req/Sec    35.11k     1.94k   42.48k    67.61%
  4216557 requests in 15.10s, 631.33MB read
Requests/sec: 279246.40
Transfer/sec:     41.81MB

> python3 server.py -g si -t8 -f xml.zip -b

> wrk -t1 -c1 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  1 threads and 1 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   230.31us  458.91us  15.63ms   99.34%
    Req/Sec     4.95k     1.12k    6.28k    56.29%
  74349 requests in 15.10s, 46.20GB read
Requests/sec:   4923.96
Transfer/sec:      3.06GB
> wrk -t8 -c8 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  8 threads and 8 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   534.77us  445.86us  11.07ms   93.31%
    Req/Sec     1.98k   764.95     3.56k    61.92%
  238315 requests in 15.10s, 148.09GB read
Requests/sec:  15782.65
Transfer/sec:      9.81GB

Socketify (single-threaded)

> python3 server.py -g si -t1

> wrk -t1 -c1 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  1 threads and 1 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    54.19us   58.65us   4.83ms   99.43%
    Req/Sec    18.85k   763.04    19.83k    84.00%
  281228 requests in 15.00s, 42.11MB read
Requests/sec:  18748.14
Transfer/sec:      2.81MB
> wrk -t8 -c8 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  8 threads and 8 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   281.61us   73.64us   4.76ms   92.65%
    Req/Sec     3.57k   320.89     6.26k    82.82%
  427728 requests in 15.10s, 64.04MB read
Requests/sec:  28326.63
Transfer/sec:      4.24MB
> wrk -t8 -c128 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  8 threads and 128 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     3.98ms  239.06us   8.56ms   87.62%
    Req/Sec     4.04k   114.86     4.36k    59.17%
  482031 requests in 15.01s, 72.17MB read
Requests/sec:  32119.73
Transfer/sec:      4.81MB

> python3 server.py -g si -t1 -f xml.zip -b

> wrk -t1 -c1 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  1 threads and 1 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   342.26us    1.07ms  23.18ms   98.60%
    Req/Sec     4.26k     1.53k    6.32k    50.99%
  63969 requests in 15.10s, 39.75GB read
Requests/sec:   4236.57
Transfer/sec:      2.63GB
> wrk -t8 -c8 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  8 threads and 8 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     2.11ms  452.35us   7.02ms   89.65%
    Req/Sec   474.80     68.24   626.00     74.58%
  56754 requests in 15.01s, 35.27GB read
Requests/sec:   3781.08
Transfer/sec:      2.35GB

FastWSGI (single-threaded)

> python3 server.py -g fw -t1

> wrk -t1 -c1 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  1 threads and 1 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    53.38us  208.03us  10.83ms   99.68%
    Req/Sec    21.72k     0.88k   22.89k    89.40%
  326314 requests in 15.10s, 31.74MB read
Requests/sec:  21610.83
Transfer/sec:      2.10MB
> wrk -t8 -c8 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  8 threads and 8 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   617.78us    5.29ms  94.86ms   98.96%
    Req/Sec     7.39k     1.41k    9.34k    75.91%
  888591 requests in 15.10s, 86.44MB read
Requests/sec:  58847.30
Transfer/sec:      5.72MB
> wrk -t8 -c128 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  8 threads and 128 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     2.08ms    3.44ms  66.57ms   98.49%
    Req/Sec     9.19k     1.48k   38.75k    94.01%
  1098912 requests in 15.10s, 106.90MB read
Requests/sec:  72777.04
Transfer/sec:      7.08MB

> python3 server.py -g fw -t1 -f xml.zip -b

> wrk -t1 -c1 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  1 threads and 1 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   180.08us  226.19us   9.09ms   99.12%
    Req/Sec     5.94k     1.23k    7.08k    81.46%
  89346 requests in 15.10s, 55.52GB read
Requests/sec:   5917.17
Transfer/sec:      3.68GB
> wrk -t8 -c8 -d15 http://127.0.0.1:5000
Running 15s test @ http://127.0.0.1:5000
  8 threads and 8 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.50ms  793.92us  46.30ms   86.38%
    Req/Sec   676.81    155.80     0.93k    75.75%
  80895 requests in 15.02s, 50.27GB read
Requests/sec:   5387.46
Transfer/sec:      3.35GB

Conclusion:

The socketify project is very fast when multithreading is used, but in single-threaded mode the FastWSGI server turned out to be faster.
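
The gap between the two modes likely reflects how the servers are launched: socketify can fan one WSGI app out over several workers from a single script, while the FastWSGI runs above use a single process. A hedged sketch of both launch paths, assuming the upstream-documented entry points (`socketify.WSGI(...).run(workers=N)` and `fastwsgi.run(...)`) and a guessed mapping from the `-g`/`-t` flags:

```python
# Launch sketches for both servers; the -g si/fw and -t N flag mapping is an assumption.
from socketify import WSGI
import fastwsgi

def app(environ, start_response):
    # Trivial WSGI callable standing in for the gist's test app.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello, World!"]

def run_socketify(workers=8, port=5000):
    # Roughly `-g si -t8`: one listen socket shared by N socketify workers.
    WSGI(app).listen(
        port, lambda config: print(f"socketify listening on :{config.port}")
    ).run(workers=workers)

def run_fastwsgi(port=5000):
    # Roughly `-g fw -t1`: a single FastWSGI process and event loop.
    fastwsgi.run(wsgi_app=app, host="127.0.0.1", port=port)
```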
