[Laravel/ripple] Update some configurations to be compatible with the v0.6 #9300
Conversation
In one test I got the following results. The difference was huge, but no unexpected exceptions occurred; the only change was the extra command-line parameters.
Based on the conclusion above (Ripple currently does not support submitting another request on the same keep-alive connection before the previous response has finished): if we focus only on that limitation, is there a clear correlation with the large difference between the following test results? Thank you for your guidance.
wrk -H 'Host: tfb-server' -H 'Accept: text/plain,text/html;q=0.9,application/xhtml+xml;q=0.9,application/xml;q=0.8,*/*;q=0.7' -H 'Connection: keep-alive' --latency -d 15 -c 512 --timeout 8 -t 16 http://tfb-server:8080/plaintext
---------------------------------------------------------
Running 15s test @ http://tfb-server:8080/plaintext
16 threads and 512 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 76.46ms 23.91ms 824.06ms 78.31%
Req/Sec 409.19 73.58 633.00 78.35%
Latency Distribution
50% 75.06ms
75% 87.18ms
90% 100.04ms
99% 132.38ms
97821 requests in 15.09s, 19.03MB read
Requests/sec: 6482.30
Transfer/sec: 1.26MB
---------------------------------------------------------
Concurrency: 256 for plaintext
wrk -H 'Host: tfb-server' -H 'Accept: text/plain,text/html;q=0.9,application/xhtml+xml;q=0.9,application/xml;q=0.8,*/*;q=0.7' -H 'Connection: keep-alive' --latency -d 15 -c 256 --timeout 8 -t 16 http://tfb-server:8080/plaintext -s pipeline.lua -- 16
---------------------------------------------------------
Running 15s test @ http://tfb-server:8080/plaintext
16 threads and 256 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 0.00us 0.00us 0.00us -nan%
Req/Sec 157.38 3.05 160.00 87.50%
Latency Distribution
50% 0.00us
75% 0.00us
90% 0.00us
99% 0.00us
256 requests in 15.06s, 51.00KB read
Requests/sec: 17.00
Transfer/sec: 3.39KB
STARTTIME 1727616870
ENDTIME 1727616885
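For context, besides the lower concurrency the second run also passes '-s pipeline.lua -- 16' to wrk, which makes each connection write a batch of 16 requests before reading any responses. The exact contents of TFB's pipeline.lua are not reproduced here, but a wrk pipelining script of this kind looks roughly like the following sketch (the trailing '-- 16' becomes the depth argument):

-- Rough sketch of a wrk pipelining script (assumed to resemble pipeline.lua; not the verbatim file)
init = function(args)
  -- depth comes from the arguments after "--" on the wrk command line, e.g. "-- 16"
  local depth = tonumber(args[1]) or 1
  local r = {}
  for i = 1, depth do
    r[i] = wrk.format()   -- one copy of the configured GET request
  end
  req = table.concat(r)   -- the whole batch concatenated into a single write
end

request = function()
  return req              -- each "request" wrk issues is actually the full pipelined batch
end

If the server handles only one in-flight request per keep-alive connection, as described above, it would answer the first request of the batch and never drain the rest, which would be consistent with the second run completing roughly one request per connection (256 requests across 256 connections).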
In the last test you didn't only add pipeline.lua, you also lowered the concurrency from 512 to 256. So it's normal to have fewer req/s in the last test; less concurrency means fewer req/s. Test it again with the same concurrency.
Thank you for your attention. The test is consistent with my inference, but the short-term focus is not on solving pipeline.lua support. Do you have any other suggestions for this commit?
I did that; currently it follows a uniform code. Maybe the
No; if the '.env' configuration is dropped, its listening address will be 127.0.0.1 instead of 0.0.0.0. I'll do better next time, excuse me; it urgently needs to use the latest approach.
Update some configurations to be compatible with the v0.6 core; this version addresses some of the exceptions that occurred earlier.