This repository contains simple HTTP servers written in Go, PHP, and Node.js. The goal is to compare the performance of these servers by benchmarking them using wrk.
Use the instructions below to run and benchmark each server.
- Install Go (ideally Go 1.20 or later)
- Install PHP (ideally PHP 8.2 or later)
- Install Node.js (ideally Node.js 20 or later)
- Install OpenSwoole (needed to run the Swoole server)
- Install `wrk` for benchmarking
On Linux, you can install wrk with the following commands:
```sh
sudo apt-get update
sudo apt-get install build-essential libssl-dev git -y
git clone https://github.com/wg/wrk.git
cd wrk
make
sudo cp wrk /usr/local/bin
```

Navigate to the go directory and run the following command:

```sh
go run main.go
```

The server will start on http://localhost:8080.
Navigate to the php directory and run the following command:

```sh
php -S localhost:8080
```

The server will start on http://localhost:8080.
Navigate to the node directory and run the following command:

```sh
node server.js
```

The server will start on http://localhost:8080.
Navigate to the swoole directory and run the following command:

```sh
php -d extension=swoole.so index.php
```

The server will start on http://localhost:8080.
Use wrk to benchmark the servers. Open a new terminal window and run the following command:

```sh
wrk -t12 -c400 -d30s http://localhost:8080
```

This command will run wrk with:

- `-t12`: 12 threads
- `-c400`: 400 connections
- `-d30s`: a duration of 30 seconds
The output will show statistics like requests per second, latency, and more. Example:
```
Running 30s test @ http://localhost:8080
  12 threads and 400 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    10.09ms    4.95ms   45.79ms   65.22%
    Req/Sec     3.33k     1.33k     8.18k    69.19%
  1194183 requests in 30.10s, 152.23MB read
Requests/sec:  39694.77
Transfer/sec:      5.06MB
```
- Ensure that only one server is running at a time on port 8080 to avoid conflicts.
- Adjust the parameters of the `wrk` command as needed to better understand the server's capabilities under different loads.
Feel free to fork the project, make optimizations, and submit pull requests with any enhancements or bug fixes.