The server can be quite slow when accessed by multiple simultaneous users. For instance, here are load tests for a few pages with drill.
benchmark.yml:
```yaml
---
concurrency: 30
base: 'https://ramp.studio'
iterations: 32
rampup: 5
plan:
  - name: Fetch homepage
    request:
      url: /
  - name: Fetch event
    request:
      url: /events/{{ item }}
    with_items:
      - air_passengers_dssp_14
      - air_passengers_m2mosef2020
      - air_passengers_py4ds2020
  - name: Fetch problems
    request:
      url: /problems
```
Running `drill --benchmark benchmark.yml --stats` with 30 concurrent connections produces:
| Plan | Total requests | Median time per request |
|---|---|---|
| Fetch homepage | 32 | 155ms |
| Fetch event | 96 | 109ms |
| Fetch problems | 32 | 2033ms |
So the issue appears to be specific to the `/problems` page; I suspect the cause is a DB query inside a double loop here.
Two possible solutions:
- refactor that code so it does not issue so many DB queries
- cache the whole page, either with nginx or something like caddy + caddy-cache
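To illustrate the refactoring option: the slow pattern is typically an "N+1" query, where a query runs once per item inside a loop, and the fix is to fetch everything in a single joined query. The sketch below uses an in-memory SQLite database with hypothetical `problems`/`events` tables; the table and column names are illustrative only, not the actual RAMP schema.

```python
import sqlite3

# Hypothetical schema standing in for the real tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE problems (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE events (id INTEGER PRIMARY KEY, problem_id INTEGER, title TEXT);
INSERT INTO problems VALUES (1, 'air_passengers'), (2, 'titanic');
INSERT INTO events VALUES (1, 1, 'dssp_14'), (2, 1, 'py4ds2020'), (3, 2, 'teaching');
""")

def listing_n_plus_one(conn):
    # Anti-pattern: one extra query per problem inside the loop,
    # so N problems cost N+1 round trips to the database.
    out = []
    for pid, name in conn.execute("SELECT id, name FROM problems"):
        events = [t for (t,) in conn.execute(
            "SELECT title FROM events WHERE problem_id = ?", (pid,))]
        out.append((name, events))
    return out

def listing_single_query(conn):
    # Fix: fetch all rows with one JOIN and group them in Python.
    out = {}
    for name, title in conn.execute(
            "SELECT p.name, e.title FROM problems p "
            "JOIN events e ON e.problem_id = p.id ORDER BY p.id"):
        out.setdefault(name, []).append(title)
    return list(out.items())

# Both versions return the same listing; only the query count differs.
assert listing_n_plus_one(conn) == listing_single_query(conn)
```

With an ORM such as SQLAlchemy the same idea is usually expressed with an eager-loading option on the relationship rather than a hand-written JOIN, but the effect is the same: one query instead of one per row.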