
Commit 3aba6c3

docs(srv): update doc on container concurrency MTA-5283 (#4002)
* docs(srv): update doc on container concurrency MTA-5283
* Update serverless/containers/reference-content/containers-concurrency.mdx
Co-authored-by: nerda-codes <[email protected]>
* fix(srv): update
---------
Co-authored-by: nerda-codes <[email protected]>
1 parent 08de437 commit 3aba6c3

File tree

2 files changed: +56 -4 lines changed


serverless/containers/concepts.mdx

Lines changed: 2 additions & 0 deletions
@@ -27,6 +27,8 @@ Startup process steps are:

Concurrency defines the number of simultaneous requests a single instance of your container can handle at the same time. Once the number of incoming requests exceeds this value, your container scales according to your parameters.

+Refer to the [dedicated documentation](/serverless/containers/reference-content/containers-concurrency/) for more information on container concurrency.
+
## Container

A container is a package of software that includes all dependencies: code, runtime, configuration, and system libraries so that it can run on any host system. Scaleway provides custom Docker images that are entirely handled for you in the cloud. With Containers, you can rely on your favorite technologies such as Django, or Ruby on Rails.
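
A rough illustration of the scaling rule described in the change above (a minimal sketch, not part of the commit, assuming the instance count is driven only by the concurrency threshold; real deployments also respect the configured minimum and maximum scale and other autoscaling parameters):

```python
import math

def instances_needed(concurrent_requests: int, concurrency: int = 80) -> int:
    """Rough estimate of the instances required to absorb a number of
    simultaneous requests, before min/max scale limits are applied."""
    return max(1, math.ceil(concurrent_requests / concurrency))

print(instances_needed(200))      # default concurrency of 80 -> 3 instances
print(instances_needed(200, 1))   # concurrency of 1 -> 200 instances
```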

serverless/containers/reference-content/containers-concurrency.mdx

Lines changed: 54 additions & 4 deletions
@@ -16,18 +16,68 @@ categories:

## Concurrency overview

-Concurrency determines the number of incoming requests a single instance of a container can process before scaling up. Serverless Containers autoscale according to the number of instances needed to handle the incoming workload.
+Concurrency determines the number of incoming requests a single instance of a container can process before scaling up. Serverless Containers automatically scale according to the number of instances needed to handle the incoming workload.

A higher number of instances processing requests at the same time implies a greater usage of memory and [vCPU](/serverless/containers/concepts/#vcpu), and consequently a higher cost.

-## Maximum concurrent requests per instance
+```
+Concurrency = 1
+
+              ┌─────────────┐
+              │   ┌──────┐  │
+           ┌──┼──►│      │  │
+           │  │   └──────┘  │
+           │  │   ┌──────┐  │
+3 requests ┼──┼──►│      │  │
+           │  │   └──────┘  │
+           │  │   ┌──────┐  │
+           └──┼──►│      │  │
+              │   └──────┘  │
+              └─────────────┘
+                1 service
+                3 container instances
+
+Concurrency = 80
+
+              ┌─────────────┐
+              │   ┌──────┐  │
+           ┌──┼──►│      │  │
+           │  │   │      │  │
+3 requests ┼──┼──►│      │  │
+           │  │   │      │  │
+           └──┼──►│      │  │
+              │   └──────┘  │
+              └─────────────┘
+                1 service
+                1 container instance
+```

-When [deploying a container](/serverless/containers/how-to/deploy-container/), Scaleway Serverless Containers allows you to edit the **Maximum concurrent requests per instance** parameter.
+## Concurrency setting

-Serverless Containers apply the highest possible value by default: **80 concurrent requests**. We recommend using the default value, as Serverless Containers are designed to efficiently function with it, but you can lower it to better fit specific requirements.
+When [deploying a container](/serverless/containers/how-to/deploy-container/), Scaleway Serverless Containers allows you to configure the [concurrency](/serverless/containers/concepts/#concurrency), which is the maximum number of requests a single container instance can handle at the same time. The concurrency setting ranges from 1 to 1,000 simultaneous instances.
+
+By default, Serverless Containers allow **80 concurrent requests** per container instance. We recommend using the default value, as Serverless Containers are designed to efficiently function with it, but you can lower it to better fit specific requirements.
+
+## Impact of concurrency on container scaling
+
+If the number of incoming requests exceeds the set concurrency value for a container, Serverless Containers automatically creates new container instances to handle the additional load.
+
+For example, if a container is set to handle 10 concurrent requests, and 50 requests or events arrive at the same time, Serverless Containers will create at least 5 container instances to handle the incoming load.
+
+## Benefits of concurrency
+
+**Efficiency**: Setting a higher concurrency value allows a single container instance to handle more requests, which can reduce the total number of container instances, therefore reducing costs.
+
+**Responsiveness**: Setting a lower concurrency (down to 1 concurrent container instance) allows to reduce latency for individual requests, since the container focuses on fewer tasks at a time.
+
+## Implications for your service
+
+**Compute-bound workloads**: Workloads relying on heavy CPU or memory usage per request benefit from lower concurrency (e.g., 1 or 2) to avoid resource contention within the container.
+
+**I/O-bound workloads**: Workloads relying on smaller input/output operations, such as waiting for external APIs or databases benefit from higher concurrency, since the container can handle multiple requests while waiting on I/O.

## Examples

+### Low concurrency
+
If your Serverless Container hosts a resource-intensive application that can only handle a small number of requests with the provisioned memory and vCPU, you can reduce the number of maximum concurrent requests per instance to scale quicker and avoid bottlenecks and queued requests.

You can set the **Maximum concurrent requests per instance** of your container to `1` if it is designed to handle a single request at a time. However, a low maximum concurrency value may affect the performance of your deployment, as several instances of your container will start if it receives a high number of concurrent requests.
+
+### High concurrency
+
+If your Serverless Container hosts an application that receives a high number of requests that are easily processed with the provisioned memory and vCPU, you can set a greater number of maximum concurrent requests per instance.
+
+By setting a greater concurrency value than the average number of concurrent requests your container can handle, it will only scale up in the event of a spike in received requests.
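
A rough illustration of why I/O-bound workloads tolerate a high concurrency setting, as the new section explains (a minimal sketch, not part of the commit, assuming a handler that spends most of its time waiting on external I/O):

```python
import asyncio
import time

async def handle_request(i: int) -> None:
    # Stand-in for an I/O-bound handler: most of the request time is
    # spent waiting on an external API or database, not on the CPU.
    await asyncio.sleep(0.5)

async def main() -> None:
    start = time.perf_counter()
    # With a high concurrency setting (e.g. the default of 80), a single
    # instance can keep many requests in flight while they wait on I/O.
    await asyncio.gather(*(handle_request(i) for i in range(80)))
    print(f"80 I/O-bound requests handled by one worker in "
          f"{time.perf_counter() - start:.2f}s")  # ~0.5s, not 80 * 0.5s

asyncio.run(main())
```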

0 commit comments
