69 changes: 68 additions & 1 deletion faq/serverless-containers.mdx
@@ -10,6 +10,49 @@ category: serverless
productIcon: ContainersProductIcon
---

## What is serverless computing and how does it differ from traditional cloud hosting?

Serverless computing is a cloud execution model where the cloud provider dynamically manages the allocation of compute resources. Unlike traditional hosting models, you don’t need to provision, scale, or maintain servers. Instead, you focus solely on writing and deploying your code, and the infrastructure scales automatically to meet demand.

## Why consider using Serverless Containers, Functions or Jobs for my projects?

These services allow you to build highly scalable, event-driven, and pay-as-you-go solutions. Serverless Containers and Functions help you create applications and microservices without worrying about server management, while Serverless Jobs let you run large-scale, parallel batch processing tasks efficiently. This can lead to faster development cycles, reduced operational overhead, and cost savings.

## Can I run any application on Serverless Containers?

Yes. Because Serverless Containers supports any containerized application, you can choose the language, runtime, and framework that best suits your needs. As long as it can run in a container and respond to HTTP requests, Serverless Containers can host it.
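
For illustration, here is a minimal sketch of the kind of HTTP service that can be containerized and deployed; the `PORT` environment variable and the default port are assumptions for the example, so adapt them to your own container configuration.

```python
# app.py — a minimal HTTP service, runnable as-is and easy to containerize.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to any GET request with a small plain-text payload.
        body = b"Hello from a containerized app\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # The PORT variable name and default are assumptions for this example.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("0.0.0.0", port), Handler).serve_forever()
```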

## What are the cost benefits of using serverless services like Serverless Containers?

With serverless, you only pay for the compute resources you actually use. There are no upfront provisioning costs or paying for idle capacity. When your application traffic is low, the cost scales down, and when traffic spikes, the platform automatically scales up, ensuring you never overpay for unused resources.

## Do updates of Serverless Containers cause downtime?

No. Deploying a new version of your Serverless Container triggers a **rolling update**, which means the new version of the service is gradually
rolled out to your users without downtime. Here is how it works:

* When a new version of your container is deployed, the platform automatically starts routing traffic to the new version incrementally, while still serving requests from the old version until the new one is fully deployed.
* Once the new version is running successfully, the platform gradually shifts all traffic to it, ensuring zero downtime.
* The old version is decommissioned once the new version is fully serving traffic.

This process ensures a seamless update experience, minimizing disruption to users during deployments. If needed, you can also manage traffic splitting between versions during the update process, allowing you to test a new version with a subset of traffic before fully migrating to it.

## Can I update Serverless Container resources (vCPU and RAM) at any time?

Yes, Serverless Containers resources can be changed at any time without causing downtime (see the previous question).

## How does scaling work in these serverless services?

Scaling in Serverless Containers and Serverless Functions is handled automatically by the platform. When demand increases—more requests or events—the platform spins up additional instances to handle the load. When demand decreases, instances spin down. This ensures optimal performance without manual intervention.

## How do I integrate my serverless solutions with other Scaleway services?

Integration is straightforward. Serverless Functions and Containers can be triggered by events from Queues, SQS, and SNS, and can easily communicate with services like Managed Databases or Serverless Databases. Serverless Jobs can pull data from Object Storage or output processed results into a database. With managed connectors, APIs, and built-in integrations, linking to the broader Scaleway ecosystem is seamless.
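
As a hedged sketch of what such an integration can look like, the example below publishes a message to an SQS-compatible queue with `boto3`; the endpoint URL and environment variable names are assumptions for illustration and should be replaced with the values from your own Queues credentials.

```python
import os
import boto3

# Endpoint and credential variable names are assumptions for illustration;
# take the real values from your Scaleway Queues credentials.
sqs = boto3.client(
    "sqs",
    endpoint_url=os.environ.get("QUEUE_ENDPOINT", "https://sqs.mnq.fr-par.scaleway.com"),
    region_name="fr-par",
    aws_access_key_id=os.environ["QUEUE_ACCESS_KEY"],
    aws_secret_access_key=os.environ["QUEUE_SECRET_KEY"],
)

def publish(queue_url: str, payload: str) -> None:
    # Push a message that another function, container, or job can consume.
    sqs.send_message(QueueUrl=queue_url, MessageBody=payload)

if __name__ == "__main__":
    publish(os.environ["QUEUE_URL"], '{"task": "resize", "object": "images/cat.png"}')
```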

## Can I migrate existing applications to Serverless Containers?

Yes. Many traditional applications can be containerized and deployed to Serverless Containers. This makes it easier to modernize legacy systems without a complete rewrite. By moving to a serverless platform, you gain automatic scaling, reduced operational overhead, and a simpler infrastructure management experience.

## Are applications deployed on Serverless Containers stateless?

Yes, all applications deployed on Serverless Containers are stateless. This means the server does not store any state about the client session. Instead, the session data is stored on the client and passed to the server as needed.
@@ -143,4 +186,28 @@ solutions like Scaleway Object Storage.
## Why does my container have an instance running after deployment, even with min-scale 0?

Currently, a new container instance will always start after each deployment, even if there is no traffic and the minimum
scale is set to 0. This behavior is not configurable at this time.

## How can I store data in my Serverless resource?

Serverless resources are [stateless](/serverless/containers/concepts/#stateless) by default, and their local storage is ephemeral.

In many use cases, such as saving analysis results or exporting data, it is important to persist data. Serverless resources can be connected to other resources from the Scaleway ecosystem. A minimal Object Storage example is shown after the lists below.

### Databases

* **Serverless Databases**: Go full serverless and take the complexity out of PostgreSQL database operations.
* **Managed MySQL / PostgreSQL**: Ensure scalability of your infrastructure and storage with our new generation of Managed Databases designed to scale on-demand according to your needs.
* **Managed Database for Redis®**: Fully managed Redis® in seconds.
* **Managed MongoDB®**: Get the best of MongoDB® and Scaleway in one database.

### Storage

* **Object storage**: Multi-AZ resilient object storage service ensuring high availability for your data.
* **Scaleway Glacier**: Our outstanding Cold Storage class to secure long-term object storage. Ideal for deep archived data.
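
As mentioned above, here is a minimal sketch of persisting data from a Serverless Container to Object Storage through its S3-compatible API using `boto3`; the endpoint, bucket name, and environment variable names are assumptions for illustration.

```python
import os
import boto3

# The endpoint, bucket, and environment variable names are illustrative
# assumptions; replace them with your own region and credentials.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.fr-par.scw.cloud",
    region_name="fr-par",
    aws_access_key_id=os.environ["SCW_ACCESS_KEY"],
    aws_secret_access_key=os.environ["SCW_SECRET_KEY"],
)

def save_result(bucket: str, key: str, data: bytes) -> None:
    # Persist data outside the ephemeral local storage of the instance.
    s3.put_object(Bucket=bucket, Key=key, Body=data)

save_result("my-results-bucket", "reports/2024-01-01.json", b'{"status": "ok"}')
```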

<Message type="tip">
Explore all Scaleway products in the console and select the right one for your use case.

Some products are not listed above. For example, Secret Manager can help you store information that requires versioning for specific use cases.
</Message>
59 changes: 59 additions & 0 deletions faq/serverless-functions.mdx
@@ -10,6 +10,41 @@ category: serverless
productIcon: FunctionsProductIcon
---

## What is serverless computing and how does it differ from traditional cloud hosting?

Serverless computing is a cloud execution model where the cloud provider dynamically manages the allocation of compute resources. Unlike traditional hosting models, you don’t need to provision, scale, or maintain servers. Instead, you focus solely on writing and deploying your code, and the infrastructure scales automatically to meet demand.

## Why consider using Serverless Containers, Functions or Jobs for my projects?

These services allow you to build highly scalable, event-driven, and pay-as-you-go solutions. Serverless Containers and Functions help you create applications and microservices without worrying about server management, while Serverless Jobs let you run large-scale, parallel batch processing tasks efficiently. This can lead to faster development cycles, reduced operational overhead, and cost savings.

## What are the cost benefits of using serverless services like Serverless Containers and Serverless Functions?

With serverless, you only pay for the compute resources you actually use. There are no upfront provisioning costs or paying for idle capacity. When your application traffic is low, the cost scales down, and when traffic spikes, the platform automatically scales up, ensuring you never overpay for unused resources.

## How does scaling work in these serverless services?

Scaling in Serverless Containers and Serverless Functions is handled automatically by the platform. When demand increases—more requests or events—the platform spins up additional instances to handle the load. When demand decreases, instances spin down. This ensures optimal performance without manual intervention.

## Do updates of Serverless Functions cause downtime?

No. Deploying a new version of your Serverless Function triggers a **rolling update**, which means the new version of the service is gradually
rolled out to your users without downtime. Here is how it works:

* When a new version of your function is deployed, the platform automatically starts routing traffic to the new version incrementally, while still serving requests from the old version until the new one is fully deployed.
* Once the new version is running successfully, the platform gradually shifts all traffic to it, ensuring zero downtime.
* The old version is decommissioned once the new version is fully serving traffic.

This process ensures a seamless update experience, minimizing disruption to users during deployments. If needed, you can also manage traffic splitting between versions during the update process, allowing you to test a new version with a subset of traffic before fully migrating to it.

## Can I update Serverless Function resources (vCPU and RAM) at any time?

Yes, Serverless Functions resources can be changed at any time without causing downtime (see the previous question).

## How do I integrate my serverless solutions with other Scaleway services?

Integration is straightforward. Serverless Functions and Containers can be triggered by events from Queues, SQS, and SNS, and can easily communicate with services like Managed Databases or Serverless Databases. Serverless Jobs can pull data from Object Storage or output processed results into a database. With managed connectors, APIs, and built-in integrations, linking to the broader Scaleway ecosystem is seamless.
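
As an illustrative sketch, the handler below shows the general shape of a Python function that processes an incoming event (for example, a queue message) and returns an HTTP-style response; the exact event structure for queue-triggered invocations is an assumption here, so inspect a real event before relying on specific fields.

```python
import json

def handle(event, context):
    # The handler signature follows the common pattern for Python Serverless
    # Functions; the shape of `event` for queue-triggered invocations is an
    # assumption in this sketch.
    try:
        payload = json.loads(event.get("body") or "{}")
    except (TypeError, ValueError):
        payload = {}

    # ... process the message, e.g. write the result to a database ...

    return {
        "statusCode": 200,
        "body": json.dumps({"received": payload}),
    }
```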

## How am I billed for Serverless Functions?

### Principle
@@ -184,3 +219,27 @@ solutions like Scaleway Object Storage.

Currently, a new function instance will always start after each deployment, even if there is no traffic and the minimum
scale is set to 0. This behavior is not configurable at this time.

## How can I store data in my Serverless resource?

Serverless resources are [stateless](/serverless/functions/concepts/#stateless) by default, and their local storage is ephemeral.

In many use cases, such as saving analysis results or exporting data, it is important to persist data. Serverless resources can be connected to other resources from the Scaleway ecosystem. A minimal database connection example is shown after the lists below.

### Databases

* **Serverless Databases**: Go full serverless and take the complexity out of PostgreSQL database operations.
* **Managed MySQL / PostgreSQL**: Ensure scalability of your infrastructure and storage with our new generation of Managed Databases designed to scale on-demand according to your needs.
* **Managed Database for Redis®**: Fully managed Redis® in seconds.
* **Managed MongoDB®**: Get the best of MongoDB® and Scaleway in one database.

### Storage

* **Object storage**: Multi-AZ resilient object storage service ensuring high availability for your data.
* **Scaleway Glacier**: Our outstanding Cold Storage class to secure long-term object storage. Ideal for deep archived data.
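
As referenced above, here is a minimal sketch of a function writing to a Managed or Serverless PostgreSQL database; the driver choice (`psycopg2`), table name, and environment variable names are assumptions for illustration, with credentials ideally injected through environment variables or Secret Manager.

```python
import os
import psycopg2  # any PostgreSQL driver works; psycopg2 is used here as an example

def save_event(payload: str) -> None:
    # Connection parameters come from environment variables (or Secret Manager);
    # the variable names below are illustrative assumptions.
    conn = psycopg2.connect(
        host=os.environ["DB_HOST"],
        port=int(os.environ.get("DB_PORT", "5432")),
        dbname=os.environ["DB_NAME"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
        sslmode="require",
    )
    try:
        # `with conn` commits the transaction on success.
        with conn, conn.cursor() as cur:
            cur.execute("INSERT INTO events (payload) VALUES (%s)", (payload,))
    finally:
        conn.close()
```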

<Message type="tip">
Explore all Scaleway products in the console and select the right one for your use case.

Some products are not listed above. For example, Secret Manager can help you store information that requires versioning for specific use cases.
</Message>
46 changes: 44 additions & 2 deletions faq/serverless-jobs.mdx
@@ -10,9 +10,27 @@ category: serverless
productIcon: ServerlessJobsProductIcon
---

## What is serverless computing and how does it differ from traditional cloud hosting?

Serverless computing is a cloud execution model where the cloud provider dynamically manages the allocation of compute resources. Unlike traditional hosting models, you don’t need to provision, scale, or maintain servers. Instead, you focus solely on writing and deploying your code, and the infrastructure scales automatically to meet demand.

## Why consider using Serverless Containers, Functions or Jobs for my projects?

These services allow you to build highly scalable, event-driven, and pay-as-you-go solutions. Serverless Containers and Functions help you create applications and microservices without worrying about server management, while Serverless Jobs let you run large-scale, parallel batch processing tasks efficiently. This can lead to faster development cycles, reduced operational overhead, and cost savings.

## What is Serverless Jobs, and when should I use it?

Serverless Jobs allows you to run large-scale batch processing and computational workloads in a fully managed environment. If you have tasks like data processing, machine learning training jobs, simulations, or large-scale analytics that can be parallelized, Serverless Jobs helps you orchestrate and manage those workloads seamlessly.

## How do I integrate my serverless solutions with other Scaleway services?

Integration is straightforward. Serverless Functions and Containers can be triggered by events from Queues, SQS, and SNS, and can easily communicate with services like Managed Databases or Serverless Databases. Serverless Jobs can pull data from Object Storage or output processed results into a database. With managed connectors, APIs, and built-in integrations, linking to the broader Scaleway ecosystem is seamless.
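
As a hedged sketch of such a pipeline, the job script below downloads an input object from Object Storage through its S3-compatible API, processes it, and uploads the result; the endpoint, bucket, and environment variable names are assumptions for illustration.

```python
import os
import boto3

# A self-contained job script: download an input object, process it, and
# upload the result. Endpoint and variable names are illustrative assumptions.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.fr-par.scw.cloud",
    region_name="fr-par",
    aws_access_key_id=os.environ["SCW_ACCESS_KEY"],
    aws_secret_access_key=os.environ["SCW_SECRET_KEY"],
)

def main() -> None:
    bucket = os.environ["JOB_BUCKET"]
    key = os.environ["JOB_INPUT_KEY"]

    # Pull the input data for this run.
    raw = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

    # Example processing step: count lines in the input file.
    line_count = raw.decode("utf-8").count("\n")

    # Push the processed result back to Object Storage.
    s3.put_object(
        Bucket=bucket,
        Key=f"results/{key}.count",
        Body=str(line_count).encode("utf-8"),
    )

if __name__ == "__main__":
    main()
```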

## Can I update Serverless Jobs resources (vCPU and RAM) at any time?

Yes, the resources of your Job Definition can be updated at any time.

Ongoing Job Runs keep using the resources defined when they started.

## How am I billed for Serverless Jobs?

@@ -124,3 +142,27 @@ To add network restrictions on your resource, consult the [list of prefixes used
## Can I securely use sensitive information with Serverless Jobs?

Yes, you can use sensitive data such as API secret keys, passwords, TLS/SSL certificates, or tokens. Serverless Jobs seamlessly integrates with [Secret Manager](/identity-and-access-management/secret-manager/), which allows you to securely reference sensitive information within your jobs. Refer to the [dedicated documentation](/serverless/jobs/how-to/reference-secret-in-job/) for more information.

## How can I store data in my Serverless resource?

Serverless resources are [stateless](/serverless/functions/concepts/#stateless) by default, and their local storage is ephemeral.

In many use cases, such as saving analysis results or exporting data, it is important to persist data. Serverless resources can be connected to other resources from the Scaleway ecosystem.

### Databases

* **Serverless Databases**: Go full serverless and take the complexity out of PostgreSQL database operations.
* **Managed MySQL / PostgreSQL**: Ensure scalability of your infrastructure and storage with our new generation of Managed Databases designed to scale on-demand according to your needs.
* **Managed Database for Redis®**: Fully managed Redis® in seconds.
* **Managed MongoDB®**: Get the best of MongoDB® and Scaleway in one database.

### Storage

* **Object storage**: Multi-AZ resilient object storage service ensuring high availability for your data.
* **Scaleway Glacier**: Our outstanding Cold Storage class to secure long-term object storage. Ideal for deep archived data.

<Message type="tip">
Explore all Scaleway products in the console and select the right one for your use case.

Some products are not listed above. For example, Secret Manager can help you store information that requires versioning for specific use cases.
</Message>