
Commit f889980

docs(mif): added guideflow and updated some pages (#5302)
1 parent f68bf2b commit f889980

3 files changed: 48 additions & 35 deletions
Lines changed: 12 additions & 12 deletions

@@ -1,9 +1,9 @@
 ---
 title: How to deploy a model on Scaleway Managed Inference
-description: This page explains how to deploy a model on Scaleway Managed Inference
-tags: managed-inference ai-data creating dedicated
+description: This page explains how to deploy a Managed Inference model on the Scaleway console.
+tags: managed-inference ai-data
 dates:
-  validation: 2025-04-09
+  validation: 2025-07-21
   posted: 2024-03-06
 ---
 import Requirements from '@macros/iam/requirements.mdx'
@@ -23,24 +23,24 @@ import Requirements from '@macros/iam/requirements.mdx'
 Scaleway Managed Inference allows you to deploy various AI models, either from the Scaleway catalog or by importing a custom model. For detailed information about supported models, visit our [Supported models in Managed Inference](/managed-inference/reference-content/supported-models/) documentation.
 </Message>
 <Message type="note">
-Some models may require acceptance of an end-user license agreement. If prompted, review the terms and conditions and accept the license accordingly.
+Some models may require acceptance of an end-user license agreement (EULA). If prompted, review the terms and conditions and accept the license accordingly.
 </Message>
 - Choose the geographical **region** for the deployment.
 - For custom models: Choose the model quantization.
 <Message type="tip">
 Each model comes with a default quantization. Select lower bits quantization to improve performance and enable the model to run on smaller GPU nodes, while potentially reducing precision.
 </Message>
-- Specify the GPU Instance type to be used with your deployment.
-5. Choose the number of nodes for your deployment. Note that this feature is currently in [Public Beta](https://www.scaleway.com/betas/).
-<Message type="note">
-High availability is only guaranteed with two or more nodes.
-</Message>
-6. Enter a **name** for the deployment, and optional tags.
-7. Configure the **network connectivity** settings for the deployment:
+- Select a node type, the GPU Instance that will be used with your deployment.
+- Choose the number of nodes for your deployment. Note that this feature is currently in [Public Beta](https://www.scaleway.com/betas/).
+<Message type="tip">
+High availability is only guaranteed with two or more nodes.
+</Message>
+5. Enter a **name** for the deployment, and optional tags.
+6. Configure the **network connectivity** settings for the deployment:
 - Attach to a **Private Network** for secure communication and restricted availability. Choose an existing Private Network from the drop-down list, or create a new one.
 - Set up **Public connectivity** to access resources via the public internet. Authentication by API key is enabled by default.
 <Message type="important">
 - Enabling both private and public connectivity will result in two distinct endpoints (public and private) for your deployment.
 - Deployments must have at least one endpoint, either public or private.
 </Message>
-8. Click **Deploy model** to launch the deployment process. Once the model is ready, it will be listed among your deployments.
+7. Click **Deploy model** to launch the deployment process. Once the model is ready, it will be listed among your deployments.
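The steps above leave you with a deployment that exposes a public and/or private endpoint protected by an API key, and the quickstart touched by this same commit presents Managed Inference as a drop-in replacement for the OpenAI APIs. As a quick post-deployment check, a call along the following lines should work; this is a minimal sketch, not an official snippet, and the endpoint URL, environment variable names, and model name are placeholders to swap for your deployment's own values.

```python
# Minimal sketch (not an official snippet): query a Managed Inference deployment
# through its OpenAI-compatible endpoint. Placeholders/assumptions: the endpoint
# URL, the environment variable names, and the model name.
import os

from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url=os.environ["INFERENCE_ENDPOINT_URL"],  # assumption: your deployment's endpoint URL, e.g. "<endpoint>/v1"
    api_key=os.environ["SCW_SECRET_KEY"],           # assumption: the API key generated for the deployment
)

# Send a single chat message to the deployed model and print its reply.
response = client.chat.completions.create(
    model="<your-model-name>",  # placeholder: the model served by your deployment
    messages=[{"role": "user", "content": "Hello! Are you up and running?"}],
)
print(response.choices[0].message.content)
```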
Lines changed: 11 additions & 9 deletions

@@ -1,17 +1,17 @@
 ---
 title: How to delete a Managed Inference deployment
-description: This page explains how to delete a Managed Inference deployment
-tags: managed-inference ai-data deleting
+description: This page explains how to delete a Managed Inference deployment via the Scaleway console.
+tags: managed-inference ai-data delete
 dates:
-  validation: 2025-03-19
+  validation: 2025-07-21
   posted: 2024-03-06
 categories:
   - ai-data
 ---
 import Requirements from '@macros/iam/requirements.mdx'


-Once you have finished your inference tasks you can delete your deployment. This page explains how to do so from the Scaleway console.
+Once you have finished your inference tasks, you can delete your deployment. This page explains how to do so from the Scaleway console.

 <Requirements />

@@ -21,11 +21,13 @@ Once you have finished your inference tasks you can delete your deployment. This

 1. Click **Managed Inference** in the **AI** section of the [Scaleway console](https://console.scaleway.com) side menu. A list of your deployments displays.
 2. From the drop-down menu, select the geographical region you want to manage.
-3. Choose a deployment either by clicking its name or selecting **More info** from the drop-down menu represented by the icon <Icon name="more" /> to access the deployment dashboard.
-4. Click the **Settings** tab of your deployment to display additional settings.
-5. Click **Delete deployment**.
-6. Type **DELETE** to confirm and click **Delete deployment** to delete your deployment.
+3. Choose a deployment by clicking its name. The deployment's **Overview** page displays.
+4. Navigate to the **Settings** tab.
+5. Click **Delete deployment** at the bottom of the page.
+6. Type **DELETE** to confirm and click **Delete deployment**.
+
+Alternatively, from the Deployments listing, click the <Icon name="more" /> icon next to the deployment name you no longer need, and click **Delete**. A pop-up appears. Type **DELETE** to confirm, then click **Delete deployment**.

 <Message type="important">
-Deleting a deployment is a permanent action and will erase all its associated data.
+Deleting a deployment is a permanent action that erases all its associated data and resources.
 </Message>

pages/managed-inference/quickstart.mdx

Lines changed: 25 additions & 14 deletions

@@ -3,7 +3,7 @@ title: Managed Inference - Quickstart
 description: Start with Scaleway Managed Inference for secure, scalable AI model deployment in Europe's premier platform. Privacy-focused, fully managed.
 tags:
 dates:
-  validation: 2025-02-24
+  validation: 2025-07-21
 categories:
   - ai-data
 ---
@@ -22,6 +22,11 @@ Here are some of the key features of Scaleway Managed Inference:
 * **Complete data privacy**: [No storage](/managed-inference/reference-content/data-privacy-security-scaleway-ai-services/#data-storage-policies) or third-party access to your data (prompt or responses), to ensure it remains exclusively yours.
 * **Interoperability**: Scaleway Managed Inference was designed as a drop-in [replacement for the OpenAI APIs](/managed-inference/reference-content/openai-compatibility/), for a seamless transition on your applications already using its libraries.

+## Console overview
+
+Discover the Managed Inference interface on the Scaleway console.
+<GuideFlow src="https://app.guideflow.com/embed/0p0ozjebvp"/>
+
 <Requirements />

 - A Scaleway account logged into the [console](https://console.scaleway.com)
@@ -38,11 +43,14 @@ Here are some of the key features of Scaleway Managed Inference:
 Scaleway Managed Inference allows you to deploy various AI models, either from the Scaleway catalog or by importing a custom model. For detailed information about supported models, visit our [Supported models in Managed Inference](/managed-inference/reference-content/supported-models/) documentation.
 </Message>
 <Message type="note">
-Some models may require acceptance of an end-user license agreement. If prompted, review the terms and conditions and accept the license accordingly.
+Some models may require acceptance of an end-user license agreement (EULA). If prompted, review the terms and conditions and accept the license accordingly.
 </Message>
 - Choose the geographical **region** for the deployment.
-- Specify the GPU Instance type to be used with your deployment.
+- Select a node type, the GPU Instance that will be used with your deployment.
 - Choose the number of nodes for your deployment. Note that this feature is currently in [Public Beta](https://www.scaleway.com/betas/).
+<Message type="note">
+High availability is only guaranteed with two or more nodes.
+</Message>
 5. Enter a **name** for the deployment, along with optional tags to aid in organization.
 6. Configure the **network** settings for the deployment:
 - Enable **Private Network** for secure communication and restricted availability within Private Networks. Choose an existing Private Network from the drop-down list, or create a new one.
@@ -59,9 +67,10 @@ Managed Inference deployments have authentication enabled by default. As such, y

 1. Click **Managed Inference** in the **AI** section of the side menu. The Managed Inference dashboard displays.
 2. From the drop-down menu, select the geographical region where you want to manage.
-3. Click <Icon name="more" /> next to the deployment you want to edit. The deployment dashboard displays.
-4. Click **Generate key** in the **Deployment connection** section of the dashboard. The token creation wizard displays.
+3. Click the name of the deployment you wish to access. The deployment's **Overview** page displays.
+4. Scroll down to the **Deployment authentication** section and click the **Generate key** button. The token creation wizard displays.
 5. Fill in the [required information for API key creation](/iam/how-to/create-api-keys/) and click **Generate API key**.
+6. Copy and safely store your credentials before closing the window, as they will not be shown again.

 <Message type="tip">
 You have full control over authentication from the **Security** tab of your deployment. Authentication is enabled by default.
@@ -70,9 +79,9 @@ Managed Inference deployments have authentication enabled by default. As such, y
 ## How to interact with Managed Inference

 1. Click **Managed Inference** in the **AI** section of the side menu. The Managed Inference dashboard displays.
-2. From the drop-down menu, select the geographical region where you want to manage.
-3. Click <Icon name="more" /> next to the deployment you want to edit. The deployment dashboard displays.
-4. Click the **Inference** tab. Code examples in various environments display. Copy and paste them into your code editor or terminal.
+2. From the drop-down menu, select the geographical region where your desired deployment was created.
+3. Click the name of the deployment you wish to edit. The deployment's **Overview** page displays.
+4. Click the **Playground** tab, then **View code** to see code examples in various environments. Copy and paste them into your code editor or terminal.

 <Message type="note">
 Prompt structure may vary from one model to another. Refer to the specific instructions for use in our [dedicated documentation](/managed-inference/reference-content/).
@@ -81,12 +90,14 @@ Managed Inference deployments have authentication enabled by default. As such, y
 ## How to delete a deployment

 1. Click **Managed Inference** in the **AI** section of the [Scaleway console](https://console.scaleway.com) side menu. A list of your deployments displays.
-2. From the drop-down menu, select the geographical region where you want to create your deployment.
-3. Choose a deployment either by clicking its name or selecting **More info** from the drop-down menu represented by the icon <Icon name="more" /> to access the deployment dashboard.
-4. Click the **Settings** tab of your deployment to display additional settings.
-5. Click **Delete deployment**.
-6. Type **DELETE** to confirm and click **Delete deployment** to delete your deployment.
+2. From the drop-down menu, select the geographical region where your deployment was created.
+3. Click the name of the deployment you wish to delete.
+4. Navigate to the **Settings** tab.
+5. Click **Delete deployment** at the bottom of the page.
+6. Type **DELETE** to confirm and click **Delete deployment**.
+
+Alternatively, from the Deployments listing, click the <Icon name="more" /> icon next to the deployment name you no longer need, and click **Delete**. A pop-up appears. Type **DELETE** to confirm, then click **Delete deployment**.

 <Message type="important">
-Deleting a deployment is a permanent action, and will erase all its associated configuration and resources.
+Deleting a deployment is a permanent action that erases all its associated data and resources.
 </Message>
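The quickstart's "How to interact with Managed Inference" step points to the **Playground** tab and its **View code** snippets, which remain the authoritative examples for a given deployment. Purely as an illustration, here is a sketch of the same kind of request made with plain `requests`, assuming the OpenAI-style `/v1/chat/completions` route and Bearer authentication with the API key generated earlier (assumptions based on the documented OpenAI compatibility; endpoint, environment variable names, and model are placeholders):

```python
# Illustrative sketch only: the exact request for your deployment is the one shown
# under Playground > View code in the console. Assumptions: OpenAI-style route and
# Bearer auth with the generated API key; endpoint and model are placeholders.
import os

import requests

endpoint = os.environ["INFERENCE_ENDPOINT_URL"]  # placeholder: your deployment's endpoint URL
api_key = os.environ["SCW_SECRET_KEY"]           # placeholder: API key generated in the console

response = requests.post(
    f"{endpoint}/v1/chat/completions",
    headers={"Authorization": f"Bearer {api_key}"},
    json={
        "model": "<your-model-name>",  # placeholder: the model served by your deployment
        "messages": [{"role": "user", "content": "Summarize what Managed Inference does."}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```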
