From c960132c84b9eec74cd4e4a936b437679082b61e Mon Sep 17 00:00:00 2001
From: fpagny
Date: Tue, 17 Jun 2025 15:11:52 +0200
Subject: [PATCH 1/2] feat(genapi): add devstral model

---
 .../reference-content/model-catalog.mdx | 12 ++++++++++++
 1 file changed, 12 insertions(+)

diff --git a/pages/managed-inference/reference-content/model-catalog.mdx b/pages/managed-inference/reference-content/model-catalog.mdx
index b17a625c86..5d12b42548 100644
--- a/pages/managed-inference/reference-content/model-catalog.mdx
+++ b/pages/managed-inference/reference-content/model-catalog.mdx
@@ -35,6 +35,7 @@ A quick overview of available models in Scaleway's catalog and their core attrib
 | [`mistral-small-24b-instruct-2501`](#mistral-small-24b-instruct-2501) | Mistral | 32k | Text | L40S (20k), H100, H100-2 | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) |
 | [`mistral-nemo-instruct-2407`](#mistral-nemo-instruct-2407) | Mistral | 128k | Text | L40S, H100, H100-2 | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) |
 | [`mixtral-8x7b-instruct-v0.1`](#mixtral-8x7b-instruct-v01) | Mistral | 32k | Text | H100-2 | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) |
+| [`devstral-small-2505`](#devstral-small-2505) | Mistral | 128k | Text | H100, H100-2 | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) |
 | [`moshiko-0.1-8b`](#moshiko-01-8b) | Kyutai | 4k | Audio to Audio | L4, H100 | [CC-BY-4.0](https://huggingface.co/datasets/choosealicense/licenses/blob/main/markdown/cc-by-4.0.md) |
 | [`moshika-0.1-8b`](#moshika-01-8b) | Kyutai | 4k | Audio to Audio| L4, H100 | [CC-BY-4.0](https://huggingface.co/datasets/choosealicense/licenses/blob/main/markdown/cc-by-4.0.md) |
 | [`pixtral-12b-2409`](#pixtral-12b-2409) | Mistral | 128k | Text, Vision | L40S (50k), H100, H100-2 | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) |
@@ -61,6 +62,7 @@ A quick overview of available models in Scaleway's catalog and their core attrib
 | `mistral-small-24b-instruct-2501` | Yes | Yes | English, French, German, Dutch, Spanish, Italian, Polish, Portuguese, Chinese, Japanese, Korean |
 | `mistral-nemo-instruct-2407` | Yes | Yes | English, French, German, Spanish, Italian, Portuguese, Russian, Chinese, Japanese |
 | `mixtral-8x7b-instruct-v0.1` | Yes | No | English, French, German, Italian, Spanish |
+| `devstral-small-2505` | Yes | Yes | English, French, German, Spanish, Portuguese, Italian, Japanese, Korean, Russian, Chinese, Arabic, Persian, Indonesian, Malay, Nepali, Polish, Romanian, Serbian, Swedish, Turkish, Ukrainian, Vietnamese, Hindi, Bengali |
 | `moshiko-0.1-8b` | No | No | English |
 | `moshika-0.1-8b` | No | No | English |
 | `pixtral-12b-2409` | Yes | Yes | English |
@@ -252,6 +254,16 @@ It was trained on a large proportion of multilingual and code data.
 mistral/mistral-nemo-instruct-2407:fp8
 ```
 
+### Devstral-small-2505
+Devstral Small is a fine-tune of Mistral Small 3.1, optimized to perform software engineering tasks.
+It is a good fit to be used as a coding agent, for instance in an IDE.
+
+#### Model name
+```
+mistral/devstral-small-2505:fp8
+mistral/devstral-small-2505:bf16
+```
+
 ### Moshiko-0.1-8b
 Kyutai's Moshi is a speech-text foundation model for real-time dialogue.
 Moshi is an experimental next-generation conversational model, designed to understand and respond fluidly and naturally to complex conversations, while providing unprecedented expressiveness and spontaneity.
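The deployment names added above (`mistral/devstral-small-2505:fp8` and `:bf16`) are the catalog identifiers only; the sketch below shows how such a deployment might be queried once it is running. It assumes the deployment exposes an OpenAI-compatible chat completions endpoint, and the base URL, API key, and model identifier are placeholders to replace with your own deployment's values, not values defined by this patch.

```python
# Hedged usage sketch for a devstral-small-2505 deployment, assuming an
# OpenAI-compatible chat completions endpoint. Base URL, API key, and model
# identifier are placeholders, not values defined by this patch.
from openai import OpenAI

client = OpenAI(
    base_url="https://<your-inference-endpoint>/v1",  # placeholder: your deployment's endpoint URL
    api_key="<your-secret-key>",                      # placeholder: your API key
)

response = client.chat.completions.create(
    model="devstral-small-2505",  # assumed identifier; check what your deployment actually reports
    messages=[
        {"role": "user", "content": "Refactor this function to remove the nested loops."},
    ],
)
print(response.choices[0].message.content)
```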
From a447fa1f8e09a7dce6c1b3e4a5be1d4d91ca3c58 Mon Sep 17 00:00:00 2001
From: fpagny
Date: Tue, 17 Jun 2025 16:07:48 +0200
Subject: [PATCH 2/2] feat(inference): add magistral to model catalog

---
 .../reference-content/model-catalog.mdx | 12 ++++++++++++
 1 file changed, 12 insertions(+)

diff --git a/pages/managed-inference/reference-content/model-catalog.mdx b/pages/managed-inference/reference-content/model-catalog.mdx
index 5d12b42548..c2da58a7e7 100644
--- a/pages/managed-inference/reference-content/model-catalog.mdx
+++ b/pages/managed-inference/reference-content/model-catalog.mdx
@@ -35,6 +35,7 @@ A quick overview of available models in Scaleway's catalog and their core attrib
 | [`mistral-small-24b-instruct-2501`](#mistral-small-24b-instruct-2501) | Mistral | 32k | Text | L40S (20k), H100, H100-2 | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) |
 | [`mistral-nemo-instruct-2407`](#mistral-nemo-instruct-2407) | Mistral | 128k | Text | L40S, H100, H100-2 | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) |
 | [`mixtral-8x7b-instruct-v0.1`](#mixtral-8x7b-instruct-v01) | Mistral | 32k | Text | H100-2 | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) |
+| [`magistral-small-2506`](#magistral-small-2506) | Mistral | 32k | Text | L40S, H100, H100-2 | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) |
 | [`devstral-small-2505`](#devstral-small-2505) | Mistral | 128k | Text | H100, H100-2 | [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) |
 | [`moshiko-0.1-8b`](#moshiko-01-8b) | Kyutai | 4k | Audio to Audio | L4, H100 | [CC-BY-4.0](https://huggingface.co/datasets/choosealicense/licenses/blob/main/markdown/cc-by-4.0.md) |
 | [`moshika-0.1-8b`](#moshika-01-8b) | Kyutai | 4k | Audio to Audio| L4, H100 | [CC-BY-4.0](https://huggingface.co/datasets/choosealicense/licenses/blob/main/markdown/cc-by-4.0.md) |
@@ -62,6 +63,7 @@ A quick overview of available models in Scaleway's catalog and their core attrib
 | `mistral-small-24b-instruct-2501` | Yes | Yes | English, French, German, Dutch, Spanish, Italian, Polish, Portuguese, Chinese, Japanese, Korean |
 | `mistral-nemo-instruct-2407` | Yes | Yes | English, French, German, Spanish, Italian, Portuguese, Russian, Chinese, Japanese |
 | `mixtral-8x7b-instruct-v0.1` | Yes | No | English, French, German, Italian, Spanish |
+| `magistral-small-2506` | Yes | Yes | English, French, German, Spanish, Portuguese, Italian, Japanese, Korean, Russian, Chinese, Arabic, Persian, Indonesian, Malay, Nepali, Polish, Romanian, Serbian, Swedish, Turkish, Ukrainian, Vietnamese, Hindi, Bengali |
 | `devstral-small-2505` | Yes | Yes | English, French, German, Spanish, Portuguese, Italian, Japanese, Korean, Russian, Chinese, Arabic, Persian, Indonesian, Malay, Nepali, Polish, Romanian, Serbian, Swedish, Turkish, Ukrainian, Vietnamese, Hindi, Bengali |
 | `moshiko-0.1-8b` | No | No | English |
 | `moshika-0.1-8b` | No | No | English |
@@ -254,6 +256,16 @@ It was trained on a large proportion of multilingual and code data.
 mistral/mistral-nemo-instruct-2407:fp8
 ```
 
+### Magistral-small-2506
+Magistral Small is a reasoning model, optimized to perform well on reasoning tasks such as academic or scientific questions.
+It is well suited for complex tasks requiring multiple reasoning steps.
+
+#### Model name
+```
+mistral/magistral-small-2506:fp8
+mistral/magistral-small-2506:bf16
+```
+
 ### Devstral-small-2505
 Devstral Small is a fine-tune of Mistral Small 3.1, optimized to perform software engineering tasks.
 It is a good fit to be used as a coding agent, for instance in an IDE.
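The `magistral-small-2506` entry added in the second patch is a reasoning model, so a multi-step question is a more representative prompt. The sketch below follows the same assumptions as the previous one: an OpenAI-compatible endpoint with placeholder URL, key, and model identifier that are not defined by this patch.

```python
# Hedged sketch for a magistral-small-2506 (reasoning model) deployment, under the
# same assumptions as the devstral example: OpenAI-compatible endpoint, placeholder values.
from openai import OpenAI

client = OpenAI(
    base_url="https://<your-inference-endpoint>/v1",  # placeholder: your deployment's endpoint URL
    api_key="<your-secret-key>",                      # placeholder: your API key
)

response = client.chat.completions.create(
    model="magistral-small-2506",  # assumed identifier; check what your deployment actually reports
    messages=[
        {"role": "user", "content": "A train leaves at 9:40 and arrives at 13:05. How long is the trip, and what is its average speed over 180 km?"},
    ],
    max_tokens=1024,  # reasoning models tend to produce longer, step-by-step answers
)
print(response.choices[0].message.content)
```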