Commit 4dc28b7

replace 'image' attribute by 'model' in 'models' definitions (#22992)
## Description

Fix the Compose models examples, which wrongly reference `image` instead of `model` to define the LLM to use.

## Related issues or tickets

N/A

## Reviews

- [ ] Technical review
- [x] Editorial review
- [ ] Product review

Signed-off-by: Guillaume Lours <[email protected]>
2 parents 3abd7a0 + 492a087 commit 4dc28b7
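
For context, every hunk below converges on the same shape: in the top-level `models` element, the model reference belongs under the `model` attribute, not `image`. A minimal sketch of the corrected form (the `my-chat-app` service and its short-syntax `models` list are illustrative, not part of this diff):

```yaml
services:
  my-chat-app:
    build: .
    models:
      - smollm2        # short syntax: reference a model defined below

models:
  smollm2:
    model: ai/smollm2  # was wrongly written as `image: ai/smollm2`
```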

File tree

2 files changed (+11 −11 lines)

content/manuals/ai/compose/model-runner.md

Lines changed: 2 additions & 2 deletions
````diff
@@ -40,7 +40,7 @@ services:
 
 models:
   smollm2:
-    image: ai/smollm2
+    model: ai/smollm2
 ```
 
 ### How it works
@@ -70,7 +70,7 @@ services:
 
 models:
   smollm2:
-    image: ai/smollm2
+    model: ai/smollm2
 ```
 
 With this configuration, your `my-chat-app` service will receive:
````

content/manuals/ai/compose/models-and-compose.md

Lines changed: 9 additions & 9 deletions
````diff
@@ -42,7 +42,7 @@ services:
 
 models:
   llm:
-    image: ai/smollm2
+    model: ai/smollm2
 ```
 
 This example defines:
@@ -56,7 +56,7 @@ Models support various configuration options:
 ```yaml
 models:
   llm:
-    image: ai/smollm2
+    model: ai/smollm2
     context_size: 1024
     runtime_flags:
       - "--a-flag"
@@ -87,9 +87,9 @@ services:
 
 models:
   llm:
-    image: ai/smollm2
+    model: ai/smollm2
   embedding-model:
-    image: ai/all-minilm
+    model: ai/all-minilm
 ```
 
 With short syntax, the platform automatically generates environment variables based on the model name:
@@ -116,9 +116,9 @@ services:
 
 models:
   llm:
-    image: ai/smollm2
+    model: ai/smollm2
   embedding-model:
-    image: ai/all-minilm
+    model: ai/all-minilm
 ```
 
 With this configuration, your service receives:
@@ -142,7 +142,7 @@ services:
 
 models:
   llm:
-    image: ai/smollm2
+    model: ai/smollm2
 ```
 
 Docker Model Runner will:
@@ -163,9 +163,9 @@ services:
 
 models:
   llm:
-    image: ai/smollm2
+    model: ai/smollm2
     # Cloud-specific configurations
-    labels:
+    x-cloud-options:
       - "cloud.instance-type=gpu-small"
      - "cloud.region=us-west-2"
 ```
````
