
Commit d4bff0c

Merge branch 'release-build-2025-release' of github.com:MicrosoftDocs/azure-ai-docs-pr into sdg-fix-qs
2 parents: 88a0f14 + e8f7405

30 files changed: +1105 −670 lines

articles/ai-foundry/concepts/fine-tuning-overview.md

Lines changed: 99 additions & 23 deletions
Large diffs are not rendered by default.

articles/ai-foundry/concepts/models-featured.md

Lines changed: 9 additions & 0 deletions
@@ -358,6 +358,15 @@ For examples of how to use Stability AI models, see the following examples:
- [Use Requests library with Stable Diffusion 3.5 Large for image to image requests](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/stabilityai/Image_to_Image.ipynb)
- [Example of a fully encoded image generation response](https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/stabilityai/Sample_image_generation_response.txt)
+
+### xAI
+
+**Grok** is a family of models designed...
+
+| Model | Type | Tier | Capabilities |
+| ------ | ---- | --- | ------------ |
+| [Grok]() | | Global standard | - **Input:** text (0 tokens) <br /> - **Output:** text (0 tokens) <br /> - **Languages:** <br /> - **Tool calling:** <br /> - **Response formats:** |
+| [grok-3-mini]() | | Global standard | - **Input:** text (0 tokens) <br /> - **Output:** text (0 tokens) <br /> - **Languages:** <br /> - **Tool calling:** <br /> - **Response formats:** |

## Related content

articles/ai-foundry/foundry-local/get-started.md

Lines changed: 2 additions & 2 deletions
@@ -45,7 +45,7 @@ Get started with Foundry Local quickly:
1. **Run your first model** Open a terminal window and run the following command to run a model:

    ```bash
-   foundry model run deepseek-r1-1.5b
+   foundry model run phi-3.5-mini
    ```

    The model downloads - which can take a few minutes, depending on your internet speed - and the model runs. Once the model is running, you can interact with it using the command line interface (CLI). For example, you can ask:
@@ -59,7 +59,7 @@ Get started with Foundry Local quickly:

> [!TIP]
-> You can replace `deepseek-r1-1.5b` with any model name from the catalog (see `foundry model list` for available models). Foundry Local downloads the model variant that best matches your system's hardware and software configuration. For example, if you have an NVIDIA GPU, it downloads the CUDA version of the model. If you have a Qualcomm NPU, it downloads the NPU variant. If you have no GPU or NPU, it downloads the CPU version.
+> You can replace `phi-3.5-mini` with any model name from the catalog (see `foundry model list` for available models). Foundry Local downloads the model variant that best matches your system's hardware and software configuration. For example, if you have an NVIDIA GPU, it downloads the CUDA version of the model. If you have a Qualcomm NPU, it downloads the NPU variant. If you have no GPU or NPU, it downloads the CPU version.

## Explore commands
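
The quickstart's two CLI steps can also be scripted. The following is a minimal sketch, not part of the article itself: it only wraps the commands shown above (`foundry model list` and `foundry model run phi-3.5-mini`) with Python's standard library.

```python
# Sketch only: wraps the quickstart CLI commands with subprocess.
import subprocess

# Show the catalog, as the tip suggests (`foundry model list`).
subprocess.run(["foundry", "model", "list"], check=True)

# Run the quickstart model; Foundry Local downloads the variant that best
# matches your hardware (CUDA, NPU, or CPU) before starting it.
subprocess.run(["foundry", "model", "run", "phi-3.5-mini"], check=True)
```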

articles/ai-foundry/foundry-local/includes/integrate-examples/javascript.md

Lines changed: 4 additions & 4 deletions
@@ -32,7 +32,7 @@ import { FoundryLocalManager } from "foundry-local-sdk";
// to your end-user's device.
// TIP: You can find a list of available models by running the
// following command in your terminal: `foundry model list`.
-const alias = "deepseek-r1-1.5b";
+const alias = "phi-3.5-mini";

// Create a FoundryLocalManager instance. This will start the Foundry
// Local service if it is not already running.
@@ -83,7 +83,7 @@ import { FoundryLocalManager } from "foundry-local-sdk";
// to your end-user's device.
// TIP: You can find a list of available models by running the
// following command in your terminal: `foundry model list`.
-const alias = "deepseek-r1-1.5b";
+const alias = "phi-3.5-mini";

// Create a FoundryLocalManager instance. This will start the Foundry
// Local service if it is not already running.
@@ -133,7 +133,7 @@ import { FoundryLocalManager } from "foundry-local-sdk";
// to your end-user's device.
// TIP: You can find a list of available models by running the
// following command in your terminal: `foundry model list`.
-const alias = "deepseek-r1-1.5b";
+const alias = "phi-3.5-mini";

// Create a FoundryLocalManager instance. This will start the Foundry
// Local service if it is not already running.
@@ -176,7 +176,7 @@ import { FoundryLocalManager } from "foundry-local-sdk";
// to your end-user's device.
// TIP: You can find a list of available models by running the
// following command in your terminal: `foundry model list`.
-const alias = "deepseek-r1-1.5b";
+const alias = "phi-3.5-mini";

// Create a FoundryLocalManager instance. This will start the Foundry
// Local service if it is not already running.

articles/ai-foundry/foundry-local/includes/integrate-examples/python.md

Lines changed: 3 additions & 3 deletions
@@ -31,7 +31,7 @@ from foundry_local import FoundryLocalManager

# By using an alias, the most suitable model will be downloaded
# to your end-user's device.
-alias = "deepseek-r1-1.5b"
+alias = "phi-3.5-mini"

# Create a FoundryLocalManager instance. This will start the Foundry
# Local service if it is not already running and load the specified model.
@@ -66,7 +66,7 @@ from foundry_local import FoundryLocalManager

# By using an alias, the most suitable model will be downloaded
# to your end-user's device.
-alias = "deepseek-r1-1.5b"
+alias = "phi-3.5-mini"

# Create a FoundryLocalManager instance. This will start the Foundry
# Local service if it is not already running and load the specified model.
@@ -109,7 +109,7 @@ from foundry_local import FoundryLocalManager

# By using an alias, the most suitable model will be downloaded
# to your end-user's device.
-alias = "deepseek-r1-1.5b"
+alias = "phi-3.5-mini"

# Create a FoundryLocalManager instance. This will start the Foundry
# Local service if it is not already running and load the specified model.
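
For context on what these hunks touch: the alias decides which model the manager downloads and loads. The following is a minimal end-to-end sketch, assuming the manager exposes `endpoint`, `api_key`, and `get_model_info()` as in the snippets these hunks come from; confirm the exact surface against the SDK reference.

```python
# Sketch only: assumes FoundryLocalManager(alias) starts the service,
# downloads the aliased model, and exposes endpoint/api_key/get_model_info.
import openai
from foundry_local import FoundryLocalManager

alias = "phi-3.5-mini"
manager = FoundryLocalManager(alias)  # starts the service and loads the model

# Talk to the local OpenAI-compatible endpoint that the manager reports.
client = openai.OpenAI(base_url=manager.endpoint, api_key=manager.api_key)

response = client.chat.completions.create(
    model=manager.get_model_info(alias).id,
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response.choices[0].message.content)
```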

articles/ai-foundry/foundry-local/includes/sdk-reference/javascript.md

Lines changed: 3 additions & 3 deletions
@@ -92,7 +92,7 @@ import { FoundryLocalManager } from "foundry-local-sdk";
// to your end-user's device.
// TIP: You can find a list of available models by running the
// following command in your terminal: `foundry model list`.
-const alias = "deepseek-r1-1.5b";
+const alias = "phi-3.5-mini";

const manager = new FoundryLocalManager()

@@ -141,7 +141,7 @@ import { FoundryLocalManager } from "foundry-local-sdk";
// to your end-user's device.
// TIP: You can find a list of available models by running the
// following command in your terminal: `foundry model list`.
-const alias = "deepseek-r1-1.5b";
+const alias = "phi-3.5-mini";

// Create a FoundryLocalManager instance. This will start the Foundry
// Local service if it is not already running.
@@ -208,7 +208,7 @@ const endpoint = "ENDPOINT"

const manager = new FoundryLocalManager({serviceUrl: endpoint})

-const alias = 'deepseek-r1-1.5b'
+const alias = 'phi-3.5-mini'

// Get all available models
const catalog = await manager.listCatalogModels()

articles/ai-foundry/foundry-local/includes/sdk-reference/python.md

Lines changed: 2 additions & 2 deletions
@@ -86,7 +86,7 @@ from foundry_local import FoundryLocalManager

# By using an alias, the most suitable model will be selected
# to your end-user's device.
-alias = "deepseek-r1-1.5b"
+alias = "phi-3.5-mini"

# Create a FoundryLocalManager instance. This will start the Foundry.
manager = FoundryLocalManager()
@@ -128,7 +128,7 @@ from foundry_local import FoundryLocalManager

# By using an alias, the most suitable model will be downloaded
# to your end-user's device.
-alias = "deepseek-r1-1.5b"
+alias = "phi-3.5-mini"

# Create a FoundryLocalManager instance. This will start the Foundry
# Local service if it is not already running and load the specified model.
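
The JavaScript reference above shows the catalog surface (`listCatalogModels`). A corresponding Python sketch follows, under the assumption that the Python manager mirrors those calls with snake_case names; confirm the exact names in the published SDK reference before relying on them.

```python
# Sketch only: method names assume the Python SDK mirrors the JavaScript
# calls shown earlier (listCatalogModels -> list_catalog_models, and so on).
from foundry_local import FoundryLocalManager

alias = "phi-3.5-mini"
manager = FoundryLocalManager()  # starts the service if it is not running

catalog = manager.list_catalog_models()   # all models in the catalog
print([m.alias for m in catalog])

manager.download_model(alias)             # fetch the best variant for this device
manager.load_model(alias)                 # load it into the running service
print(manager.list_loaded_models())       # confirm the model is loaded
```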
(image file changed: −61.7 KB; preview not captured)

articles/ai-foundry/foundry-local/reference/reference-rest.md

Lines changed: 2 additions & 2 deletions
@@ -313,7 +313,7 @@ Retrieves all available models, including both local models and registered exter…
- Response body
  ```json
-  ["Phi-4-mini-instruct-generic-cpu", " deepseek-r1-distill-qwen-7b-generic-cpu"]
+  ["Phi-4-mini-instruct-generic-cpu", "phi-3.5-mini-instruct-generic-cpu"]
  ```

### GET /openai/load/{name}
@@ -395,7 +395,7 @@ Retrieves a list of currently loaded models.
- Response body
  ```json
-  ["Phi-4-mini-instruct-generic-cpu", " deepseek-r1-distill-qwen-7b-generic-cpu"]
+  ["Phi-4-mini-instruct-generic-cpu", "phi-3.5-mini-instruct-generic-cpu"]
  ```

### GET /openai/getgpudevice
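
Both list endpoints return a plain JSON array of model names, so consuming them from a client is straightforward. Here is a sketch with the `requests` library; the service URL and endpoint path are placeholders to be filled in from this reference.

```python
# Sketch only: BASE_URL and LIST_PATH are placeholders. Substitute your
# Foundry Local service URL and the list endpoint documented in this
# reference; the response is a JSON array of model names, as shown above.
import requests

BASE_URL = "http://localhost:PORT"   # placeholder: your local service URL
LIST_PATH = "/openai/..."            # placeholder: the list endpoint from this reference

models = requests.get(BASE_URL + LIST_PATH, timeout=10).json()
for name in models:
    print(name)  # e.g. "Phi-4-mini-instruct-generic-cpu"
```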

articles/ai-foundry/includes/region-availability-maas.md

Lines changed: 7 additions & 0 deletions
@@ -114,6 +114,13 @@ TimeGEN-1 | [Microsoft Managed Countries/Regions](/partner-center/marketplace/t…
|---------|---------|---------|---------|
tsuzumi-7b | [Microsoft Managed Countries/Regions](/partner-center/marketplace/tax-details-marketplace#microsoft-managed-countriesregions) | East US 2 <br> South Central US <br> East US <br> West US 3 <br> West US <br> North Central US | East US 2 <br> East US <br> North Central US <br> South Central US <br> West US <br> West US 3 |
+
+### xAI
+
+| Model | Offer Availability Region | Hub/Project Region for Deployment | Hub/Project Region for Fine tuning |
+|---------|---------|---------|---------|
+grok-3 | [Microsoft Managed Countries/Regions](/partner-center/marketplace/tax-details-marketplace#microsoft-managed-countriesregions) | | Not available |
+grok-3-mini | [Microsoft Managed Countries/Regions](/partner-center/marketplace/tax-details-marketplace#microsoft-managed-countriesregions) | | Not available |

### Stability AI models
