Commit b4d7929

Merge branch 'main' of github.com:MicrosoftDocs/azure-ai-docs-pr into sdg-patches

2 parents ce4b09d + d09db52

84 files changed: +1975 −1108 lines

.openpublishing.redirection.json

Lines changed: 1 addition & 1 deletion

```diff
@@ -282,7 +282,7 @@
   },
   {
     "source_path": "articles/ai-services/index.yml",
-    "redirect_url": "articles/azure/ai-foundry",
+    "redirect_url": "/azure/ai-foundry",
     "redirect_document_id": false
   }
 ]
```

articles/ai-foundry/concepts/fine-tuning-overview.md

Lines changed: 99 additions & 23 deletions
Large diffs are not rendered by default.

articles/ai-foundry/concepts/models-featured.md

Lines changed: 9 additions & 0 deletions

```diff
@@ -348,6 +348,15 @@ The Stability AI collection of image generation models include Stable Image Core
 | [Stable Image Core](https://ai.azure.com/explore/models/Stable-Image-Core/version/1/registry/azureml-stabilityai) | Image generation | - **Input:** text (1000 tokens) <br /> - **Output:** 1 Image <br /> - **Tool calling:** No <br /> - **Response formats:** Image (PNG and JPG) |
 | [Stable Image Ultra](https://ai.azure.com/explore/models/Stable-Image-Ultra/version/1/registry/azureml-stabilityai) | Image generation | - **Input:** text (1000 tokens) <br /> - **Output:** 1 Image <br /> - **Tool calling:** No <br /> - **Response formats:** Image (PNG and JPG) |
 
+### xAI
+
+xAI's Grok 3 and Grok 3 Mini models are designed to excel in various enterprise domains. Grok 3, a non-reasoning model pre-trained by the Colossus datacenter, is tailored for business use cases such as data extraction, coding, and text summarization, with exceptional instruction-following capabilities. It supports a 131,072 token context window, allowing it to handle extensive inputs while maintaining coherence and depth, and is particularly adept at drawing connections across domains and languages. On the other hand, Grok 3 Mini is a lightweight reasoning model trained to tackle agentic, coding, mathematical, and deep science problems with test-time compute. It also supports a 131,072 token context window for understanding codebases and enterprise documents, and excels at using tools to solve complex logical problems in novel environments, offering raw reasoning traces for user inspection with adjustable thinking budgets.
+
+| Model | Type | Capabilities |
+| ------ | ---- | ------------ |
+| [grok-3](https://ai.azure.com/explore/models/grok-3/version/1/registry/azureml-xai) | chat-completion | - **Input:** text (131,072 tokens) <br /> - **Output:** text (131,072 tokens) <br /> - **Languages:** `en` <br /> - **Tool calling:** yes <br /> - **Response formats:** text |
+| [grok-3-mini](https://ai.azure.com/explore/models/grok-3-mini/version/1/registry/azureml-xai) | chat-completion | - **Input:** text (131,072 tokens) <br /> - **Output:** text (131,072 tokens) <br /> - **Languages:** `en` <br /> - **Tool calling:** yes <br /> - **Response formats:** text |
+
 #### Inference examples: Stability AI
 
 Stability AI models deployed via standard deployment implement the Foundry Models API on the route `/image/generations`.
```
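Both Grok entries added above cite a 131,072-token context window. As an illustration (not part of the Azure docs being diffed), the following sketch estimates whether a prompt fits that window. It uses the common ~4-characters-per-token rule of thumb, which is an assumption; accurate counts require the model's actual tokenizer.

```python
# Rough check of whether a prompt fits Grok 3's 131,072-token context window.
# The 4-characters-per-token ratio is a rule of thumb, not the real tokenizer,
# so treat the result as an estimate only.

GROK_CONTEXT_WINDOW = 131_072


def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Estimate the token count of `text` using a fixed chars-per-token ratio."""
    return max(1, round(len(text) / chars_per_token))


def fits_in_context(prompt: str, reserved_for_output: int = 4_096) -> bool:
    """Return True if the estimated prompt tokens, plus tokens reserved for
    the model's output, fit within the context window."""
    return estimate_tokens(prompt) + reserved_for_output <= GROK_CONTEXT_WINDOW


print(fits_in_context("Summarize this document."))  # small prompt: True
```

A check like this is only a guard rail for obviously oversized inputs; borderline cases should be tokenized properly before sending.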

articles/ai-foundry/foundry-local/get-started.md

Lines changed: 14 additions & 9 deletions

````diff
@@ -35,17 +35,22 @@ Also, ensure you have administrative privileges to install software on your device.
 
 Get started with Foundry Local quickly:
 
-1. [**Download Foundry Local Installer**](https://aka.ms/foundry-local-installer) and **install** by following the on-screen prompts.
-   > [!TIP]
-   > If you're installing on Windows, you can also use `winget` to install Foundry Local. Open a terminal window and run the following command:
-   >
-   > ```powershell
-   > winget install Microsoft.FoundryLocal
-   > ```
+1. **Install Foundry Local**
+   - **Windows**: Open a terminal and run the following command:
+     ```bash
+     winget install Microsoft.FoundryLocal
+     ```
+   - **macOS**: Open a terminal and run the following command:
+     ```bash
+     brew tap microsoft/foundrylocal
+     brew install foundrylocal
+     ```
+   Alternatively, you can download the installer from the [Foundry Local GitHub repository](https://aka.ms/foundry-local-installer).
 
 1. **Run your first model** Open a terminal window and run the following command to run a model:
 
    ```bash
-   foundry model run deepseek-r1-1.5b
+   foundry model run phi-3.5-mini
    ```
 
    The model downloads - which can take a few minutes, depending on your internet speed - and the model runs. Once the model is running, you can interact with it using the command line interface (CLI). For example, you can ask:
@@ -59,7 +64,7 @@ Get started with Foundry Local quickly:
 
 
 > [!TIP]
-> You can replace `deepseek-r1-1.5b` with any model name from the catalog (see `foundry model list` for available models). Foundry Local downloads the model variant that best matches your system's hardware and software configuration. For example, if you have an NVIDIA GPU, it downloads the CUDA version of the model. If you have a Qualcomm NPU, it downloads the NPU variant. If you have no GPU or NPU, it downloads the CPU version.
+> You can replace `phi-3.5-mini` with any model name from the catalog (see `foundry model list` for available models). Foundry Local downloads the model variant that best matches your system's hardware and software configuration. For example, if you have an NVIDIA GPU, it downloads the CUDA version of the model. If you have a Qualcomm NPU, it downloads the NPU variant. If you have no GPU or NPU, it downloads the CPU version.
 
 ## Explore commands
````

articles/ai-foundry/foundry-local/includes/integrate-examples/javascript.md

Lines changed: 4 additions & 4 deletions

```diff
@@ -32,7 +32,7 @@ import { FoundryLocalManager } from "foundry-local-sdk";
 // to your end-user's device.
 // TIP: You can find a list of available models by running the
 // following command in your terminal: `foundry model list`.
-const alias = "deepseek-r1-1.5b";
+const alias = "phi-3.5-mini";
 
 // Create a FoundryLocalManager instance. This will start the Foundry
 // Local service if it is not already running.
@@ -83,7 +83,7 @@ import { FoundryLocalManager } from "foundry-local-sdk";
 // to your end-user's device.
 // TIP: You can find a list of available models by running the
 // following command in your terminal: `foundry model list`.
-const alias = "deepseek-r1-1.5b";
+const alias = "phi-3.5-mini";
 
 // Create a FoundryLocalManager instance. This will start the Foundry
 // Local service if it is not already running.
@@ -133,7 +133,7 @@ import { FoundryLocalManager } from "foundry-local-sdk";
 // to your end-user's device.
 // TIP: You can find a list of available models by running the
 // following command in your terminal: `foundry model list`.
-const alias = "deepseek-r1-1.5b";
+const alias = "phi-3.5-mini";
 
 // Create a FoundryLocalManager instance. This will start the Foundry
 // Local service if it is not already running.
@@ -176,7 +176,7 @@ import { FoundryLocalManager } from "foundry-local-sdk";
 // to your end-user's device.
 // TIP: You can find a list of available models by running the
 // following command in your terminal: `foundry model list`.
-const alias = "deepseek-r1-1.5b";
+const alias = "phi-3.5-mini";
 
 // Create a FoundryLocalManager instance. This will start the Foundry
 // Local service if it is not already running.
```

articles/ai-foundry/foundry-local/includes/integrate-examples/python.md

Lines changed: 3 additions & 3 deletions

```diff
@@ -31,7 +31,7 @@ from foundry_local import FoundryLocalManager
 
 # By using an alias, the most suitable model will be downloaded
 # to your end-user's device.
-alias = "deepseek-r1-1.5b"
+alias = "phi-3.5-mini"
 
 # Create a FoundryLocalManager instance. This will start the Foundry
 # Local service if it is not already running and load the specified model.
@@ -66,7 +66,7 @@ from foundry_local import FoundryLocalManager
 
 # By using an alias, the most suitable model will be downloaded
 # to your end-user's device.
-alias = "deepseek-r1-1.5b"
+alias = "phi-3.5-mini"
 
 # Create a FoundryLocalManager instance. This will start the Foundry
 # Local service if it is not already running and load the specified model.
@@ -109,7 +109,7 @@ from foundry_local import FoundryLocalManager
 
 # By using an alias, the most suitable model will be downloaded
 # to your end-user's device.
-alias = "deepseek-r1-1.5b"
+alias = "phi-3.5-mini"
 
 # Create a FoundryLocalManager instance. This will start the Foundry
 # Local service if it is not already running and load the specified model.
```

articles/ai-foundry/foundry-local/includes/sdk-reference/javascript.md

Lines changed: 3 additions & 3 deletions

```diff
@@ -92,7 +92,7 @@ import { FoundryLocalManager } from "foundry-local-sdk";
 // to your end-user's device.
 // TIP: You can find a list of available models by running the
 // following command in your terminal: `foundry model list`.
-const alias = "deepseek-r1-1.5b";
+const alias = "phi-3.5-mini";
 
 const manager = new FoundryLocalManager()
 
@@ -141,7 +141,7 @@ import { FoundryLocalManager } from "foundry-local-sdk";
 // to your end-user's device.
 // TIP: You can find a list of available models by running the
 // following command in your terminal: `foundry model list`.
-const alias = "deepseek-r1-1.5b";
+const alias = "phi-3.5-mini";
 
 // Create a FoundryLocalManager instance. This will start the Foundry
 // Local service if it is not already running.
@@ -208,7 +208,7 @@ const endpoint = "ENDPOINT"
 
 const manager = new FoundryLocalManager({serviceUrl: endpoint})
 
-const alias = 'deepseek-r1-1.5b'
+const alias = 'phi-3.5-mini'
 
 // Get all available models
 const catalog = await manager.listCatalogModels()
```

articles/ai-foundry/foundry-local/includes/sdk-reference/python.md

Lines changed: 2 additions & 2 deletions

```diff
@@ -86,7 +86,7 @@ from foundry_local import FoundryLocalManager
 
 # By using an alias, the most suitable model will be selected
 # to your end-user's device.
-alias = "deepseek-r1-1.5b"
+alias = "phi-3.5-mini"
 
 # Create a FoundryLocalManager instance. This will start the Foundry.
 manager = FoundryLocalManager()
@@ -128,7 +128,7 @@ from foundry_local import FoundryLocalManager
 
 # By using an alias, the most suitable model will be downloaded
 # to your end-user's device.
-alias = "deepseek-r1-1.5b"
+alias = "phi-3.5-mini"
 
 # Create a FoundryLocalManager instance. This will start the Foundry
 # Local service if it is not already running and load the specified model.
```
(binary image file changed: −61.7 KB; filename not rendered in this view)

articles/ai-foundry/foundry-local/reference/reference-rest.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -313,7 +313,7 @@ Retrieves all available models, including both local models and registered external models.
 
 - Response body
   ```json
-  ["Phi-4-mini-instruct-generic-cpu", " deepseek-r1-distill-qwen-7b-generic-cpu"]
+  ["Phi-4-mini-instruct-generic-cpu", "phi-3.5-mini-instruct-generic-cpu"]
   ```
 
 ### GET /openai/load/{name}
@@ -395,7 +395,7 @@ Retrieves a list of currently loaded models.
 
 - Response body
   ```json
-  ["Phi-4-mini-instruct-generic-cpu", " deepseek-r1-distill-qwen-7b-generic-cpu"]
+  ["Phi-4-mini-instruct-generic-cpu", "phi-3.5-mini-instruct-generic-cpu"]
   ```
 
 ### GET /openai/getgpudevice
````
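The corrected response bodies above are bare JSON arrays of model-name strings (note the diff also drops a stray leading space inside the second name). As a minimal sketch of consuming such a response, the snippet below parses the documented sample body directly rather than calling a live Foundry Local service:

```python
import json

# Sample response body from GET /openai/models, as documented above.
body = '["Phi-4-mini-instruct-generic-cpu", "phi-3.5-mini-instruct-generic-cpu"]'

# The endpoint returns a bare JSON array, so json.loads yields a list of str.
models = json.loads(body)

for name in models:
    print(name)
```

In a real client you would fetch `body` from the service endpoint over HTTP; that part is omitted here since the base URL and port depend on the local installation.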
