
Commit d46027f

Merge pull request #3194 from santiagxf/santiagxf/create-resource-refresh
Update create resource tutorial with images
2 parents c5c6b44 + a8c63ec

File tree

3 files changed: +18 −16 lines


articles/ai-foundry/model-inference/includes/create-resources/bicep.md

Lines changed: 2 additions & 16 deletions
````diff
@@ -30,19 +30,9 @@ The files for this example are in:
 cd azureai-model-inference-bicep/infra
 ```
 
-## Understand the resources
-
-The tutorial helps you create:
-
-> [!div class="checklist"]
-> * An Azure AI Services resource.
-> * A model deployment in the Global standard SKU for each of the models supporting pay-as-you-go.
-> * (Optionally) An Azure AI project and hub.
-> * (Optionally) A connection between the hub and the models in Azure AI Services.
-
-Notice that **you have to deploy an Azure AI project and hub** if you plan to use the Azure AI Foundry portal for managing the resource, using playground, or any other feature from the portal.
+## Create the resources
 
-You are using the following assets to create those resources:
+Follow these steps:
 
 1. Use the template `modules/ai-services-template.bicep` to describe your Azure AI Services resource:
 
````

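For orientation, here is a minimal sketch of what a template like `modules/ai-services-template.bicep` could declare. This is an illustrative assumption, not the commit's actual template (which lives in the `azureai-model-inference-bicep` repository); the parameter names, SKU, and API version are placeholders.

```bicep
// Hypothetical sketch of an Azure AI Services resource template.
// The real modules/ai-services-template.bicep ships in the
// azureai-model-inference-bicep repository; names here are illustrative.
@description('Name of the Azure AI Services account')
param accountName string

@description('Azure region for the resource')
param location string = resourceGroup().location

resource account 'Microsoft.CognitiveServices/accounts@2023-05-01' = {
  name: accountName
  location: location
  kind: 'AIServices'
  sku: {
    name: 'S0'
  }
  properties: {
    // A custom subdomain is required for token-based authentication.
    customSubDomainName: toLower(accountName)
  }
}

output accountId string = account.id
output endpoint string = account.properties.endpoint
```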
```diff
@@ -72,10 +62,6 @@ You are using the following assets to create those resources:
 
 :::code language="bicep" source="~/azureai-model-inference-bicep/infra/modules/ai-services-connection-template.bicep":::
 
-## Create the resources
-
-In your console, follow these steps:
-
 1. Define the main deployment:
 
 __deploy-with-project.bicep__
```
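As a rough illustration of the "main deployment" step, a file like `deploy-with-project.bicep` typically wires the module templates together. The module path and parameter names below are assumptions for the sketch, not values taken from the repository.

```bicep
// Hypothetical outline of a main deployment file that consumes the
// module template sketched earlier; parameter names are illustrative.
param location string = resourceGroup().location

module aiServices 'modules/ai-services-template.bicep' = {
  name: 'aiServices'
  params: {
    accountName: 'my-ai-services'   // assumed parameter name and value
    location: location
  }
}
```

Such a template would then be deployed with the Azure CLI, for example `az deployment group create --resource-group <group-name> --template-file deploy-with-project.bicep`.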

articles/ai-foundry/model-inference/includes/create-resources/intro.md

Lines changed: 16 additions & 0 deletions
```diff
@@ -11,6 +11,22 @@ ms.topic: include
 
 In this article, you learn how to create the resources required to use Azure AI model inference and consume flagship models from Azure AI model catalog.
 
+## Understand the resources
+
+Azure AI model inference is a capability of Azure AI Services resources in Azure. You can create model deployments under the resource to consume their predictions. You can also connect the resource to Azure AI hubs and projects in Azure AI Foundry to create intelligent applications if needed. The following picture shows the high-level architecture.
+
+:::image type="content" source="../../media/create-resources/resources-architecture.png" alt-text="A diagram showing the high-level architecture of the resources created in the tutorial." lightbox="../../media/create-resources/resources-architecture.png":::
+
+Azure AI Services resources don't require AI projects or AI hubs to operate, and you can create them to consume flagship models from your applications. However, additional capabilities, such as the playground and agents, become available if you deploy an Azure AI project and hub.
+
+The tutorial helps you create:
+
+> [!div class="checklist"]
+> * An Azure AI Services resource.
+> * A model deployment for each of the models that support pay-as-you-go billing.
+> * (Optionally) An Azure AI project and hub.
+> * (Optionally) A connection between the hub and the models in Azure AI Services.
+
 ## Prerequisites
 
 To complete this article, you need:
```
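The checklist in intro.md mentions model deployments created under the Azure AI Services resource. As a hedged sketch of what such a deployment looks like in Bicep (the account name, model name, format, version, and API version are illustrative assumptions, not values from this commit):

```bicep
// Hypothetical model deployment under an existing Azure AI Services
// account; all names and versions below are illustrative.
resource account 'Microsoft.CognitiveServices/accounts@2023-05-01' existing = {
  name: 'my-ai-services'   // assumed account name
}

resource deployment 'Microsoft.CognitiveServices/accounts/deployments@2023-05-01' = {
  parent: account
  name: 'Mistral-Large'
  sku: {
    name: 'GlobalStandard'   // the Global standard SKU used for pay-as-you-go
    capacity: 1
  }
  properties: {
    model: {
      format: 'Mistral AI'
      name: 'Mistral-Large'
      version: '1'
    }
  }
}
```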
