articles/ai-foundry/model-inference/overview.md
7 additions & 7 deletions
@@ -7,7 +7,7 @@ author: msakande
 reviewer: santiagxf
 ms.service: azure-ai-model-inference
 ms.topic: concept-article
-ms.date: 1/21/2025
+ms.date: 01/24/2025
 ms.author: mopeakande
 ms.reviewer: fasantia
 ms.custom: generated
@@ -20,7 +20,7 @@ Azure AI model inference provides access to the most powerful models available i
 
 Azure AI model inference provides a way to **consume models as APIs without hosting them on your infrastructure**. Models are hosted in a Microsoft-managed infrastructure, which enables API-based access to the model provider's model. API-based access can dramatically reduce the cost of accessing a model and simplify the provisioning experience.
 
-Azure AI model inference is part of Azure AI Services and users can access the service through [REST APIs](../../ai-studio/reference/reference-model-inference-api.md), [SDKs in several languages](supported-languages.md)including Python, C#, JavaScript, and Java. It can also be used from [Azure AI Foundry by configuring a connection](how-to/configure-project-connection.md).
+Azure AI model inference is part of Azure AI Services, and users can access the service through [REST APIs](../../ai-studio/reference/reference-model-inference-api.md) and [SDKs in several languages](supported-languages.md), such as Python, C#, JavaScript, and Java. You can also use Azure AI model inference from [Azure AI Foundry by configuring a connection](how-to/configure-project-connection.md).
 
 ## Models
 
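To illustrate the SDK-based access that the updated paragraph describes, here is a minimal sketch using the `azure-ai-inference` Python package against an Azure AI model inference endpoint. The endpoint URL, environment variable names, and deployment name are placeholder assumptions for illustration, not values taken from this article.

```python
# Minimal sketch: chat completion through the Azure AI model inference endpoint.
# Assumes a resource endpoint and key are available as environment variables and
# that a model deployment named "my-model-deployment" already exists.
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_AI_ENDPOINT"],  # placeholder, e.g. the resource's models endpoint
    credential=AzureKeyCredential(os.environ["AZURE_AI_KEY"]),
)

response = client.complete(
    model="my-model-deployment",  # name of a deployment created on the resource (placeholder)
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Summarize what Azure AI model inference does."),
    ],
)

print(response.choices[0].message.content)
```

The same deployment can be reached with the other supported SDKs or the REST API; only the client library changes, not the deployment or the endpoint.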
@@ -42,9 +42,9 @@ You can get access to the key model providers in the industry including OpenAI,
 
 ## Pricing
 
-Models that are offered by non-Microsoft providers (for example, Meta AI and Mistral models) are billed through Azure Marketplace. For such models, you're required to subscribe to the particular model offering in accordance with the [Microsoft Commercial Marketplace Terms of Use](/legal/marketplace/marketplace-terms). Users accept license terms for use of the models. Pricing information for consumption is provided during deployment.
+For models from non-Microsoft providers (for example, Meta AI and Mistral models), billing is through Azure Marketplace. For such models, you're required to subscribe to the particular model offering in accordance with the [Microsoft Commercial Marketplace Terms of Use](/legal/marketplace/marketplace-terms). Users accept license terms for use of the models. Pricing information for consumption is provided during deployment.
 
-Models that are offered by Microsoft (for example, Phi-3 models and Azure OpenAI models) don't have this requirement, and they're billed via Azure meters as First Party Consumption Services. As described in the [Product Terms](https://www.microsoft.com/licensing/terms/welcome/welcomepage), you purchase First Party Consumption Services by using Azure meters, but they aren't subject to Azure service terms.
+For Microsoft models (for example, Phi-3 models and Azure OpenAI models), billing is via Azure meters as First Party Consumption Services. As described in the [Product Terms](https://www.microsoft.com/licensing/terms/welcome/welcomepage), you purchase First Party Consumption Services by using Azure meters, but they aren't subject to Azure service terms.
 
 > [!TIP]
 > Learn how to [monitor and manage cost](how-to/manage-costs.md) in Azure AI model inference.
@@ -53,7 +53,7 @@ Models that are offered by Microsoft (for example, Phi-3 models and Azure OpenAI
 
 At Microsoft, we're committed to the advancement of AI driven by principles that put people first. Generative models such as the ones available in Azure AI models have significant potential benefits, but without careful design and thoughtful mitigations, such models have the potential to generate incorrect or even harmful content.
 
-Microsoft has made significant investments to help guard against abuse and unintended harm. These investments include:
+Microsoft helps guard against abuse and unintended harm by taking the following actions:
 
 - Incorporating Microsoft's [principles for responsible AI use](https://www.microsoft.com/ai/responsible-ai)
 - Adopting a [code of conduct](/legal/cognitive-services/openai/code-of-conduct?context=/azure/ai-services/openai/context/context) for use of the service
@@ -62,9 +62,9 @@ Microsoft has made significant investments to help guard against abuse and unint
 
 ## Getting started
 
-Azure AI Models is a new feature offering on Azure AI Services resources. You can get started with it the same way as any other Azure product where you [create and configure your resource for Azure AI model inference](how-to/quickstart-create-resources.md), or instance of the service, in your Azure Subscription. You can create as many resources as needed and configure them independently in case you have multiple teams with different requirements.
+Azure AI model inference is a new feature offering on Azure AI Services resources. You can get started with it the same way as any other Azure product: you [create and configure your resource for Azure AI model inference](how-to/quickstart-create-resources.md), or an instance of the service, in your Azure subscription. You can create as many resources as needed and configure them independently if you have multiple teams with different requirements.
 
-Once you create an Azure AI Services resource, you must deploy a model before you can start making API calls. By default, no models are available on it so you can control which ones to start from. See the tutorial [Create your first model deployment in Azure AI model inference](how-to/create-model-deployments.md).
+Once you create an Azure AI Services resource, you must deploy a model before you can start making API calls. By default, no models are available on it, so you can control which ones to start from. See the tutorial [Create your first model deployment in Azure AI model inference](how-to/create-model-deployments.md).
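The "create a resource, then deploy a model" flow described above can also be scripted. The following is a hedged sketch using the `azure-mgmt-cognitiveservices` management SDK; the subscription, resource group, resource name, model name, version, format, and SKU values are illustrative assumptions rather than values from this article or the tutorial it links to.

```python
# Hedged sketch: create a model deployment on an existing Azure AI Services
# resource so that inference API calls can be made against it.
# All names and SKU/model values below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.cognitiveservices import CognitiveServicesManagementClient
from azure.mgmt.cognitiveservices.models import (
    Deployment,
    DeploymentModel,
    DeploymentProperties,
    Sku,
)

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
account_name = "<ai-services-resource-name>"

client = CognitiveServicesManagementClient(DefaultAzureCredential(), subscription_id)

# Describe the model to deploy; format/name/version depend on the catalog entry you choose.
deployment = Deployment(
    properties=DeploymentProperties(
        model=DeploymentModel(format="Microsoft", name="Phi-3-mini-4k-instruct", version="1"),
    ),
    sku=Sku(name="GlobalStandard", capacity=1),
)

poller = client.deployments.begin_create_or_update(
    resource_group, account_name, "phi-3-mini", deployment
)
print(poller.result().name)  # deployment name to use as the `model` parameter at inference time
```

Once the deployment completes, the deployment name is what you pass as the `model` parameter when calling the inference endpoint, as in the earlier SDK sketch.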