articles/ai-foundry/foundry-models/faq.yml
4 additions & 4 deletions
@@ -40,13 +40,13 @@ sections:
   - question: |
       Why aren't all the models in the Azure AI model catalog supported in Azure AI Foundry services?
     answer: |
-      Foundry Models support all the models in the Azure AI catalog having Standard billing. For more information, see [the Models article](concepts/models.md).
+      Foundry Models support all the models in the Azure AI catalog having Standard billing. For more information, see [the Models article](../model-inference/concepts/models.md).
 
       The Azure AI model catalog contains a wider list of models, however, those models require compute quota from your subscription. They also need to have a project or AI hub where to host the deployment. For more information, see [deployment options in Azure AI Foundry](../../ai-studio/concepts/deployments-overview.md).
   - question: |
       My company hasn't approved specific models for use. How can I prevent users from deploying them?
     answer: |
-      You can restrict the models available for deployment in Azure AI services by using the Azure policies. Models are listed in the catalog but any attempt to deploy them is blocked. Read [Control AI model deployment with custom policies](how-to/configure-deployment-policies.md).
+      You can restrict the models available for deployment in Azure AI services by using the Azure policies. Models are listed in the catalog but any attempt to deploy them is blocked. Read [Control AI model deployment with custom policies](../model-inference/how-to/configure-deployment-policies.md).
 - name: SDKs and programming languages
   questions:
   - question: |
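The policy answer in the hunk above says deployments of unapproved models can be blocked with custom Azure Policy definitions. As a rough sketch only: the `if`/`then`/`deny` structure below is standard Azure Policy, but the resource type and the `model.name` field alias are assumptions about the Cognitive Services deployment surface, not values taken from the linked article, and the `allowedModels` parameter name is hypothetical:

```json
{
  "mode": "All",
  "policyRule": {
    "if": {
      "allOf": [
        {
          "field": "type",
          "equals": "Microsoft.CognitiveServices/accounts/deployments"
        },
        {
          "not": {
            "field": "Microsoft.CognitiveServices/accounts/deployments/model.name",
            "in": "[parameters('allowedModels')]"
          }
        }
      ]
    },
    "then": {
      "effect": "deny"
    }
  },
  "parameters": {
    "allowedModels": {
      "type": "Array",
      "metadata": {
        "displayName": "Allowed model names",
        "description": "Models approved for deployment; anything else is denied."
      }
    }
  }
}
```

With a definition of this shape assigned at the subscription or resource-group scope, the models still appear in the catalog, but deployment attempts outside the allowed list fail at creation time.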
@@ -56,7 +56,7 @@ sections:
 
       Cohere SDK, Mistral SDK, and model provider-specific SDKs aren't supported when connected to Azure AI services.
 
-      For more information, see [supported SDKs and programming languages](supported-languages.md).
+      For more information, see [supported SDKs and programming languages](../model-inference/supported-languages.md).
   - question: |
       Do Foundry Models work with the latest Python library released by OpenAI (version>=1.0)?
     answer: |
@@ -84,7 +84,7 @@ sections:
     answer: |
       You're billed for inputs and outputs to the APIs, typically in tokens. There are no cost associated with the resource itself or the deployments.
 
-      The token price varies per each model and you're billed per 1,000 tokens. You can see the pricing details before deploying a given model. For more information about billing, see [Manage cost](how-to/manage-costs.md).
+      The token price varies per each model and you're billed per 1,000 tokens. You can see the pricing details before deploying a given model. For more information about billing, see [Manage cost](../model-inference/how-to/manage-costs.md).