
Commit fd24ba4

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into WI147804-mdc-m365
2 parents: 1dea1ac + 1eba687

File tree: 1,038 files changed (+9,599 / -36,071 lines)


.openpublishing.redirection.json

Lines changed: 780 additions & 0 deletions
Large diffs are not rendered by default.

articles/ai-services/cognitive-services-virtual-networks.md

Lines changed: 40 additions & 4 deletions
@@ -8,7 +8,7 @@ manager: nitinme
 ms.service: azure-ai-services
 ms.custom: devx-track-azurepowershell, devx-track-azurecli
 ms.topic: how-to
-ms.date: 08/10/2023
+ms.date: 10/27/2023
 ms.author: aahi
 ---
 
@@ -171,7 +171,7 @@ You can manage default network access rules for Azure AI services resources thro
 
 You can configure Azure AI services resources to allow access from specific subnets only. The allowed subnets might belong to a virtual network in the same subscription or in a different subscription. The other subscription can belong to a different Microsoft Entra tenant.
 
-Enable a *service endpoint* for Azure AI services within the virtual network. The service endpoint routes traffic from the virtual network through an optimal path to the Azure AI services service. For more information, see [Virtual Network service endpoints](../virtual-network/virtual-network-service-endpoints-overview.md).
+Enable a *service endpoint* for Azure AI services within the virtual network. The service endpoint routes traffic from the virtual network through an optimal path to the Azure AI service. For more information, see [Virtual Network service endpoints](../virtual-network/virtual-network-service-endpoints-overview.md).
 
 The identities of the subnet and the virtual network are also transmitted with each request. Administrators can then configure network rules for the Azure AI services resource to allow requests from specific subnets in a virtual network. Clients granted access by these network rules must continue to meet the authorization requirements of the Azure AI services resource to access the data.
 
@@ -505,13 +505,13 @@ You can use [private endpoints](../private-link/private-endpoint-overview.md) fo
 
 Private endpoints for Azure AI services resources let you:
 
-- Secure your Azure AI services resource by configuring the firewall to block all connections on the public endpoint for the Azure AI services service.
+- Secure your Azure AI services resource by configuring the firewall to block all connections on the public endpoint for the Azure AI service.
 - Increase security for the virtual network, by enabling you to block exfiltration of data from the virtual network.
 - Securely connect to Azure AI services resources from on-premises networks that connect to the virtual network by using [Azure VPN Gateway](../vpn-gateway/vpn-gateway-about-vpngateways.md) or [ExpressRoutes](../expressroute/expressroute-locations.md) with private-peering.
 
 ### Understand private endpoints
 
-A private endpoint is a special network interface for an Azure resource in your [virtual network](../virtual-network/virtual-networks-overview.md). Creating a private endpoint for your Azure AI services resource provides secure connectivity between clients in your virtual network and your resource. The private endpoint is assigned an IP address from the IP address range of your virtual network. The connection between the private endpoint and the Azure AI services service uses a secure private link.
+A private endpoint is a special network interface for an Azure resource in your [virtual network](../virtual-network/virtual-networks-overview.md). Creating a private endpoint for your Azure AI services resource provides secure connectivity between clients in your virtual network and your resource. The private endpoint is assigned an IP address from the IP address range of your virtual network. The connection between the private endpoint and the Azure AI service uses a secure private link.
 
 Applications in the virtual network can connect to the service over the private endpoint seamlessly. Connections use the same connection strings and authorization mechanisms that they would use otherwise. The exception is Speech Services, which require a separate endpoint. For more information, see [Private endpoints with the Speech Services](#use-private-endpoints-with-the-speech-service) in this article. Private endpoints can be used with all protocols supported by the Azure AI services resource, including REST.
 
@@ -560,6 +560,42 @@ For more information on configuring your own DNS server to support private endpo
 - [Name resolution that uses your own DNS server](../virtual-network/virtual-networks-name-resolution-for-vms-and-role-instances.md#name-resolution-that-uses-your-own-dns-server)
 - [DNS configuration](../private-link/private-endpoint-overview.md#dns-configuration)
 
+## Grant access to trusted Azure services for Azure OpenAI
+
+You can grant a subset of trusted Azure services access to Azure OpenAI, while maintaining network rules for other apps. These trusted services will then use managed identity to authenticate your Azure OpenAI service. The following table lists the services that can access Azure OpenAI if the managed identity of those services has the appropriate role assignment.
+
+
+|Service |Resource provider name |
+|---------|---------|
+|Azure AI Services | `Microsoft.CognitiveServices` |
+|Azure Machine Learning |`Microsoft.MachineLearningServices` |
+|Azure Cognitive Search | `Microsoft.Search` |
+
+
+You can grant networking access to trusted Azure services by creating a network rule exception using the REST API:
+```bash
+
+accessToken=$(az account get-access-token --resource https://management.azure.com --query "accessToken" --output tsv)
+rid="/subscriptions/<your subscription id>/resourceGroups/<your resource group>/providers/Microsoft.CognitiveServices/accounts/<your Azure AI resource name>"
+
+curl -i -X PATCH https://management.azure.com$rid?api-version=2023-10-01-preview \
+-H "Content-Type: application/json" \
+-H "Authorization: Bearer $accessToken" \
+-d \
+'
+{
+"properties":
+{
+    "networkAcls": {
+        "bypass": "AzureServices"
+    }
+}
+}
+'
+```
+
+To revoke the exception, set `networkAcls.bypass` to `None`.
+
 ### Pricing
 
 For pricing details, see [Azure Private Link pricing](https://azure.microsoft.com/pricing/details/private-link).
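Editor's note on the `networkAcls.bypass` lines added above: revoking the exception is the same management-plane PATCH with the value set back to `None`. Below is a minimal Python sketch of that revert call; the resource ID placeholders, the `2023-10-01-preview` API version, and the Azure CLI token flow are carried over from the bash example in the diff, while the use of `subprocess` and `urllib` here is purely illustrative and not part of the committed docs.

```python
# Hypothetical sketch: revoke the trusted-services exception by PATCHing
# networkAcls.bypass back to "None" (mirrors the curl example in the diff above).
import json
import subprocess
import urllib.request

# Assumption: the Azure CLI is installed and logged in, as in the bash example.
token = subprocess.run(
    ["az", "account", "get-access-token",
     "--resource", "https://management.azure.com",
     "--query", "accessToken", "--output", "tsv"],
    capture_output=True, text=True, check=True,
).stdout.strip()

# Placeholder resource ID, same shape as in the diff; fill in your own values.
rid = ("/subscriptions/<your subscription id>/resourceGroups/<your resource group>"
       "/providers/Microsoft.CognitiveServices/accounts/<your Azure AI resource name>")
url = f"https://management.azure.com{rid}?api-version=2023-10-01-preview"

body = json.dumps({"properties": {"networkAcls": {"bypass": "None"}}}).encode()
req = urllib.request.Request(
    url,
    data=body,
    method="PATCH",
    headers={"Content-Type": "application/json",
             "Authorization": f"Bearer {token}"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # prints the HTTP status code returned by the PATCH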

articles/ai-services/containers/disconnected-container-faq.yml

Lines changed: 4 additions & 4 deletions
@@ -31,7 +31,7 @@ sections:
       * Azure AI Vision - Read
       * Document Intelligence
 
-      For more information about commitment tier pricing, reach out to your Microsoft account team or contact at Microsoft. You can also email csgate@microsoft.com.
+      For more information about commitment tier pricing, reach out to your Microsoft account team or contact at Microsoft. You can also email azureaicontainergating@service.microsoft.com.
   - question: Will containers be available for other Azure AI services offerings, and what's the next set of containers I should expect?
     answer: |
       We'd like to make more Azure AI services offerings available as containers. Contact your local Microsoft account manager to get updates on new container releases and other Azure AI services announcements.
@@ -47,8 +47,8 @@ sections:
       * Application completed as instructed - Please pay close attention to guidance provided throughout the application to ensure you provide all the necessary information required for approval.
   - question: What if my use case can't satisfy the requirements listed above?
     answer: |
-      If your use case can't satisfy above requirements but you're interested in running containers on premises, you may be able to use [connected containers](../cognitive-services-container-support.md).
-  - question: What are some reasons my application may be denied?
+      If your use case can't satisfy above requirements but you're interested in running containers on premises, you might be able to use [connected containers](../cognitive-services-container-support.md).
+  - question: What are some reasons my application might be denied?
     answer: |
       Possible causes for a denied application are as follows:
       * Not being an existing Microsoft partner or enterprise agreement customer
@@ -62,7 +62,7 @@ sections:
 
       Once your application is approved, the Azure AI services gating team will communicate details for purchasing the commitment tier pricing, and instructions to download and run the containers.
 
-      If you have any questions on the application, gating process or other information needed, email csgate@microsoft.com
+      If you have any questions on the application, gating process or other information needed, email azureaicontainergating@service.microsoft.com
   - question: How do I download the disconnected containers?
     answer: |
       These containers are hosted on the Microsoft Container Registry and available for download on [Microsoft Artifact Registry](https://mcr.microsoft.com/catalog) and [Microsoft Container Registry (MCR)](https://mcr.microsoft.com/catalog?search=azure%20AI). You won't be able to run the container if your Azure subscription has not been approved after completion of the request form. Once you're approved, you can follow the steps in [Use Docker containers in disconnected environments](disconnected-containers.md)

articles/ai-services/language-service/conversational-language-understanding/concepts/best-practices.md

Lines changed: 8 additions & 0 deletions
@@ -121,6 +121,14 @@ Once the request is sent, you can track the progress of the training job in Lang
 > [!NOTE]
 > You have to retrain your model after updating the `confidenceThreshold` project setting. Afterwards, you'll need to republish the app for the new threshold to take effect.
 
+### Normalization in model version 2023-04-15
+
+In model version 2023-04-15, conversational language understanding provides normalization in the inference layer that doesn't affect training.
+
+The normalization layer normalizes the classification confidence scores to a confined range. The range selected currently is from `[-a,a]` where "a" is the square root of the number of intents. As a result, the normalization depends on the number of intents in the app. If there is a very low number of intents, the normalization layer has a very small range to work with. With a fairly large number of intents, the normalization is more effective.
+
+If this normalization doesn’t seem to help intents that are out of scope to the extent that the confidence threshold can be used to filter out of scope utterances, it might be related to the number of intents in the app. Consider adding more intents to the app, or if you are using an orchestrated architecture, consider merging apps that belong to the same domain together.
+
 ## Debugging composed entities
 
 Entities are functions that emit spans in your input with an associated type. The function is defined by one or more components. You can mark components as needed, and you can decide whether to enable the *combine components* setting. When you combine components, all spans that overlap will be merged into a single span. If the setting isn't used, each individual component span will be emitted.
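To make the `[-a,a]` range in the normalization note above concrete, here is a small illustrative Python sketch of how the bound scales with the number of intents. Only the range bounds are computed; the actual normalization function isn't described in the diff, so nothing beyond the stated square-root relationship is assumed.

```python
# Illustrative only: per the note above, the bound "a" of the [-a, a]
# normalization range is the square root of the number of intents in the app.
import math

for intent_count in (2, 5, 10, 50, 100):
    a = math.sqrt(intent_count)
    print(f"{intent_count:3d} intents -> scores normalized into [{-a:.2f}, {a:.2f}]")
```

With only a handful of intents the range is narrow, which is why the guidance above suggests adding more intents or merging apps that share a domain in an orchestrated setup.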

articles/ai-services/language-service/custom-named-entity-recognition/includes/resource-creation-powershell.md

Lines changed: 1 addition & 1 deletion
@@ -36,4 +36,4 @@ New-AzResourceGroupDeployment -Name ExampleDeployment -ResourceGroupName Example
   -TemplateParameterFile <path-to-parameters-file>
 ```
 
-See the ARM template documentation for information on [deploying templates](../../../../azure-resource-manager/templates/deploy-powershell.md#parameter-files) and [parameter files](../../../../azure-resource-manager/templates/parameter-files.md#parameter-file).
+See the ARM template documentation for information on [deploying templates](../../../../azure-resource-manager/templates/deploy-powershell.md#json-parameter-files) and [parameter files](../../../../azure-resource-manager/templates/parameter-files.md#parameter-file).

articles/ai-services/language-service/custom-text-classification/includes/resource-creation-powershell.md

Lines changed: 1 addition & 1 deletion
@@ -34,4 +34,4 @@ New-AzResourceGroupDeployment -Name ExampleDeployment -ResourceGroupName Example
   -TemplateParameterFile <path-to-parameters-file>
 ```
 
-See the ARM template documentation for information on [deploying templates](../../../../azure-resource-manager/templates/deploy-powershell.md#parameter-files) and [parameter files](../../../../azure-resource-manager/templates/parameter-files.md?tabs=json).
+See the ARM template documentation for information on [deploying templates](../../../../azure-resource-manager/templates/deploy-powershell.md#json-parameter-files) and [parameter files](../../../../azure-resource-manager/templates/parameter-files.md?tabs=json).

articles/ai-services/language-service/includes/custom/resource-creation-powershell.md

Lines changed: 1 addition & 1 deletion
@@ -36,4 +36,4 @@ New-AzResourceGroupDeployment -Name ExampleDeployment -ResourceGroupName Example
   -TemplateParameterFile <path-to-parameters-file>
 ```
 
-See the ARM template documentation for information on [deploying templates](../../../../azure-resource-manager/templates/deploy-powershell.md#parameter-files) and [parameter files](../../../../azure-resource-manager/templates/parameter-files.md#parameter-file).
+See the ARM template documentation for information on [deploying templates](../../../../azure-resource-manager/templates/deploy-powershell.md#json-parameter-files) and [parameter files](../../../../azure-resource-manager/templates/parameter-files.md#parameter-file).

articles/ai-services/openai/concepts/content-filter.md

Lines changed: 4 additions & 2 deletions
@@ -58,7 +58,7 @@ The default content filtering configuration is set to filter at the medium sever
 | High | If approved<sup>\*</sup>| If approved<sup>\*</sup> | Content detected at severity levels low and medium isn't filtered. Only content at severity level high is filtered. Requires approval<sup>\*</sup>.|
 | No filters | If approved<sup>\*</sup>| If approved<sup>\*</sup>| No content is filtered regardless of severity level detected. Requires approval<sup>\*</sup>.|
 
-<sup>\*</sup> Only customers who have been approved for modified content filtering have full content filtering control, including configuring content filters at severity level high only or turning content filters off. Apply for modified content filters via this form: [Azure OpenAI Limited Access Review: Modified Content Filters and Abuse Monitoring (microsoft.com)](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xURE01NDY1OUhBRzQ3MkQxMUhZSE1ZUlJKTiQlQCN0PWcu)
+<sup>\*</sup> Only customers who have been approved for modified content filtering have full content filtering control, including configuring content filters at severity level high only or turning content filters off. Apply for modified content filters via this form: [Azure OpenAI Limited Access Review: Modified Content Filtering (microsoft.com)](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xUMlBQNkZMR0lFRldORTdVQzQ0TEI5Q1ExOSQlQCN0PWcu)
 
 Content filtering configurations are created within a Resource in Azure AI Studio, and can be associated with Deployments. [Learn more about configurability here](../how-to/content-filters.md).
 
@@ -532,7 +532,9 @@ As part of your application design, consider the following best practices to del
 ## Next steps
 
 - Learn more about the [underlying models that power Azure OpenAI](../concepts/models.md).
-- Apply for modified content filters via [this form](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xURE01NDY1OUhBRzQ3MkQxMUhZSE1ZUlJKTiQlQCN0PWcu).
+- Apply for modified content filters via [this form](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xUMlBQNkZMR0lFRldORTdVQzQ0TEI5Q1ExOSQlQCN0PWcu).
 - Azure OpenAI content filtering is powered by [Azure AI Content Safety](https://azure.microsoft.com/products/cognitive-services/ai-content-safety).
 - Learn more about understanding and mitigating risks associated with your application: [Overview of Responsible AI practices for Azure OpenAI models](/legal/cognitive-services/openai/overview?context=/azure/ai-services/openai/context/context).
 - Learn more about how data is processed in connection with content filtering and abuse monitoring: [Data, privacy, and security for Azure OpenAI Service](/legal/cognitive-services/openai/data-privacy?context=/azure/ai-services/openai/context/context#preventing-abuse-and-harmful-content-generation).
+
+

articles/ai-services/openai/concepts/use-your-data.md

Lines changed: 1 addition & 1 deletion
@@ -122,7 +122,7 @@ You can modify the following additional settings in the **Data parameters** sect
 
 |Parameter name | Description |
 |---------|---------|
-|**Retrieved documents** | Specifies the number of top-scoring documents from your data index used to generate responses. You might want to increase the value when you have short documents or want to provide more context. The default value is 3. |
+|**Retrieved documents** | Specifies the number of top-scoring documents from your data index used to generate responses. You might want to increase the value when you have short documents or want to provide more context. The default value is 3. This is the `topNDocuments` parameter in the API. |
 | **Strictness** | Sets the threshold to categorize documents as relevant to your queries. Raising the value means a higher threshold for relevance and filters out more less-relevant documents for responses. Setting this value too high might cause the model to fail to generate responses due to limited available documents. The default value is 3. |
 
 ## Virtual network support & private endpoint support

articles/ai-services/openai/how-to/embeddings.md

Lines changed: 4 additions & 12 deletions
@@ -6,9 +6,9 @@ services: cognitive-services
 manager: nitinme
 ms.service: azure-ai-openai
 ms.topic: how-to
-ms.date: 9/12/2023
-author: ChrisHMSFT
-ms.author: chrhoder
+ms.date: 11/02/2023
+author: mrbullwinkle
+ms.author: mbullwin
 recommendations: false
 keywords:
 
@@ -75,15 +75,7 @@ foreach (float item in returnValue.Value.Data[0].Embedding)
 
 ### Verify inputs don't exceed the maximum length
 
-The maximum length of input text for our embedding models is 2048 tokens (equivalent to around 2-3 pages of text). You should verify that your inputs don't exceed this limit before making a request.
-
-### Choose the best model for your task
-
-For the search models, you can obtain embeddings in two ways. The `<search_model>-doc` model is used for longer pieces of text (to be searched over) and the `<search_model>-query` model is used for shorter pieces of text, typically queries or class labels in zero shot classification. You can read more about all of the Embeddings models in our [Models](../concepts/models.md) guide.
-
-### Replace newlines with a single space
-
-Unless you're embedding code, we suggest replacing newlines (\n) in your input with a single space, as we have observed inferior results when newlines are present.
+The maximum length of input text for our latest embedding models is 8192 tokens. You should verify that your inputs don't exceed this limit before making a request.
 
 ## Limitations & risks
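Since the updated guidance above asks you to verify that inputs stay under the 8192-token limit before making a request, here is a hedged Python sketch of such a pre-flight check. It assumes the `tiktoken` package and the `cl100k_base` encoding used by recent embedding models; confirm the correct encoding for the model you deploy.

```python
# Sketch of a pre-flight length check for embedding inputs (assumes tiktoken
# is installed and that cl100k_base matches your embedding model's tokenizer).
import tiktoken

MAX_EMBEDDING_TOKENS = 8192  # limit cited in the updated article text above

def check_input_length(text: str) -> int:
    encoding = tiktoken.get_encoding("cl100k_base")
    token_count = len(encoding.encode(text))
    if token_count > MAX_EMBEDDING_TOKENS:
        raise ValueError(
            f"Input is {token_count} tokens; the embedding limit is {MAX_EMBEDDING_TOKENS}."
        )
    return token_count

print(check_input_length("The food was delicious and the waiter was very friendly."))
```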
