
Commit 055203a

Merge pull request #270356 from MicrosoftDocs/repo_sync_working_branch

Confirm merge from repo_sync_working_branch to main to sync with https://github.com/MicrosoftDocs/azure-docs (branch main)

2 parents: 78596ee + 270eaa3

6 files changed (+24 −22 lines changed)

articles/ai-services/document-intelligence/faq.yml

Lines changed: 3 additions & 4 deletions
@@ -468,7 +468,7 @@ sections:
 - Advanced
-- **Contributor**: you need this role to create resource group or Document Intelligence resource.
+- **Contributor**: you need this role to create a resource group or Document Intelligence resource. The Contributor role doesn't allow you to list keys for Cognitive Services. To use Document Intelligence Studio, you still need the Cognitive Services User role.
 - For custom model projects, here are the role requirements for user scenarios.
@@ -480,10 +480,9 @@ sections:
 - Advanced
-- **Storage Account Contributor**: you need this role to the storage account to set up CORS settings (it's a one time effort if the same storage account is reused).
-
-- **Contributor**: you need this role to create resource group and resources.
+- **Storage Account Contributor**: you need this role to the storage account to set up CORS settings (it's a one-time effort if the same storage account is reused). The Contributor role doesn't allow you to access data in your blob. To use Document Intelligence Studio, you still need the Storage Blob Data Contributor role.
+- **Contributor**: you need this role to create a resource group and resources. The Contributor role doesn't give you access to use the created resources or storage. To use Document Intelligence Studio, you still need basic roles.
 - For more information, *see* [Microsoft Entra built-in roles](../../role-based-access-control/built-in-roles.md) and Azure role assignments sections in [Document Intelligence Studio](quickstarts/try-document-intelligence-studio.md) page.

articles/ai-services/document-intelligence/quickstarts/try-document-intelligence-studio.md

Lines changed: 1 addition & 1 deletion
@@ -28,7 +28,7 @@ monikerRange: '>=doc-intel-3.0.0'
 * A [**Document Intelligence**](https://portal.azure.com/#create/Microsoft.CognitiveServicesFormRecognizer) or [**multi-service**](https://portal.azure.com/#create/Microsoft.CognitiveServicesAllInOne) resource.

 > [!TIP]
-> Create an Azure AI services resource if you plan to access multiple Azure AI services under a single endpoint/key. For Document Intelligence access only, create a Document Intelligence resource. Please note that you'll need a single-service resource if you intend to use [Microsoft Entra authentication](../../../active-directory/authentication/overview-authentication.md).
+> Create an Azure AI services resource if you plan to access multiple Azure AI services under a single endpoint/key. For Document Intelligence access only, create a Document Intelligence resource. Currently [Microsoft Entra authentication](../../../active-directory/authentication/overview-authentication.md), is not supported on Document Intelligence Studio to access Document Intelligence service APIs. To use Document Intelligence Studio, enable access key authentication.

 #### Azure role assignments
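The updated tip directs Studio users to access key authentication. As a rough, self-contained illustration of what key-based access to the Document Intelligence REST API looks like, here's a minimal sketch; the endpoint, key, document URL, and `api-version` value are placeholders/assumptions (check your resource's docs for the right version), and the request is only assembled, never sent.

```python
# Sketch: key-based authentication for a Document Intelligence REST call.
# ENDPOINT/KEY are placeholders; the api-version shown is an assumption.
# The request is built locally but not sent.
import json
import urllib.request

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<your-access-key>"  # placeholder; keep real keys in a vault

url = (f"{ENDPOINT}/formrecognizer/documentModels/prebuilt-read:analyze"
       "?api-version=2023-07-31")  # assumed API version
body = json.dumps({"urlSource": "https://example.com/sample.pdf"}).encode()

req = urllib.request.Request(
    url,
    data=body,
    method="POST",
    headers={
        # Access key auth: the key travels in this header, not a bearer token.
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would submit the request with real values.
```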

articles/ai-services/openai/how-to/gpt-with-vision.md

Lines changed: 14 additions & 14 deletions
@@ -271,7 +271,7 @@ Send a POST request to `https://{RESOURCE_NAME}.openai.azure.com/openai/deployme

 The format is similar to that of the chat completions API for GPT-4, but the message content can be an array containing strings and images (either a valid HTTP or HTTPS URL to an image, or a base-64-encoded image).

-You must also include the `enhancements` and `dataSources` objects. `enhancements` represents the specific Vision enhancement features requested in the chat. It has a `grounding` and `ocr` property, which both have a boolean `enabled` property. Use these to request the OCR service and/or the object detection/grounding service. `dataSources` represents the Computer Vision resource data that's needed for Vision enhancement. It has a `type` property which should be `"AzureComputerVision"` and a `parameters` property. Set the `endpoint` and `key` to the endpoint URL and access key of your Computer Vision resource.
+You must also include the `enhancements` and `data_sources` objects. `enhancements` represents the specific Vision enhancement features requested in the chat. It has a `grounding` and `ocr` property, which both have a boolean `enabled` property. Use these to request the OCR service and/or the object detection/grounding service. `data_sources` represents the Computer Vision resource data that's needed for Vision enhancement. It has a `type` property which should be `"AzureComputerVision"` and a `parameters` property. Set the `endpoint` and `key` to the endpoint URL and access key of your Computer Vision resource.

 > [!IMPORTANT]
 > Remember to set a `"max_tokens"` value, or the return output will be cut off.
@@ -287,7 +287,7 @@ You must also include the `enhancements` and `dataSources` objects. `enhancement
       "enabled": true
     }
   },
-  "dataSources": [
+  "data_sources": [
   {
     "type": "AzureComputerVision",
     "parameters": {
@@ -323,11 +323,11 @@ You must also include the `enhancements` and `dataSources` objects. `enhancement

 #### [Python](#tab/python)

-You call the same method as in the previous step, but include the new *extra_body* parameter. It contains the `enhancements` and `dataSources` fields.
+You call the same method as in the previous step, but include the new *extra_body* parameter. It contains the `enhancements` and `data_sources` fields.

 `enhancements` represents the specific Vision enhancement features requested in the chat. It has a `grounding` and `ocr` field, which both have a boolean `enabled` property. Use these to request the OCR service and/or the object detection/grounding service.

-`dataSources` represents the Computer Vision resource data that's needed for Vision enhancement. It has a `type` field which should be `"AzureComputerVision"` and a `parameters` field. Set the `endpoint` and `key` to the endpoint URL and access key of your Computer Vision resource. R
+`data_sources` represents the Computer Vision resource data that's needed for Vision enhancement. It has a `type` field which should be `"AzureComputerVision"` and a `parameters` field. Set the `endpoint` and `key` to the endpoint URL and access key of your Computer Vision resource. R

 > [!IMPORTANT]
 > Remember to set a `"max_tokens"` value, or the return output will be cut off.
@@ -352,7 +352,7 @@ response = client.chat.completions.create(
     ] }
   ],
   extra_body={
-    "dataSources": [
+    "data_sources": [
      {
        "type": "AzureComputerVision",
        "parameters": {
@@ -583,7 +583,7 @@ To use a User assigned identity on your Azure AI Services resource, follow these
       "enabled": true
     }
   },
-  "dataSources": [
+  "data_sources": [
   {
     "type": "AzureComputerVisionVideoIndex",
     "parameters": {
@@ -616,15 +616,15 @@ To use a User assigned identity on your Azure AI Services resource, follow these
 }
 ```
-The request includes the `enhancements` and `dataSources` objects. `enhancements` represents the specific Vision enhancement features requested in the chat. `dataSources` represents the Computer Vision resource data that's needed for Vision enhancement. It has a `type` property which should be `"AzureComputerVisionVideoIndex"` and a `parameters` property which contains your AI Vision and video information.
+The request includes the `enhancements` and `data_sources` objects. `enhancements` represents the specific Vision enhancement features requested in the chat. `data_sources` represents the Computer Vision resource data that's needed for Vision enhancement. It has a `type` property which should be `"AzureComputerVisionVideoIndex"` and a `parameters` property which contains your AI Vision and video information.
 1. Fill in all the `<placeholder>` fields above with your own information: enter the endpoint URLs and keys of your OpenAI and AI Vision resources where appropriate, and retrieve the video index information from the earlier step.
 1. Send the POST request to the API endpoint. It should contain your OpenAI and AI Vision credentials, the name of your video index, and the ID and SAS URL of a single video.

 #### [Python](#tab/python)

-In your Python script, call the client's **create** method as in the previous sections, but include the *extra_body* parameter. Here, it contains the `enhancements` and `dataSources` fields. `enhancements` represents the specific Vision enhancement features requested in the chat. It has a `video` field, which has a boolean `enabled` property. Use this to request the video retrieval service.
+In your Python script, call the client's **create** method as in the previous sections, but include the *extra_body* parameter. Here, it contains the `enhancements` and `data_sources` fields. `enhancements` represents the specific Vision enhancement features requested in the chat. It has a `video` field, which has a boolean `enabled` property. Use this to request the video retrieval service.
-`dataSources` represents the external resource data that's needed for Vision enhancement. It has a `type` field which should be `"AzureComputerVisionVideoIndex"` and a `parameters` field.
+`data_sources` represents the external resource data that's needed for Vision enhancement. It has a `type` field which should be `"AzureComputerVisionVideoIndex"` and a `parameters` field.

 Set the `computerVisionBaseUrl` and `computerVisionApiKey` to the endpoint URL and access key of your Computer Vision resource. Set `indexName` to the name of your video index. Set `videoUrls` to a list of SAS URLs of your videos.
@@ -648,7 +648,7 @@ response = client.chat.completions.create(
     ] }
   ],
   extra_body={
-    "dataSources": [
+    "data_sources": [
     {
       "type": "AzureComputerVisionVideoIndex",
       "parameters": {
@@ -672,12 +672,12 @@ print(response)
 ---

 > [!IMPORTANT]
-> The `"dataSources"` object's content varies depending on which Azure resource type and authentication method you're using. See the following reference:
+> The `"data_sources"` object's content varies depending on which Azure resource type and authentication method you're using. See the following reference:
 >
 > #### [Azure OpenAI resource](#tab/resource)
 >
 > ```json
-> "dataSources": [
+> "data_sources": [
 >   {
 >     "type": "AzureComputerVisionVideoIndex",
 >     "parameters": {

@@ -692,7 +692,7 @@ print(response)
 > #### [Azure AIServices resource + SAS authentication](#tab/resource-sas)
 >
 > ```json
-> "dataSources": [
+> "data_sources": [
 >   {
 >     "type": "AzureComputerVisionVideoIndex",
 >     "parameters": {

@@ -705,7 +705,7 @@ print(response)
 > #### [Azure AIServices resource + Managed Identities](#tab/resource-mi)
 >
 > ```json
-> "dataSources": [
+> "data_sources": [
 >   {
 >     "type": "AzureComputerVisionVideoIndex",
 >     "parameters": {

articles/app-service/monitor-instances-health-check.md

Lines changed: 2 additions & 2 deletions
@@ -20,9 +20,9 @@ Note that _/api/health_ is just an example added for illustration purposes. We d
 ## What App Service does with Health checks

 - When given a path on your app, Health check pings this path on all instances of your App Service app at 1-minute intervals.
-- If an instance doesn't respond with a status code between 200-299 (inclusive) after 10 requests, App Service determines it's unhealthy and removes it from the load balancer for this Web App. The required number of failed requests for an instance to be deemed unhealthy is configurable to a minimum of two requests.
+- If a web app that's running on a given instance doesn't respond with a status code between 200-299 (inclusive) after 10 requests, App Service determines it's unhealthy and removes it from the load balancer for this Web App. The required number of failed requests for an instance to be deemed unhealthy is configurable to a minimum of two requests.
 - After removal, Health check continues to ping the unhealthy instance. If the instance begins to respond with a healthy status code (200-299), then the instance is returned to the load balancer.
-- If an instance remains unhealthy for one hour, it's replaced with a new instance.
+- If the web app that's running on an instance remains unhealthy for one hour, the instance is replaced with a new one.
 - When scaling out, App Service pings the Health check path to ensure new instances are ready.

 > [!NOTE]
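The behavior described hinges on the app answering its health path with a status in the 200-299 range. As a rough, self-contained sketch (standard library only; `/api/health` mirrors the article's illustrative example, not a required path), a minimal endpoint might look like this:

```python
# Minimal health-endpoint sketch using only the standard library. Any
# 200-299 response on the configured path tells App Service the instance
# is healthy; /api/health here is just the article's example path.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/health":
            body = json.dumps({"status": "healthy"}).encode()
            self.send_response(200)  # in the healthy 200-299 range
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)  # anything else is not the health path
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

# Exercise the endpoint locally on an ephemeral port.
server = HTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
with urllib.request.urlopen(
        f"http://127.0.0.1:{server.server_address[1]}/api/health") as resp:
    status = resp.status
server.shutdown()
```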

articles/programmable-connectivity/azure-programmable-connectivity-using-network-apis.md

Lines changed: 1 addition & 1 deletion
@@ -24,7 +24,7 @@ Create an APC Gateway, following instructions in [Create an APC Gateway](azure-p
 ## Obtain an authentication token

 1. Follow the instructions at [How to create a Service Principal](/entra/identity-platform/howto-create-service-principal-portal) to create an App Registration that can be used to access your APC Gateway.
-    - For the step "Assign a role to the application", go to the APC Gateway in the Azure portal and follow the instructions from `3. Select Access Control (IAM)` onwards. Assign the new App registration `Azure Programmable Connectivity Gateway User` and `Contributor` roles.
+    - For the step "Assign a role to the application", go to the APC Gateway in the Azure portal and follow the instructions from `3. Select Access Control (IAM)` onwards. Assign the new App registration the `Azure Programmable Connectivity Gateway Dataplane User` role.
     - At the step "Set up authentication", select "Option 3: Create a new client secret". Note the value of the secret as `CLIENT_SECRET`, and store it securely (for example in an Azure Key Vault).
     - After you have created the App registration, copy the value of Client ID from the Overview page, and note it as `CLIENT_ID`.
 2. Navigate to "Tenant Properties" in the Azure portal. Copy the value of Tenant ID, and note it as `TENANT`.
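With `CLIENT_ID`, `CLIENT_SECRET`, and `TENANT` noted, the token itself comes from the OAuth 2.0 client-credentials flow against the Microsoft identity platform token endpoint. A minimal sketch; the scope value is a placeholder assumption (check the APC documentation for the correct one), and the request is only assembled here, not sent.

```python
# Sketch: assemble an OAuth 2.0 client-credentials token request for the
# App Registration above. TENANT/CLIENT_ID/CLIENT_SECRET are the values
# noted earlier; the scope is a placeholder. Built but not sent.
import urllib.parse
import urllib.request

TENANT = "<TENANT>"                # Tenant ID
CLIENT_ID = "<CLIENT_ID>"          # App registration's Client ID
CLIENT_SECRET = "<CLIENT_SECRET>"  # read from a Key Vault in practice

token_url = f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token"
form = urllib.parse.urlencode({
    "grant_type": "client_credentials",
    "client_id": CLIENT_ID,
    "client_secret": CLIENT_SECRET,
    "scope": "<APC scope>/.default",  # placeholder scope
}).encode()

req = urllib.request.Request(token_url, data=form, method="POST")
# urllib.request.urlopen(req) would return JSON containing "access_token"
# when real values are supplied.
```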

articles/virtual-network/virtual-networks-name-resolution-for-vms-and-role-instances.md

Lines changed: 3 additions & 0 deletions
@@ -367,6 +367,9 @@ When you're using your own DNS servers, Azure enables you to specify multiple DN
 > [!NOTE]
 > Network connection properties, such as DNS server IPs, should not be edited directly within VMs. This is because they might get erased during service heal when the virtual network adaptor gets replaced. This applies to both Windows and Linux VMs.

+> [!NOTE]
+> Modifying the DNS suffix settings directly within the VMs can disrupt network connectivity, potentially causing traffic to the VMs to be interrupted or lost. To resolve this issue, a restart of the VMs is necessary.
+
 When you're using the Azure Resource Manager deployment model, you can specify DNS servers for a virtual network and a network interface. For details, see [Manage a virtual network](manage-virtual-network.md) and [Manage a network interface](virtual-network-network-interface.md).

 > [!NOTE]
