
Commit a1de5ec

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into patricka-cert-feedback
2 parents: 56796ca + 6e395fd

24 files changed: +343 −188 lines

articles/ai-services/computer-vision/includes/how-to-guides/analyze-image-40-rest.md

Lines changed: 1 addition & 1 deletion
@@ -25,7 +25,7 @@ To authenticate against the Image Analysis service, you need a Computer Vision k

 The SDK example assumes that you defined the environment variables `VISION_KEY` and `VISION_ENDPOINT` with your key and endpoint.

-Authentication is done by adding the HTTP request header **Ocp-Apim-Subscription-Key** and setting it to your vision key. The call is made to the URL `https://<endpoint>/computervision/imageanalysis:analyze&api-version=2023-10-01`, where `<endpoint>` is your unique computer vision endpoint URL. You add query strings based on your analysis options.
+Authentication is done by adding the HTTP request header **Ocp-Apim-Subscription-Key** and setting it to your vision key. The call is made to the URL `https://<endpoint>/computervision/imageanalysis:analyze?api-version=2023-10-01`, where `<endpoint>` is your unique computer vision endpoint URL. You add query strings based on your analysis options.

 ## Select the image to analyze
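The corrected call above (query string joined with `?`, not `&`) can be sketched end to end. This is a minimal illustration rather than official sample code: the header name and `api-version` come from the text, while the `features` value, placeholder endpoint, key, and example image URL are assumptions.

```python
import urllib.parse
import urllib.request


def build_analyze_request(endpoint: str, key: str, features: str = "caption,read"):
    """Assemble the URL and headers for an Image Analysis 4.0 analyze call.

    `endpoint` and `key` are placeholders for your Computer Vision resource
    values; `features` is one example of an analysis-option query string.
    """
    query = urllib.parse.urlencode(
        {"api-version": "2023-10-01", "features": features})
    url = f"{endpoint}/computervision/imageanalysis:analyze?{query}"
    headers = {
        # Authentication header described in the docs above.
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/json",
    }
    return url, headers


if __name__ == "__main__":
    url, headers = build_analyze_request("https://<endpoint>", "<your-vision-key>")
    # Hypothetical publicly reachable image URL to analyze.
    body = b'{"url": "https://example.com/image.jpg"}'
    req = urllib.request.Request(url, data=body, headers=headers, method="POST")
    print(urllib.request.urlopen(req).read())
```

With real resource values, the response is the JSON analysis result for the requested features.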

articles/ai-services/computer-vision/reference-video-search.md

Lines changed: 1 addition & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -334,6 +334,7 @@ Represents the create ingestion request model for the JSON document.
334334
| moderation | boolean | Gets or sets the moderation flag, indicating if the content should be moderated. | No |
335335
| generateInsightIntervals | boolean | Gets or sets the interval generation flag, indicating if insight intervals should be generated. | No |
336336
| filterDefectedFrames | boolean | Frame filter flag indicating frames will be evaluated and all defected (e.g. blurry, lowlight, overexposure) frames will be filtered out. | No |
337+
| includeSpeechTranscript | boolean | Gets or sets the transcript generation flag, indicating if transcript should be generated. | No |
337338

338339
### DatetimeFilterModel
339340
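The boolean flags in the table above travel in the create-ingestion JSON body. The following is a hedged sketch: only the flag names come from the table, while the `videos`/`documentUrl` wrapper is a hypothetical shape for illustration.

```python
import json


def build_ingestion_body(video_url: str, *, moderation: bool = False,
                         generate_insight_intervals: bool = False,
                         filter_defected_frames: bool = True,
                         include_speech_transcript: bool = True) -> str:
    """Serialize an example create-ingestion request body.

    The four keyword arguments map to the optional boolean flags in the
    request-model table; the surrounding structure is illustrative only.
    """
    body = {
        "videos": [{"documentUrl": video_url}],  # hypothetical wrapper shape
        "moderation": moderation,
        "generateInsightIntervals": generate_insight_intervals,
        "filterDefectedFrames": filter_defected_frames,
        "includeSpeechTranscript": include_speech_transcript,
    }
    return json.dumps(body)
```

Setting `include_speech_transcript=True` corresponds to the newly documented `includeSpeechTranscript` flag.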

articles/ai-services/openai/how-to/gpt-with-vision.md

Lines changed: 5 additions & 5 deletions
@@ -347,18 +347,18 @@ Every response includes a `"finish_details"` field. The subfield `"type"` has th

 If `finish_details.type` is `stop`, then there is another `"stop"` property that specifies the token that caused the output to end.

-## Low or high fidelity image understanding
+## Detail parameter settings in image processing: Low, High, Auto

-By controlling the _detail_ parameter, which has two options, `low` or `high`, you can control how the model processes the image and generates its textual understanding.
-- `low` disables the "high res" mode. The model receives a low-res 512x512 version of the image and represents the image with a budget of 65 tokens. This allows the API to return faster responses and consume fewer input tokens for use cases that don't require high detail.
-- `high` enables "high res" mode, which first allows the model to see the low res image and then creates detailed crops of input images as 512x512 squares based on the input image size. Each of the detailed crops uses twice the token budget (65 tokens) for a total of 129 tokens.
+The detail parameter in the model offers three choices: `low`, `high`, or `auto`, to adjust the way the model interprets and processes images. The default setting is `auto`, where the model decides between `low` or `high` based on the size of the image input.
+- `low` setting: the model does not activate the "high res" mode, instead processing a lower resolution 512x512 version of the image using 65 tokens, resulting in quicker responses and reduced token consumption for scenarios where fine detail isn't crucial.
+- `high` setting: activates "high res" mode. Here, the model initially views the low-resolution image and then generates detailed 512x512 segments from the input image. Each segment uses double the token budget, amounting to 129 tokens per segment, allowing for a more detailed interpretation of the image.

 ## Limitations

 ### Image support

 - **Limitation on image enhancements per chat session**: Enhancements cannot be applied to multiple images within a single chat call.
-- **Maximum input image size**: The maximum size for input images is restricted to 4 MB.
+- **Maximum input image size**: The maximum size for input images is restricted to 20 MB.
 - **Object grounding in enhancement API**: When the enhancement API is used for object grounding, and the model detects duplicates of an object, it will generate one bounding box and label for all the duplicates instead of separate ones for each.
 - **Low resolution accuracy**: When images are analyzed using the "low resolution" setting, it allows for faster responses and uses fewer input tokens for certain use cases. However, this could impact the accuracy of object and text recognition within the image.
 - **Image chat restriction**: When uploading images in the chat playground or the API, there is a limit of 10 images per chat call.
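As a rough illustration of the detail setting, here is a hedged helper that attaches a detail hint to a chat message. The three allowed values come from the rewritten section above; the exact payload layout (an `image_url` content part carrying a `detail` key) is an assumption and may differ across API versions.

```python
def image_message(text: str, image_url: str, detail: str = "auto") -> dict:
    """Build a user chat message carrying text plus an image with a detail hint.

    `detail` must be "low", "high", or "auto" (the default, where the service
    chooses low or high based on image size). The message shape follows the
    array-content chat format used elsewhere in these docs.
    """
    if detail not in ("low", "high", "auto"):
        raise ValueError("detail must be 'low', 'high', or 'auto'")
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": text},
            # Hypothetical placement of the detail hint on the image part.
            {"type": "image_url", "url": image_url, "detail": detail},
        ],
    }
```

Choosing `"low"` here trades recognition accuracy for faster, cheaper responses, as the limitations section notes.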

articles/ai-services/openai/includes/gpt-v-rest.md

Lines changed: 10 additions & 10 deletions
@@ -61,11 +61,11 @@ Create a new Python file named _quickstart.py_. Open the new file in your prefer

 endpoint = f"{base_url}/chat/completions?api-version=2023-12-01-preview"
 data = {
     "messages": [
-        { "role": "system", "content": "You are a helpful assistant." }, # Content can be a string, OR
-        { "role": "user", "content": [ # It can be an array containing strings and images.
-            "Describe this picture:",
-            { "image": "<base_64_encoded_image>" } # Images are represented like this.
-        ] }
+        { "role": "system", "content": "You are a helpful assistant." },
+        { "role": "user", "content": [
+            { "type": "text", "text": "Describe this picture:" },
+            { "type": "image_url", "url": "<URL or base-64-encoded image>" }
+        ] }
     ],
     "max_tokens": 100
 }

@@ -136,11 +136,11 @@ The **object grounding** integration brings a new layer to data analysis and use

     }
 }],
 "messages": [
-    { "role": "system", "content": "You are a helpful assistant." }, # Content can be a string, OR
-    { "role": "user", "content": [ # It can be an array containing strings and images.
-        "Describe this picture:",
-        { "image": "<base_64_encoded_image>" } # Images are represented like this.
-    ]}
+    { "role": "system", "content": "You are a helpful assistant." },
+    { "role": "user", "content": [
+        { "type": "text", "text": "Describe this picture:" },
+        { "type": "image_url", "url": "<URL or base-64-encoded image>" }
+    ]}
 ],
 "max_tokens": 100
 }
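The corrected message format can be wrapped in a small helper for reuse. This is a sketch of assembling the request body shown in the snippets above: the `build_chat_body` name and byte-encoding choice are ours; the payload shape is taken from the updated docs.

```python
import json


def build_chat_body(image_ref: str, prompt: str = "Describe this picture:") -> bytes:
    """Serialize the corrected chat-completions request body.

    `image_ref` can be a URL or a base-64-encoded image, per the updated
    message format above.
    """
    data = {
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url", "url": image_ref},
            ]},
        ],
        "max_tokens": 100,
    }
    return json.dumps(data).encode("utf-8")
```

The resulting bytes would be POSTed to the `endpoint` URL shown in the quickstart, with the usual authentication headers.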

articles/ai-studio/concepts/ai-resources.md

Lines changed: 7 additions & 2 deletions
@@ -18,9 +18,14 @@ ms.author: eur

 In Azure, resources enable access to Azure services for individuals and teams. Access to many Azure AI capabilities is available via a unified resource called Azure AI.

-An Azure AI resource can be used to access multiple Azure AI services. The Azure AI resource provides a hosted environment for teams to organize their [Azure AI project](#project-assets) work in, and is configurable with enterprise-grade security controls, which are passed down to each project environment. The Azure AI resource doesn't directly contain the keys and endpoints needed to authenticate your requests to Azure AI services. Instead, the Azure AI resource contains an [Azure AI services](#azure-ai-services-resource-keys) resource with keys and endpoints that you use to access Azure AI services.
+The preview 'Azure AI' resource used in AI Studio can be used to access multiple Azure AI services with a single setup. Previously, different Azure AI services, including [Azure OpenAI](../../ai-services/openai/overview.md), [Azure Machine Learning](../../machine-learning/overview-what-is-azure-machine-learning.md), and [Azure Speech](../../ai-services/speech-service/overview.md), each required its own setup.

-In this article, you learn more about its capabilities, and how to set up Azure AI for your organization. You can see the resources that have created in the [Azure portal](https://portal.azure.com/) and in [Azure AI Studio](https://ai.azure.com).
+The AI resource provides the working environment for a team to build and manage AI applications, catering to two personas:
+
+* For AI developers, the Azure AI resource provides the working environment for building AI applications, granting access to various tools for AI model building. The tools can be used together, and let you use and produce shareable components, including datasets, indexes, and models. An AI resource also lets you configure connections to external resources, and provides compute resources used by tools as well as [endpoints and access keys to pre-built AI models](#azure-ai-services-resource-keys).
+* For IT administrators and team leads, the Azure AI resource provides a single pane of glass on projects created by a team, a way to audit connections to external resources that are in use, and additional governance controls to help meet cost and compliance requirements. Security settings are configured on the Azure AI resource and, once set up, apply to all projects created under it, allowing administrators to let developers self-serve the creation of projects to organize their work.
+
+In this article, you learn more about the Azure AI resource's capabilities and how to set up Azure AI for your organization. You can see the resources that have been created in the [Azure portal](https://portal.azure.com/) and in [Azure AI Studio](https://ai.azure.com).

 ## Unified assets across projects
2631

articles/azure-arc/data/includes/azure-arc-data-preview-release.md

Lines changed: 5 additions & 3 deletions
@@ -5,13 +5,13 @@ ms.service: azure-arc

 ms.custom:
 - ignite-2023
 ms.topic: include
-ms.date: 12/06/2023
+ms.date: 12/12/2023
 ---

-<!--
 At this time, a test or preview build is not available for the next release.
--->

+<!--
+
 Dec 2023 preview release is now available.

 |Component|Value|

@@ -46,3 +46,5 @@ Dec 2023 preview release is now available.

 Arc SQL Server | Show the Data Processing Service (DPS) connectivity status in the Azure portal | GA

 Arc SQL Server | Monitoring | Add IOPS, Queue Latency Storage IO charts in Performance Dashboard in Azure portal
+
+-->

articles/azure-arc/data/version-log.md

Lines changed: 26 additions & 0 deletions
@@ -19,6 +19,32 @@ ms.topic: conceptual

 This article identifies the component versions with each release of Azure Arc-enabled data services.

+## December 12, 2023
+
+|Component|Value|
+|-----------|-----------|
+|Container images tag |`v1.26.0_2023-12-12`|
+|**CRD names and version:**| |
+|`activedirectoryconnectors.arcdata.microsoft.com`| v1beta1, v1beta2, v1, v2|
+|`datacontrollers.arcdata.microsoft.com`| v1beta1, v1 through v5|
+|`exporttasks.tasks.arcdata.microsoft.com`| v1beta1, v1, v2|
+|`failovergroups.sql.arcdata.microsoft.com`| v1beta1, v1beta2, v1, v2|
+|`kafkas.arcdata.microsoft.com`| v1beta1 through v1beta4|
+|`monitors.arcdata.microsoft.com`| v1beta1, v1, v3|
+|`postgresqls.arcdata.microsoft.com`| v1beta1 through v1beta6|
+|`postgresqlrestoretasks.tasks.postgresql.arcdata.microsoft.com`| v1beta1|
+|`sqlmanagedinstances.sql.arcdata.microsoft.com`| v1beta1, v1 through v13|
+|`sqlmanagedinstancemonitoringprofiles.arcdata.microsoft.com`| v1beta1, v1beta2|
+|`sqlmanagedinstancereprovisionreplicatasks.tasks.sql.arcdata.microsoft.com`| v1beta1|
+|`sqlmanagedinstancerestoretasks.tasks.sql.arcdata.microsoft.com`| v1beta1, v1|
+|`telemetrycollectors.arcdata.microsoft.com`| v1beta1 through v1beta5|
+|`telemetryrouters.arcdata.microsoft.com`| v1beta1 through v1beta5|
+|Azure Resource Manager (ARM) API version|2023-11-01-preview|
+|`arcdata` Azure CLI extension version|1.5.8 ([Download](https://aka.ms/az-cli-arcdata-ext))|
+|Arc-enabled Kubernetes helm chart extension version|1.26.0|
+|Azure Arc Extension for Azure Data Studio<br/>`arc`<br/>`azcli`|<br/>1.8.0 ([Download](https://aka.ms/ads-arcdata-ext))</br>1.8.0 ([Download](https://aka.ms/ads-azcli-ext))|
+|SQL Database version | 957 |
+
 ## November 14, 2023

 |Component|Value|

articles/azure-monitor/agents/azure-monitor-agent-migration-tools.md

Lines changed: 54 additions & 62 deletions
@@ -4,8 +4,8 @@ description: This article describes various migration tools and helpers availabl

 ms.topic: conceptual
 author: guywi-ms
 ms.author: guywild
-ms.reviewer: shseth
-ms.date: 1/18/2023
+ms.reviewer: jeffwo
+ms.date: 12/12/2023
 ms.custom:
 # Customer intent: As an Azure account administrator, I want to use the available Azure Monitor tools to migrate from Log Analytics Agent to Azure Monitor Agent and track the status of the migration in my account.
 ---

@@ -42,75 +42,67 @@ Use the DCR Config Generator tool to parse Log Analytics Agent configuration fro

 > [!NOTE]
 > DCR Config Generator does not currently support additional configuration for [Azure solutions or services](./azure-monitor-agent-migration.md#migrate-additional-services-and-features) dependent on Log Analytics Agent.

-### Prerequisites
-To install DCR Config Generator, you need:
-
-1. PowerShell version 5.1 or higher. We recommend using PowerShell version 7.1.3 or higher.
-1. Read access for the specified workspace resources.
-1. The `Az Powershell` module to pull workspace agent configuration information. Make sure `Az.Accounts` and `Az.OperationalInsights` modules are installed.
-1. The Azure credentials for running `Connect-AzAccount` and `Select-AzContext`, which set the context for the script to run.
-
-To install DCR Config Generator:
+### Prerequisites and setup
+
+1. PowerShell version 7.1.3 or higher is recommended (minimum version 5.1).
+2. The script uses the [Az PowerShell module](https://learn.microsoft.com/powershell/azure/install-azps-windows?view=azps-11.0.0&tabs=powershell&pivots=windows-psgallery) to pull workspace agent configuration information.
+3. You need read/write access to the specified workspace resource.
+4. `Connect-AzAccount` and `Select-AzSubscription` are used to set the context for the script to run, so proper Azure credentials are needed.

+### To install DCR Config Generator

 1. [Download the PowerShell script](https://github.com/microsoft/AzureMonitorCommunity/tree/master/Azure%20Services/Azure%20Monitor/Agents/Migration%20Tools/DCR%20Config%20Generator).

-1. Run the script:
-
-   Option 1: Outputs **ready-to-deploy ARM template files** only, which creates the generated DCR in the specified subscription and resource group, when deployed.
-
-   ```powershell
-   .\WorkspaceConfigToDCRMigrationTool.ps1 -SubscriptionId $subId -ResourceGroupName $rgName -WorkspaceName $workspaceName -DCRName $dcrName -Location $location -FolderPath $folderPath
-   ```
-
-   Option 2: Outputs **ready-to-deploy ARM template files** and **the DCR JSON files** separately for you to deploy via other means. You need to set the `GetDcrPayload` parameter.
-
-   ```powershell
-   .\WorkspaceConfigToDCRMigrationTool.ps1 -SubscriptionId $subId -ResourceGroupName $rgName -WorkspaceName $workspaceName -DCRName $dcrName -Location $location -FolderPath $folderPath -GetDcrPayload
-   ```
-
-   **Parameters**
-
-   | Parameter | Required? | Description |
-   |------|------|------|
-   | `SubscriptionId` | Yes | ID of the subscription that contains the target workspace. |
-   | `ResourceGroupName` | Yes | Resource group that contains the target workspace. |
-   | `WorkspaceName` | Yes | Name of the target workspace. |
-   | `DCRName` | Yes | Name of the new DCR. |
-   | `Location` | Yes | Region location for the new DCR. |
-   | `GetDcrPayload` | No | When set, it generates additional DCR JSON files. |
-   | `FolderPath` | No | Path in which to save the ARM template files and JSON files (optional). By default, Azure Monitor uses the current directory. |
-
-1. Review the output ARM template files. The script can produce two types of ARM template files, depending on the agent configuration in the target workspace:
-
-   - Windows ARM template and parameter files - if the target workspace contains Windows performance counters or Windows events.
-   - Linux ARM template and parameter files - if the target workspace contains Linux performance counters or Linux Syslog events.
-
-   If the Log Analytics workspace wasn't [configured to collect data](./log-analytics-agent.md#data-collected) from connected agents, the generated files will be empty. This is a scenario in which the agent was connected to a Log Analytics workspace, but wasn't configured to send any data from the host machine.
-
-1. Deploy the generated ARM templates:
-
-### [Portal](#tab/portal-1)
-1. In the portal's search box, type in *template* and then select **Deploy a custom template**.
+1. Run the script using the sample parameters below:
+
+   ```powershell
+   .\WorkspaceConfigToDCRMigrationTool.ps1 -SubscriptionId $subId -ResourceGroupName $rgName -WorkspaceName $workspaceName -DCRName $dcrName -OutputFolder $outputFolderPath
+   ```
+
+   | Name | Required | Description |
+   |:-----|:---------|:------------|
+   | `SubscriptionId` | Yes | The subscription ID of the workspace. |
+   | `ResourceGroupName` | Yes | The resource group of the workspace. |
+   | `WorkspaceName` | Yes | The name of the workspace (Azure resource IDs are case insensitive). |
+   | `DCRName` | Yes | The base name used for each of the output DCRs. |
+   | `OutputFolder` | No | The output folder path. If not provided, the working directory path is used. |
+
+1. Review the outputs. For each supported DCR type, the script produces a DCR ARM template (ready to be deployed) and a DCR payload (for users who don't need the ARM template). The currently supported DCR types are:
+
+   - **Windows**: contains `WindowsPerfCounters` and `WindowsEventLogs` data sources only.
+   - **Linux**: contains `LinuxPerfCounters` and `Syslog` data sources only.
+   - **Custom Logs**: contains `logFiles` data sources only. Each custom log gets its own DCR ARM template.
+   - **IIS Logs**: contains `iisLogs` data sources only.
+   - **Extensions**: contains `extensions` data sources only, along with any associated perfCounters data sources (for example, `VMInsights`). If you would like to add support for a new extension type, reach out to us.
+
+1. Deploy the generated ARM templates:
+
+   Portal
+
+   - In the portal's search box, type in *template* and then select **Deploy a custom template**.

    :::image type="content" source="../logs/media/tutorial-workspace-transformations-api/deploy-custom-template.png" lightbox="../logs/media/tutorial-workspace-transformations-api/deploy-custom-template.png" alt-text="Screenshot of the Deploy custom template screen.":::

-1. Select **Build your own template in the editor**.
+   - Select **Build your own template in the editor**.

    :::image type="content" source="../logs/media/tutorial-workspace-transformations-api/build-custom-template.png" lightbox="../logs/media/tutorial-workspace-transformations-api/build-custom-template.png" alt-text="Screenshot of the template editor.":::

-1. Paste the generated template into the editor and select **Save**.
-1. On the **Custom deployment** screen, specify a **Subscription**, **Resource group**, and **Region**.
-1. Select **Review + create** > **Create**.
-
-### [PowerShell](#tab/azure-powershell)
-
-```powershell-interactive
-New-AzResourceGroupDeployment -ResourceGroupName <resource-group-name> -TemplateFile <path-to-template>
-```
----
-
-> [!NOTE]
-> You can include up to 100 'counterSpecifiers' in a data collection rule. 'samplingFrequencyInSeconds' must be between 1 and 300, inclusive.
+   - Paste the generated template into the editor and select **Save**.
+   - On the **Custom deployment** screen, specify a **Subscription**, **Resource group**, and **Region**.
+   - Select **Review + create** > **Create**.
+
+   PowerShell
+
+   ```powershell-interactive
+   New-AzResourceGroupDeployment -ResourceGroupName <resource-group-name> -TemplateFile <path-to-template>
+   ```
+
+   > [!NOTE]
+   > You can include up to 100 'counterSpecifiers' in a data collection rule. 'samplingFrequencyInSeconds' must be between 1 and 300, inclusive.

 1. Associate machines to your data collection rules:
