Commit eb8c32b

Merge pull request #224609 from MicrosoftDocs/repo_sync_working_branch
Confirm merge from repo_sync_working_branch to main to sync with https://github.com/MicrosoftDocs/azure-docs (branch main)
2 parents e186a68 + f92da11 commit eb8c32b

11 files changed: +31 −13 lines

articles/aks/ingress-tls.md

Lines changed: 1 addition & 1 deletion
@@ -407,7 +407,7 @@ spec:
 To create the issuer, use the `kubectl apply` command.
 
 ```console
-kubectl apply -f cluster-issuer.yaml
+kubectl apply -f cluster-issuer.yaml --namespace ingress-basic
 ```
 
 ## Update your ingress routes
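For context, a minimal sketch of what the `cluster-issuer.yaml` being applied might contain, assuming cert-manager's ACME `ClusterIssuer` resource as used elsewhere in the AKS TLS article (the name, email, and ingress class below are placeholders, not taken from this diff):

```yaml
apiVersion: cert-manager.io/v1
kind: ClusterIssuer
metadata:
  name: letsencrypt
spec:
  acme:
    # Let's Encrypt production ACME endpoint
    server: https://acme-v02.api.letsencrypt.org/directory
    email: MY_EMAIL_ADDRESS          # placeholder
    privateKeySecretRef:
      name: letsencrypt
    solvers:
      - http01:
          ingress:
            class: nginx             # assumes the NGINX ingress controller
```

Note that a `ClusterIssuer` is cluster-scoped, so the `--namespace ingress-basic` flag added by this commit mainly keeps the command consistent with the rest of the article's namespaced workflow.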

articles/azure-monitor/alerts/alerts-create-new-alert-rule.md

Lines changed: 4 additions & 1 deletion
@@ -73,7 +73,7 @@ Then you define these elements for the resulting alert actions by using:
 
 1. (Optional) Depending on the signal type, you might see the **Split by dimensions** section.
 
-Dimensions are name-value pairs that contain more data about the metric value. By using dimensions, you can filter the metrics and monitor specific time-series, instead of monitoring the aggregate of all the dimensional values. Dimensions can be either number or string columns.
+Dimensions are name-value pairs that contain more data about the metric value. By using dimensions, you can filter the metrics and monitor specific time-series, instead of monitoring the aggregate of all the dimensional values.
 
 If you select more than one dimension value, each time series that results from the combination will trigger its own alert and be charged separately. For example, the transactions metric of a storage account can have an API name dimension that contains the name of the API called by each transaction (for example, GetBlob, DeleteBlob, and PutPage). You can choose to have an alert fired when there's a high number of transactions in a specific API (the aggregated data). Or you can use dimensions to alert only when the number of transactions is high for specific APIs.
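To make the billing implication concrete, here's a small illustrative sketch (the dimension values are hypothetical, and this is plain Python, not an Azure Monitor API call) of how selecting values for two dimensions multiplies the monitored time series:

```python
from itertools import product

# Hypothetical dimension values for a storage-account transactions metric.
# Each combination becomes its own time series, so each one can fire
# (and be charged as) a separate alert.
api_names = ["GetBlob", "DeleteBlob", "PutPage"]
geo_types = ["Primary", "Secondary"]

series = list(product(api_names, geo_types))
print(len(series))  # 3 APIs x 2 geo types = 6 monitored time series
```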

@@ -132,6 +132,9 @@ Then you define these elements for the resulting alert actions by using:
 If you select more than one dimension value, each time series that results from the combination triggers its own alert and is charged separately. The alert payload includes the combination that triggered the alert.
 
 You can select up to six more splittings for any columns that contain text or numbers.
+
+> [!NOTE]
+> Dimensions can **only** be number or string columns. If for example you want to use a dynamic column as a dimension, you need to convert it to a string first.
 
 You can also decide *not* to split when you want a condition applied to multiple resources in the scope. An example would be if you want to fire an alert if at least five machines in the resource group scope have CPU usage over 80 percent.
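For log alerts, the conversion the new note refers to can happen in the alert query itself. A hedged sketch (the table and property names below are hypothetical, not from this commit) using KQL's `tostring()` so a dynamic property can serve as a split-by dimension:

```kusto
// Cast a value out of a dynamic property bag to a string
// so it can be used as a dimension (split-by) column.
AppRequests
| extend ApiName = tostring(Properties["apiName"])
| summarize Count = count() by ApiName, bin(TimeGenerated, 5m)
```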

articles/cloud-services/cloud-services-dotnet-install-dotnet.md

Lines changed: 8 additions & 0 deletions
@@ -25,6 +25,7 @@ To install .NET on your web and worker roles, include the .NET web installer as
 ## Add the .NET installer to your project
 To download the web installer for the .NET Framework, choose the version that you want to install:
 
+* [.NET Framework 4.8.1 web installer](https://go.microsoft.com/fwlink/?linkid=2215256)
 * [.NET Framework 4.8 Web installer](https://go.microsoft.com/fwlink/?LinkId=2150985)
 * [.NET Framework 4.7.2 web installer](https://go.microsoft.com/fwlink/?LinkId=863262)
 * [.NET Framework 4.6.2 web installer](https://dotnet.microsoft.com/download/dotnet-framework/net462)

@@ -94,6 +95,7 @@ You can use startup tasks to perform operations before a role starts. Installing
 REM ***** To install .NET 4.7.1 set the variable netfx to "NDP471" ***** https://go.microsoft.com/fwlink/?LinkId=852095
 REM ***** To install .NET 4.7.2 set the variable netfx to "NDP472" ***** https://go.microsoft.com/fwlink/?LinkId=863262
 REM ***** To install .NET 4.8 set the variable netfx to "NDP48" ***** https://dotnet.microsoft.com/download/thank-you/net48
+REM ***** To install .NET 4.8.1 set the variable netfx to "NDP481" ***** https://go.microsoft.com/fwlink/?linkid=2215256
 set netfx="NDP48"
 
 REM ***** Set script start timestamp *****

@@ -109,6 +111,7 @@ You can use startup tasks to perform operations before a role starts. Installing
 set TEMP=%PathToNETFXInstall%
 
 REM ***** Setup .NET filenames and registry keys *****
+if %netfx%=="NDP481" goto NDP481
 if %netfx%=="NDP48" goto NDP48
 if %netfx%=="NDP472" goto NDP472
 if %netfx%=="NDP471" goto NDP471

@@ -156,6 +159,11 @@ You can use startup tasks to perform operations before a role starts. Installing
 set netfxregkey="0x80EA8"
 goto logtimestamp
 
+:NDP481
+set "netfxinstallfile=NDP481-Web.exe"
+set netfxregkey="0x82348"
+goto logtimestamp
+
 :logtimestamp
 REM ***** Setup LogFile with timestamp *****
 md "%PathToNETFXInstall%\log"
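For the batch script above to run before the role starts, it has to be registered as a startup task in the service definition. A minimal sketch of the `ServiceDefinition.csdef` entry (the script file name `install.cmd` is an assumption for illustration):

```xml
<Startup>
  <!-- Runs the .NET installer script with elevated rights.
       taskType="simple" blocks role startup until the task finishes,
       which is what an installer usually wants.
       "install.cmd" is a placeholder for wherever the script above is saved. -->
  <Task commandLine="install.cmd" executionContext="elevated" taskType="simple" />
</Startup>
```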

articles/cognitive-services/openai/overview.md

Lines changed: 6 additions & 6 deletions
@@ -96,23 +96,23 @@ The models used by the Azure OpenAI service use natural language instructions an
 
 There are three main approaches for in-context learning: Few-shot, one-shot and zero-shot. These approaches vary based on the amount of task-specific data that is given to the model:
 
-**Few-shot**: In this case, a user includes several examples in the call prompt that demonstrate the expected answer format and content. The following example shows a few-shot prompt where we provide multiple examples:
+**Few-shot**: In this case, a user includes several examples in the call prompt that demonstrate the expected answer format and content. The following example shows a few-shot prompt where we provide multiple examples (the model will generate the last answer):
 
 ```
 Convert the questions to a command:
-Q: Ask Constance if we need some bread
+Q: Ask Constance if we need some bread.
 A: send-msg `find constance` Do we need some bread?
 Q: Send a message to Greg to figure out if things are ready for Wednesday.
 A: send-msg `find greg` Is everything ready for Wednesday?
-Q: Ask Ilya if we're still having our meeting this evening
+Q: Ask Ilya if we're still having our meeting this evening.
 A: send-msg `find ilya` Are we still having a meeting this evening?
-Q: Contact the ski store and figure out if I can get my skis fixed before I leave on Thursday
+Q: Contact the ski store and figure out if I can get my skis fixed before I leave on Thursday.
 A: send-msg `find ski store` Would it be possible to get my skis fixed before I leave on Thursday?
-Q: Thank Nicolas for lunch
+Q: Thank Nicolas for lunch.
 A: send-msg `find nicolas` Thank you for lunch!
 Q: Tell Constance that I won't be home before 19:30 tonight — unmovable meeting.
 A: send-msg `find constance` I won't be home before 19:30 tonight. I have a meeting I can't move.
-Q: Tell John that I need to book an appointment at 10:30
+Q: Tell John that I need to book an appointment at 10:30.
 A:
 ```
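Programmatically, a few-shot prompt like the one in this hunk is just concatenated example pairs followed by the new query, ending at the `A:` the model is expected to complete. A minimal illustrative sketch in plain Python (string assembly only; no Azure OpenAI call is made, and the example pairs are a subset chosen for brevity):

```python
# Build a few-shot prompt from (question, command) example pairs.
examples = [
    ("Ask Constance if we need some bread.",
     "send-msg `find constance` Do we need some bread?"),
    ("Thank Nicolas for lunch.",
     "send-msg `find nicolas` Thank you for lunch!"),
]
query = "Tell John that I need to book an appointment at 10:30."

prompt = "Convert the questions to a command:\n"
prompt += "".join(f"Q: {q}\nA: {a}\n" for q, a in examples)
prompt += f"Q: {query}\nA:"  # the model generates text after the final 'A:'

print(prompt)
```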

articles/dms/migration-using-azure-data-studio.md

Lines changed: 2 additions & 0 deletions
@@ -160,6 +160,8 @@ To monitor database migrations in the Azure portal:
 
 - You can't use an existing self-hosted integration runtime that was created in Azure Data Factory for database migrations with Database Migration Service. Initially, create the self-hosted integration runtime by using the Azure SQL Migration extension for Azure Data Studio. You can reuse that self-hosted integration runtime in future database migrations.
 
+- Azure Data Studio currently supports both Azure Active Directory (Azure AD)/Windows authentication and SQL logins for connecting to the source SQL Server instance. For the Azure SQL targets, only SQL logins are supported.
+
 ## Pricing
 
 - Azure Database Migration Service is free to use with the Azure SQL Migration extension for Azure Data Studio. You can migrate multiple SQL Server databases by using Database Migration Service at no charge.

articles/machine-learning/concept-automated-ml.md

Lines changed: 1 addition & 1 deletion
@@ -33,7 +33,7 @@ Using **Azure Machine Learning**, you can design and run your automated ML train
 
 1. **Identify the ML problem** to be solved: classification, forecasting, regression, computer vision or NLP.
 
-1. **Choose whether you want to a code-first experience or a no-code studio web experience**: Users who prefer a code-first experience can use the [AzureML SDKv2](how-to-configure-auto-train.md) or the [AzureML CLIv2](how-to-train-cli.md). Get started with [Tutorial: Train an object detection model with AutoML and Python](tutorial-auto-train-image-models.md). Users who prefer a limited/no-code experience can use the [web interface](how-to-use-automated-ml-for-ml-models.md) in Azure Machine Learning studio at [https://ml.azure.com](https://ml.azure.com/). Get started with [Tutorial: Create a classification model with automated ML in Azure Machine Learning](tutorial-first-experiment-automated-ml.md).
+1. **Choose whether you want a code-first experience or a no-code studio web experience**: Users who prefer a code-first experience can use the [AzureML SDKv2](how-to-configure-auto-train.md) or the [AzureML CLIv2](how-to-train-cli.md). Get started with [Tutorial: Train an object detection model with AutoML and Python](tutorial-auto-train-image-models.md). Users who prefer a limited/no-code experience can use the [web interface](how-to-use-automated-ml-for-ml-models.md) in Azure Machine Learning studio at [https://ml.azure.com](https://ml.azure.com/). Get started with [Tutorial: Create a classification model with automated ML in Azure Machine Learning](tutorial-first-experiment-automated-ml.md).
 
 1. **Specify the source of the labeled training data**: You can bring your data to AzureML in [many different ways](concept-data.md).
articles/machine-learning/resource-curated-environments.md

Lines changed: 3 additions & 0 deletions
@@ -51,6 +51,9 @@ The following configurations are supported:
 | AzureML-ACPT-pytorch-1.11-py38-cuda11.5-gpu | Ubuntu 20.04 | cu115 | 3.8 | 1.11.0 | 1.11.1 | 0.7.3 | 1.11.0 |
 | AzureML-ACPT-pytorch-1.11-py38-cuda11.3-gpu | Ubuntu 20.04 | cu113 | 3.8 | 1.11.0 | 1.11.1 | 0.7.3 | 1.11.0 |
 
+> [!NOTE]
+> Currently, due to underlying cuda and cluster incompatibilities, on [NC series](../virtual-machines/nc-series.md) only AzureML-ACPT-pytorch-1.11-py38-cuda11.3-gpu with cuda 11.3 can be used.
+
 ### PyTorch
 
 **Name**: AzureML-pytorch-1.10-ubuntu18.04-py38-cuda11-gpu

articles/service-bus-messaging/service-bus-performance-improvements.md

Lines changed: 1 addition & 1 deletion
@@ -323,7 +323,7 @@ There are some challenges with having a greedy approach, that is, keeping the pr
 
 ## Multiple queues or topics
 
-If a single queue or topic can't handle the expected, use multiple messaging entities. When using multiple entities, create a dedicated client for each entity, instead of using the same client for all entities.
+If a single queue or topic can't handle the expected number of messages, use multiple messaging entities. When using multiple entities, create a dedicated client for each entity, instead of using the same client for all entities.
 
 More queues or topics mean that you have more entities to manage at deployment time. From a scalability perspective, there really isn't too much of a difference that you would notice as Service Bus already spreads the load across multiple logs internally, so if you use six queues or topics or two queues or topics won't make a material difference.

articles/storage/common/manage-storage-analytics-logs.md

Lines changed: 2 additions & 2 deletions
@@ -44,7 +44,7 @@ You can instruct Azure Storage to save diagnostics logs for read, write, and del
 3. Ensure **Status** is set to **On**, and select the **services** for which you'd like to enable logging.
 
 > [!div class="mx-imgBorder"]
-> ![Configure logging in the Azure portal.](./media/manage-storage-analytics-logs/enable-diagnostics.png)
+> ![Configure logging in the Azure portal.](./media/manage-storage-analytics-logs/enable-diagnostics-retention.png)
 
 4. To retain logs, ensure that the **Delete data** check box is selected. Then, set the number of days that you would like log data to be retained by moving the slider control beneath the check box, or by directly modifying the value that appears in the text box next to the slider control. The default for new storage accounts is seven days. If you do not want to set a retention policy, leave the **Delete data** checkbox unchecked. If there is no retention policy, it is up to you to delete the log data.

@@ -141,7 +141,7 @@ Log data can accumulate in your account over time which can increase the cost of
 3. Ensure that the **Delete data** check box is selected. Then, set the number of days that you would like log data to be retained by moving the slider control beneath the check box, or by directly modifying the value that appears in the text box next to the slider control.
 
 > [!div class="mx-imgBorder"]
-> ![Modify the retention period in the Azure portal](./media/manage-storage-analytics-logs/modify-retention-period.png)
+> ![Modify the retention period in the Azure portal](./media/manage-storage-analytics-logs/enable-diagnostics-retention.png)
 
 The default number of days for new storage accounts is seven days. If you do not want to set a retention policy, leave the **Delete data** checkbox unchecked. If there is no retention policy, it is up to you to delete the monitoring data.

(11th changed file: binary file, 23.4 KB; no text diff shown.)
