Commit 410336e

Merge pull request #203713 from MicrosoftDocs/repo_sync_working_branch
Confirm merge from repo_sync_working_branch to main to sync with https://github.com/MicrosoftDocs/azure-docs (branch main)
2 parents 4147aa3 + b335af8 commit 410336e

File tree

16 files changed (+63 / -31 lines)

articles/api-management/authorizations-how-to.md

Lines changed: 1 addition & 1 deletion
@@ -73,7 +73,7 @@ Four steps are needed to set up an authorization with the authorization code gra
 1. Sign in to your GitHub account if you're prompted to do so.
 1. Select **Authorize** so that the application can access the signed-in user’s account.

-:::image type="content" source="media/authorizations-how-to/consent-to-authorization.png" alt-text="Screenshot of consenting to authorize with Github.":::
+:::image type="content" source="media/authorizations-how-to/consent-to-authorization.png" alt-text="Screenshot of consenting to authorize with GitHub.":::

 After authorization, the browser is redirected to API Management and the window is closed. If prompted during redirection, select **Allow access**. In API Management, select **Next**.
 1. On the **Access policy** page, create an access policy so that API Management has access to use the authorization. Ensure that a managed identity is configured for API Management. [Learn more about managed identities in API Management](api-management-howto-use-managed-service-identity.md#create-a-system-assigned-managed-identity).

articles/app-service/environment/using-an-ase.md

Lines changed: 1 addition & 2 deletions
@@ -116,7 +116,7 @@ In an ILB ASE, the domain suffix used for app creation is *.<asename>.apps

 For information about how to create an ILB ASE, see [Create and use an ILB ASE][MakeILBASE].

-The SCM URL is used to access the Kudu console or for publishing your app by using Web Deploy. For information on the Kudu console, see [Kudu console for Azure App Service][Kudu]. The Kudu console gives you a web UI for debugging, uploading files, editing files, and much more.
+The SCM URL is used to access the Kudu console or for publishing your app by using Web Deploy. The Kudu console gives you a web UI for debugging, uploading files, editing files, and much more.

 ### DNS configuration

@@ -300,7 +300,6 @@ For more specific examples, use: az find "az appservice ase"
 [Pricing]: https://azure.microsoft.com/pricing/details/app-service/
 [ARMOverview]: ../../azure-resource-manager/management/overview.md
 [ConfigureSSL]: ../configure-ssl-certificate.md
-[Kudu]: https://azure.microsoft.com/resources/videos/super-secret-kudu-debug-console-for-azure-web-sites/
 [AppDeploy]: ../deploy-local-git.md
 [ASEWAF]: ./integrate-with-application-gateway.md
 [AppGW]: ../../web-application-firewall/ag/ag-overview.md

articles/azure-cache-for-redis/cache-best-practices-memory-management.md

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ Add monitoring on memory usage to ensure that you don't run out of memory and ha

 Configure your [maxmemory-reserved setting](cache-configure.md#memory-policies) to improve system responsiveness:

-- A sufficient reservation setting is especially important for write-heavy workloads or if you're storing values of 100 KB or more in your cache. By default when you create a cache, 10% of the available memory is reserved for `maxmemory-reserved`. Another 10% is reserved for `maxfragmentationmemory-reserved`. You can increase the amount reserved if you have write-heavy loads.
+- A sufficient reservation setting is especially important for write-heavy workloads or if you're storing values of 100 KB or more in your cache. By default when you create a cache, approximately 10% of the available memory is reserved for `maxmemory-reserved`. Another 10% is reserved for `maxfragmentationmemory-reserved`. You can increase the amount reserved if you have write-heavy loads.

 - The `maxmemory-reserved` setting configures the amount of memory, in MB per instance in a cluster, that is reserved for non-cache operations, such as replication during failover. Setting this value allows you to have a more consistent Redis server experience when your load varies. This value should be set higher for workloads that write large amounts of data. When memory is reserved for such operations, it's unavailable for storage of cached data. The allowed range for `maxmemory-reserved` is 10% - 60% of `maxmemory`. If you try to set these values lower than 10% or higher than 60%, they are re-evaluated and set to the 10% minimum and 60% maximum. The values are rendered in megabytes.
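The 10% - 60% clamping rule this change documents can be sketched as follows (a minimal illustration of the described behavior; the function name and MB arithmetic are assumptions, not the Redis implementation):

```python
def clamp_maxmemory_reserved(requested_mb: float, maxmemory_mb: float) -> float:
    """Clamp a maxmemory-reserved request to the allowed 10%-60% band of
    maxmemory; out-of-range values are re-evaluated to the nearest bound."""
    minimum = 0.10 * maxmemory_mb  # 10% floor
    maximum = 0.60 * maxmemory_mb  # 60% ceiling
    return min(max(requested_mb, minimum), maximum)

# For a hypothetical 1024-MB cache: a 50-MB request is raised to the
# 10% floor, and a 700-MB request is capped at the 60% ceiling.
```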

articles/azure-monitor/alerts/alerts-types.md

Lines changed: 1 addition & 1 deletion
@@ -25,7 +25,7 @@ This table can help you decide when to use what type of alert. For more detailed

 |Alert Type |When to Use |Pricing Information|
 |---------|---------|---------|
-|Metric alert|Metric alerts are useful when you want to be alerted about data that requires little or no manipulation. Metric data is stored in the system already pre-computed, so metric alerts are less expensive than log alerts. If the data you want to monitor is available in metric data, you would want to metric alerts.|Each metrics alert rule is charged based on the number of time-series that are monitored. |
+|Metric alert|Metric alerts are useful when you want to be alerted about data that requires little or no manipulation. Metric data is stored in the system already pre-computed, so metric alerts are less expensive than log alerts. If the data you want to monitor is available in metric data, using metric alerts is recommended.|Each metrics alert rule is charged based on the number of time-series that are monitored. |
 |Log alert|Log alerts allow you to perform advanced logic operations on your data. If the data you want to monitor is available in logs, or requires advanced logic, you can use the robust features of KQL for data manipulation using log alerts. Log alerts are more expensive than metric alerts.|Each Log Alert rule is billed based the interval at which the log query is evaluated (more frequent query evaluation results in a higher cost). Additionally, for Log Alerts configured for [at scale monitoring](#splitting-by-dimensions-in-log-alert-rules), the cost will also depend on the number of time series created by the dimensions resulting from your query. |
 |Activity Log alert|Activity logs provide auditing of all actions that occurred on resources. Use activity log alerts to be alerted when a specific event happens to a resource, for example, a restart, a shutdown, or the creation or deletion of a resource.|For more information, see the [pricing page](https://azure.microsoft.com/pricing/details/monitor/).|
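Because alert pricing scales with the number of monitored time series, a rough estimate of that count can be sketched as the product of the cardinalities of the dimensions a rule splits on (a hedged illustration, assuming each combination of dimension values yields one time series; `estimate_time_series` is a hypothetical helper, not an Azure Monitor API):

```python
from math import prod

def estimate_time_series(dimension_cardinalities: list[int]) -> int:
    """Estimate monitored time series as the product of per-dimension
    cardinalities; with no dimensions, the rule monitors one series."""
    return prod(dimension_cardinalities) if dimension_cardinalities else 1

# e.g. splitting a rule by 3 computers and 2 event levels would
# produce an estimate of 6 monitored time series.
```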

articles/azure-monitor/logs/logs-export-logic-app.md

Lines changed: 1 addition & 1 deletion
@@ -32,7 +32,7 @@ SecurityEvent
 | project TimeGenerated , Account , AccountType , Computer
 ```

-When you export the data on a schedule, use the ingestion_time() function in your query to ensure that you don’t miss late arriving data. If data is delayed due to network or platform issues, using the ingestion time ensures that data is included in the next Logic App execution. See *Add Azure Monitor Logs action* under [Logic App procedure]](#logic-app-procedure) for an example.
+When you export the data on a schedule, use the ingestion_time() function in your query to ensure that you don’t miss late arriving data. If data is delayed due to network or platform issues, using the ingestion time ensures that data is included in the next Logic App execution. See *Add Azure Monitor Logs action* under [Logic App procedure](#logic-app-procedure) for an example.
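Why filtering on ingestion time avoids missing late-arriving data can be sketched as follows (an illustrative model in Python rather than KQL; the record layout and timestamps are invented for the example):

```python
from datetime import datetime

# Each record carries an event time (TimeGenerated) and an ingestion time.
records = [
    # Late arrival: generated before the previous export window closed,
    # but only ingested during the current one.
    {"time_generated": datetime(2022, 6, 1, 9, 50),
     "ingestion_time": datetime(2022, 6, 1, 10, 10)},
]

def in_window(rows, key, start, end):
    """Select rows whose `key` timestamp falls in (start, end]."""
    return [r for r in rows if start < r[key] <= end]

window_start = datetime(2022, 6, 1, 10, 0)  # end of the previous run
window_end = datetime(2022, 6, 1, 11, 0)    # this run

# Filtering on event time drops the late record; ingestion time keeps it,
# so the next scheduled execution still exports it.
late_by_event = in_window(records, "time_generated", window_start, window_end)
late_by_ingest = in_window(records, "ingestion_time", window_start, window_end)
```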
 ## Prerequisites
 Following are prerequisites that must be completed before this procedure.

articles/azure-resource-manager/bicep/data-types.md

Lines changed: 1 addition & 1 deletion
@@ -167,7 +167,7 @@ var myVar = 'what\'s up?'

 All strings in Bicep support interpolation. To inject an expression, surround it by `${` and `}`. Expressions that are referenced can't span multiple lines.

 ```bicep
-var storageName = 'storage${uniqueString(resourceGroup().id)}
+var storageName = 'storage${uniqueString(resourceGroup().id)}'
 ```

 ## Multi-line strings

articles/azure-resource-manager/bicep/overview.md

Lines changed: 1 addition & 1 deletion
@@ -101,7 +101,7 @@ To decompile an existing ARM template to Bicep, see [Decompiling ARM template JS

 To learn about the resources that are available in your Bicep file, see [Bicep resource reference](/azure/templates/)

-Bicep examples can be found in the [Bicep GitHub repo](https://github.com/Azure/bicep/tree/main/docs/examples).
+Bicep examples can be found in the [Bicep GitHub repo](https://github.com/Azure/azure-quickstart-templates/tree/master/quickstarts)

 ## About the language

articles/azure-sql-edge/deploy-dacpac.md

Lines changed: 19 additions & 2 deletions
@@ -28,10 +28,27 @@ SQL Database dacpac and bacpac packages can be deployed to SQL Edge using the `M

 To deploy (or import) a SQL Database DAC package `(*.dacpac)` or a BACPAC file `(*.bacpac)` using Azure Blob storage and a zip file, follow the steps below.

-1. Create/Extract a DAC package or Export a Bacpac File using the mechanism mentioned below.
+1. Create/Extract a DAC package or Export a Bacpac File using one of the mechanism mentioned below.
+    - Use [SQL Database Project Extension - Azure Data Studio](/sql/azure-data-studio/extensions/sql-database-project-extension-getting-started) to [create a new database project or export an existing database](/sql/azure-data-studio/extensions/sql-database-project-extension-getting-started)
     - Create or extract a SQL Database DAC package. See [Extracting a DAC from a database](/sql/relational-databases/data-tier-applications/extract-a-dac-from-a-database/) for information on how to generate a DAC package for an existing SQL Server database.
     - Exporting a deployed DAC package or a database. See [Export a Data-tier Application](/sql/relational-databases/data-tier-applications/export-a-data-tier-application/) for information on how to generate a bacpac file for an existing SQL Server database.

+    > [!NOTE]
+    > If you are using external streaming jobs as part of the database, please ensure the following:
+    >
+    > - The generated dacpac will capture all the SQL Server Objects corresponding to the inputs/output streams and the streaming jobs. But the jobs will not be automatically started. In order to have the external streaming job automatically started after deployment, add a post-deployment script that restarts the jobs as follows:
+    >
+    > ```
+    > exec sys.sp_stop_streaming_job @name=N'<JOB NAME>';
+    > GO
+    > exec sys.sp_start_streaming_job @name=N'<JOB NAME>';
+    > GO
+    > ```
+    >
+    > - Ensure any credentials required by the external streaming jobs to access input or output streams are provided as part of the dacpac.
+
 2. Zip the `*.dacpac` or the `*.bacpac` file and upload it to an Azure Blob storage account. For more information on uploading files to Azure Blob storage, see [Upload, download, and list blobs with the Azure portal](../storage/blobs/storage-quickstart-blobs-portal.md).

 3. Generate a shared access signature for the zip file by using the Azure portal. For more information, see [Delegate access with shared access signatures (SAS)](../storage/common/storage-sas-overview.md).

@@ -68,4 +85,4 @@ During some DACPAC or BACPAC deployments users may encounter a command timeouts,

 - [Deploy SQL Edge through Azure portal](deploy-portal.md).
 - [Stream Data](stream-data.md)
-- [Machine learning and AI with ONNX in SQL Edge](onnx-overview.md)
+- [Machine learning and AI with ONNX in SQL Edge](onnx-overview.md)

articles/cognitive-services/language-service/conversational-language-understanding/how-to/create-project.md

Lines changed: 2 additions & 2 deletions
@@ -99,9 +99,9 @@ You can export a Conversational Language Understanding project as a JSON file at

 [!INCLUDE [Language Studio project details](../includes/language-studio/project-details.md)]

-### [Rest APIs](#tab/rest-api)
+### [REST APIs](#tab/rest-api)

-[!INCLUDE [Rest APIs project details](../includes/rest-api/project-details.md)]
+[!INCLUDE [REST APIs project details](../includes/rest-api/project-details.md)]

 ---

articles/cognitive-services/language-service/custom-named-entity-recognition/how-to/create-project.md

Lines changed: 6 additions & 6 deletions
@@ -67,9 +67,9 @@ Once your resource and storage container are configured, create a new custom NER

 [!INCLUDE [Language Studio project creation](../includes/language-studio/create-project.md)]

-### [Rest APIs](#tab/rest-api)
+### [REST APIs](#tab/rest-api)

-[!INCLUDE [Rest APIs project creation](../includes/rest-api/create-project.md)]
+[!INCLUDE [REST APIs project creation](../includes/rest-api/create-project.md)]

 ---

@@ -81,7 +81,7 @@ If you have already labeled data, you can use it to get started with the service

 [!INCLUDE [Import project](../includes/language-studio/import-project.md)]

-### [Rest APIs](#tab/rest-api)
+### [REST APIs](#tab/rest-api)

 [!INCLUDE [Import project](../includes/rest-api/import-project.md)]

@@ -93,9 +93,9 @@ If you have already labeled data, you can use it to get started with the service

 [!INCLUDE [Language Studio project details](../includes/language-studio/project-details.md)]

-### [Rest APIs](#tab/rest-api)
+### [REST APIs](#tab/rest-api)

-[!INCLUDE [Rest APIs project details](../includes/rest-api/project-details.md)]
+[!INCLUDE [REST APIs project details](../includes/rest-api/project-details.md)]

 ---

@@ -105,7 +105,7 @@ If you have already labeled data, you can use it to get started with the service

 [!INCLUDE [Delete project using Language studio](../includes/language-studio/delete-project.md)]

-### [Rest APIs](#tab/rest-api)
+### [REST APIs](#tab/rest-api)

 [!INCLUDE [Delete project using the REST API](../includes/rest-api/delete-project.md)]