Commit 20eb80d

Merge pull request #297506 from MicrosoftDocs/repo_sync_working_branch
Confirm merge from repo_sync_working_branch to main to sync with https://github.com/MicrosoftDocs/azure-docs (branch main)
2 parents 2a5e609 + 616abe1 commit 20eb80d

File tree: 9 files changed, +11 / -11 lines changed


articles/active-directory-b2c/custom-policies-series-call-rest-api.md

Lines changed: 2 additions & 2 deletions
@@ -93,7 +93,7 @@ You need to deploy an app, which serves as your external app. Your custom policy
     "requestId": "requestId",
     "userMessage" : "The access code you entered is incorrect. Please try again.",
     "developerMessage" : `The provided code ${req.body.accessCode} does not match the expected code for user.`,
-    "moreInfo" :"https://docs.microsoft.com/en-us/azure/active-directory-b2c/string-transformations"
+    "moreInfo" :"https://learn.microsoft.com/en-us/azure/active-directory-b2c/string-transformations"
     };
     res.status(409).send(errorResponse);
 }
@@ -138,7 +138,7 @@ You need to deploy an app, which serves as your external app. Your custom policy
     "requestId": "requestId",
     "userMessage": "The access code you entered is incorrect. Please try again.",
     "developerMessage": "The provided code 54321 does not match the expected code for user.",
-    "moreInfo": "https://docs.microsoft.com/en-us/azure/active-directory-b2c/string-transformations"
+    "moreInfo": "https://learn.microsoft.com/en-us/azure/active-directory-b2c/string-transformations"
 }
 ```
 Your REST service can return HTTP 4xx status code, but the value of `status` in the JSON response must be `409`.
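The `status` rule above can be sketched with a minimal Node.js helper. This is an illustrative sketch only, based on the response shape shown in this diff: the field values are placeholders, and `buildErrorResponse` is a hypothetical function name, not part of the documented API.

```javascript
// Minimal sketch of the conflict response described above: the HTTP status
// can be any 4xx code, but the `status` field inside the JSON body must be 409.
// Field names follow the snippet in this diff; values are placeholders.
function buildErrorResponse(accessCode) {
  return {
    status: 409,
    requestId: "requestId",
    userMessage: "The access code you entered is incorrect. Please try again.",
    developerMessage: `The provided code ${accessCode} does not match the expected code for user.`,
    moreInfo: "https://learn.microsoft.com/en-us/azure/active-directory-b2c/string-transformations"
  };
}

// In an Express-style handler you might send it as:
//   res.status(409).send(buildErrorResponse(req.body.accessCode));
console.log(buildErrorResponse("54321").status); // 409
```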

articles/automation/python-3-packages.md

Lines changed: 1 addition & 1 deletion
@@ -174,7 +174,7 @@ def import_package_with_dependencies (packagename):
     pkgname = get_packagename_from_filename(file)
     download_uri_for_file = resolve_download_url(pkgname, file)
     send_webservice_import_module_request(pkgname, download_uri_for_file)
-    # Sleep a few seconds so we don't send too many import requests https://docs.microsoft.com/en-us/azure/azure-subscription-service-limits#azure-automation-limits
+    # Sleep a few seconds so we don't send too many import requests https://learn.microsoft.com/en-us/azure/azure-resource-manager/management/azure-subscription-service-limits#azure-automation-limits
     time.sleep(10)

 if __name__ == '__main__':

articles/azure-resource-manager/bicep/scenarios-rbac.md

Lines changed: 1 addition & 1 deletion
@@ -93,7 +93,7 @@ When you create the role assignment resource, you need to specify a fully qualif
 ```bicep
 param principalId string

-@description('This is the built-in Contributor role. See https://docs.microsoft.com/azure/role-based-access-control/built-in-roles#contributor')
+@description('This is the built-in Contributor role. See https://learn.microsoft.com/azure/role-based-access-control/built-in-roles#contributor')
 resource contributorRoleDefinition 'Microsoft.Authorization/roleDefinitions@2022-04-01' existing = {
   scope: subscription()
   name: 'b24988ac-6180-42a0-ab88-20f7382dd24c'

articles/backup/quick-kubernetes-backup-cli.md

Lines changed: 1 addition & 1 deletion
@@ -53,7 +53,7 @@ az dataprotection backup-policy retention-rule set --lifecycles ./retentionrule.
 Once the policy JSON has all the required values, proceed to create a new policy from the policy object.

 ```azurecli
-az dataprotection backup-policy create -g testBkpVaultRG --vault-name TestBkpVault -n mypolicy --policy policy.json
+az dataprotection backup-policy create -g testBkpVaultRG --vault-name TestBkpVault -n mypolicy --policy akspolicy.json
 ```

 ## Prepare AKS cluster for backup

articles/cdn/cdn-manage-powershell.md

Lines changed: 1 addition & 1 deletion
@@ -90,7 +90,7 @@ DESCRIPTION
     Gets an Azure CDN profile and all related information.

 RELATED LINKS
-    https://docs.microsoft.com/powershell/module/az.cdn/get-azcdnprofile
+    https://learn.microsoft.com/powershell/module/az.cdn/get-azcdnprofile

 REMARKS
     To see the examples, type: "get-help Get-AzCdnProfile -examples".

articles/cost-management-billing/costs/quick-acm-cost-analysis.md

Lines changed: 1 addition & 1 deletion
@@ -105,7 +105,7 @@ After you customize your view to meet your needs, you might want to save and sha

 1. Save the view on a subscription, resource group, management group, or billing account.
 2. Share a URL with view configuration details, which they can use on any scope they have access to.
-3. Ping the view to an Azure portal dashboard. Pinning requires access to the same scope.
+3. Pin the view to an Azure portal dashboard. Pinning requires access to the same scope.
 4. Download an image of the chart or summarized cost details in an Excel or CSV file.
 5. Subscribe to scheduled alerts on a daily, weekly, or monthly basis.

articles/event-grid/auth0-log-stream-app-insights.md

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ This article shows how to send Auth0 events received by Azure Event Grid to Azur
 // Event Grid always sends an array of data and may send more
 // than one event in the array. The runtime invokes this function
 // once for each array element, so we are always dealing with one.
-// See: https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-grid-trigger?tabs=
+// See: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-grid-trigger?tabs=
 module.exports = async function (context, eventGridEvent) {
     context.log(typeof eventGridEvent);
     context.log(eventGridEvent);

articles/event-grid/auth0-log-stream-blob-storage.md

Lines changed: 1 addition & 1 deletion
@@ -48,7 +48,7 @@ This article shows you how to send Auth0 events to Azure Blob Storage via Azure
 // Event Grid always sends an array of data and may send more
 // than one event in the array. The runtime invokes this function
 // once for each array element, so we are always dealing with one.
-// See: https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-grid-trigger?tabs=
+// See: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-event-grid-trigger?tabs=
 module.exports = async function (context, eventGridEvent) {
     context.log(JSON.stringify(context.bindings));
     context.log(JSON.stringify(context.bindingData));

articles/synapse-analytics/spark/synapse-spark-sql-pool-import-export.md

Lines changed: 2 additions & 2 deletions
@@ -531,7 +531,7 @@ dfToReadFromQueryAsOption = (spark.read
     # Set user's password to the database
     .option(Constants.PASSWORD, "<user_password>")
     # Set name of the data source definition that is defined with database scoped credentials.
-    # https://docs.microsoft.com/sql/t-sql/statements/create-external-data-source-transact-sql?view=sql-server-ver15&tabs=dedicated#h-create-external-data-source-to-access-data-in-azure-storage-using-the-abfs-interface
+    # https://learn.microsoft.com/sql/t-sql/statements/create-external-data-source-transact-sql?view=sql-server-ver15&tabs=dedicated#h-create-external-data-source-to-access-data-in-azure-storage-using-the-abfs-interface
     # Data extracted from the SQL query will be staged to the storage path defined on the data source's location setting.
     .option(Constants.DATA_SOURCE, "<data_source_name>")
     # Query from where data will be read.
@@ -551,7 +551,7 @@ dfToReadFromQueryAsArgument = (spark.read
     # Set user's password to the database
     .option(Constants.PASSWORD, "<user_password>")
     # Set name of the data source definition that is defined with database scoped credentials.
-    # https://docs.microsoft.com/sql/t-sql/statements/create-external-data-source-transact-sql?view=sql-server-ver15&tabs=dedicated#h-create-external-data-source-to-access-data-in-azure-storage-using-the-abfs-interface
+    # https://learn.microsoft.com/sql/t-sql/statements/create-external-data-source-transact-sql?view=sql-server-ver15&tabs=dedicated#h-create-external-data-source-to-access-data-in-azure-storage-using-the-abfs-interface
     # Data extracted from the SQL query will be staged to the storage path defined on the data source's location setting.
     .option(Constants.DATA_SOURCE, "<data_source_name>")
     .synapsesql("select <column_name>, count(*) as counts from <schema_name>.<table_name> group by <column_name>")
