Commit 123f315

craigcaseyMSFT committed
fix broken links from CATS report
1 parent 4fecc67 commit 123f315

10 files changed: 13 additions and 14 deletions


articles/active-directory/saas-apps/google-apps-provisioning-tutorial.md

Lines changed: 1 addition & 1 deletion

@@ -28,7 +28,7 @@ The objective of this tutorial is to demonstrate the steps to be performed in G
 > [!NOTE]
 > The G Suite connector was recently updated on October 2019. Changes made to the G Suite connector include:
 - Added support for additional G Suite user and group attributes.
-- Updated G Suite target attribute names to match what is defined [here]().
+- Updated G Suite target attribute names to match what is defined [here](/azure/active-directory/manage-apps/customize-application-attributes).
 - Updated default attribute mappings.

 ## Prerequisites

articles/azure-functions/durable/durable-functions-dotnet-entities.md

Lines changed: 1 addition & 1 deletion

@@ -116,7 +116,7 @@ For example, we can modify the counter entity so it starts an orchestration when
 Class-based entities can be accessed directly, using explicit string names for the entity and its operations. We provide some examples below; for a deeper explanation of the underlying concepts (such as signals vs. calls) see the discussion in [Accessing entities](durable-functions-entities.md#accessing-entities).

 > [!NOTE]
-> Where possible, we recommend [Accessing entities through interfaces](), because it provides more type checking.
+> Where possible, we recommend [Accessing entities through interfaces](#accessing-entities-through-interfaces), because it provides more type checking.

 ### Example: client signals entity

articles/azure-monitor/platform/alerts-overview.md

Lines changed: 2 additions & 2 deletions

@@ -180,9 +180,9 @@ The consumption and management of alert instances requires the user to have the
 You might want to query programmatically for alerts generated against your subscription. This might be to create custom views outside of the Azure portal, or to analyze your alerts to identify patterns and trends.

-You can query for alerts generated against your subscriptions either by using the [Alert Management REST API](https://aka.ms/alert-management-api) or by using the [Azure Resource Graph REST API for Alerts](https://docs.microsoft.com/rest/api/azureresourcegraph/resources/resources).
+You can query for alerts generated against your subscriptions either by using the [Alert Management REST API](https://aka.ms/alert-management-api) or by using the [Azure Resource Graph REST API for Alerts](https://docs.microsoft.com/rest/api/azureresourcegraph/resourcegraph(2018-09-01-preview)/resources/resources)).

-The [Azure Resource Graph REST API for Alerts](https://docs.microsoft.com/rest/api/azureresourcegraph/resources/resources) allows you to query for alert instances at scale. This is recommended when you have to manage alerts generated across many subscriptions.
+The [Azure Resource Graph REST API for Alerts](https://docs.microsoft.com/rest/api/azureresourcegraph/resourcegraph(2018-09-01-preview)/resources/resources)) allows you to query for alert instances at scale. This is recommended when you have to manage alerts generated across many subscriptions.

 The following sample request to the API returns the count of alerts within one subscription:

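The sample request itself falls outside this hunk. As a rough illustration of the kind of call the context refers to, here is a minimal Python sketch against the Azure Resource Graph endpoint, assuming a placeholder subscription ID, a bearer token obtained separately, and the 2018-09-01-preview API version referenced by the updated link:

```python
import requests

# Minimal sketch only. Assumptions: <subscription-id> and <access-token> are placeholders,
# and the body shape follows the Azure Resource Graph "Resources" POST operation.
URL = ("https://management.azure.com/providers/Microsoft.ResourceGraph/resources"
       "?api-version=2018-09-01-preview")

body = {
    "subscriptions": ["<subscription-id>"],
    # Count alert instances exposed through the alertsmanagementresources table.
    "query": (
        "alertsmanagementresources "
        "| where type =~ 'microsoft.alertsmanagement/alerts' "
        "| summarize count()"
    ),
}

resp = requests.post(URL, json=body, headers={"Authorization": "Bearer <access-token>"})
resp.raise_for_status()
print(resp.json())
```
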
articles/data-catalog/data-catalog-adopting-data-catalog.md

Lines changed: 1 addition & 1 deletion

@@ -120,7 +120,7 @@ To learn more about the data source registration tool, see [Get started with Azu
 As part of the pilot project, Nancy's team also uses data sources that are described in an Excel workbook that David and his colleagues maintain. Since other teams in the enterprise also use Excel workbooks to describe data sources, the IT team decides to create a tool to migrate the Excel workbook to Data Catalog. By using the Data Catalog REST API to import existing annotations, the pilot project team can have a complete data catalog consisting of metadata extracted from the data sources using the data source registration tool, complete with information previously documented by data producers and consumers, without the need for manual re-entry. As the enterprise data catalog grows, the organization can use the data source registration tool for common data sources, and the Data Catalog API for custom sources and uncommon scenarios.

 > [!NOTE]
-> We wrote a sample tool that uses the **Azure Data Catalog** API to migrate an Excel workbook to Data Catalog. To learn about the Data Catalog API and the sample tool, [download the Ad Hoc workbook code sample](https://azure.microsoft.com/documentation/samples/data-catalog-dotnet-excel-register-data-assets/), and check out the [Azure Data Catalog REST API](/rest/api/datacatalog/) documentation.
+> We wrote a sample tool that uses the **Azure Data Catalog** API to migrate an Excel workbook to Data Catalog. To learn about the Data Catalog API and the sample tool, [download the Ad Hoc workbook code sample](https://github.com/Azure-Samples/data-catalog-dotnet-excel-register-data-assets), and check out the [Azure Data Catalog REST API](/rest/api/datacatalog/) documentation.

 After the pilot project is in place, it's time to execute your Data Catalog adoption plan.

articles/digital-twins/security-authenticating-apis.md

Lines changed: 1 addition & 1 deletion

@@ -44,7 +44,7 @@ When developers architect Digital Twins solutions, they typically create a middl
 1. The acquired token is then used to authenticate with or call APIs that are further downstream using the On-Behalf-Of flow

-For instructions about how to orchestrate the on-behalf-of flow, see [OAuth 2.0 On-Behalf-Of flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-on-behalf-of-flow). You also can view code samples in [Calling a downstream web API](https://azure.microsoft.com/resources/samples/active-directory-dotnet-webapi-onbehalfof/).
+For instructions about how to orchestrate the on-behalf-of flow, see [OAuth 2.0 On-Behalf-Of flow](https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-on-behalf-of-flow). You also can view code samples in [Calling a downstream web API](https://github.com/Azure-Samples/active-directory-dotnet-webapi-onbehalfof).

 ## Next steps

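The linked article covers the protocol in detail; as a quick sketch of what the on-behalf-of token request looks like on the wire (assuming placeholder tenant, client ID, client secret, incoming user token, and downstream scope), a middle-tier API might do something like this in Python:

```python
import requests

# Sketch only. All angle-bracket values are placeholders supplied by your app registration.
TENANT = "<tenant-id>"
TOKEN_URL = f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token"

payload = {
    "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
    "client_id": "<middle-tier-client-id>",
    "client_secret": "<middle-tier-client-secret>",
    # The access token the middle tier received from the calling client.
    "assertion": "<incoming-access-token>",
    # Scope of the downstream API being called on the user's behalf.
    "scope": "<downstream-api-scope>",
    "requested_token_use": "on_behalf_of",
}

resp = requests.post(TOKEN_URL, data=payload)
resp.raise_for_status()
downstream_token = resp.json()["access_token"]
# downstream_token is then sent as a Bearer token when calling the downstream API.
```
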
articles/governance/resource-graph/concepts/work-with-data.md

Lines changed: 3 additions & 4 deletions

@@ -39,7 +39,7 @@ az graph query -q "Resources | project name | order by name asc" --first 200 --o
 Search-AzGraph -Query "Resources | project name | order by name asc" -First 200
 ```

-In the [REST API](/rest/api/azureresourcegraph/resources/resources), the control is **$top** and is
+In the [REST API](/rest/api/azureresourcegraph/resourcegraph(2018-09-01-preview)/resources/resources), the control is **$top** and is
 part of **QueryRequestOptions**.

 The control that is _most restrictive_ will win. For example, if your query uses the **top** or
@@ -73,14 +73,13 @@ az graph query -q "Resources | project name | order by name asc" --skip 10 --out
 Search-AzGraph -Query "Resources | project name | order by name asc" -Skip 10
 ```

-In the [REST API](/rest/api/azureresourcegraph/resources/resources), the control is **$skip** and is
+In the [REST API](/rest/api/azureresourcegraph/resourcegraph(2018-09-01-preview)/resources/resources), the control is **$skip** and is
 part of **QueryRequestOptions**.

 ## Paging results

 When it's necessary to break a result set into smaller sets of records for processing or because a
-result set would exceed the maximum allowed value of _1000_ returned records, use paging. The [REST
-API](/rest/api/azureresourcegraph/resources/resources) **QueryResponse** provides values to
+result set would exceed the maximum allowed value of _1000_ returned records, use paging. The [REST API](/rest/api/azureresourcegraph/resourcegraph(2018-09-01-preview)/resources/resources) **QueryResponse** provides values to
 indicate if a results set has been broken up: **resultTruncated** and **$skipToken**.
 **resultTruncated** is a boolean value that informs the consumer if there are additional records
 not returned in the response. This condition can also be identified when the **count** property is

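For reference, the **$top**, **$skip**, and **$skipToken** controls discussed in these hunks all travel in the `options` object of the request body. A minimal Python sketch of such a request, assuming a placeholder subscription and bearer token:

```python
import requests

# Sketch only. Placeholders for subscription and token; the option names ($top, $skip,
# $skipToken) are the QueryRequestOptions fields referenced in the article.
URL = ("https://management.azure.com/providers/Microsoft.ResourceGraph/resources"
       "?api-version=2018-09-01-preview")

body = {
    "subscriptions": ["<subscription-id>"],
    "query": "Resources | project name | order by name asc",
    "options": {
        "$top": 200,   # maximum number of records to return
        "$skip": 10,   # number of records to skip before returning results
        # "$skipToken": "<token from a previous response>",  # continue a paged query
    },
}

resp = requests.post(URL, json=body, headers={"Authorization": "Bearer <access-token>"})
resp.raise_for_status()
data = resp.json()
# resultTruncated and $skipToken in the response indicate whether more pages remain.
print(data.get("count"), data.get("resultTruncated"), data.get("$skipToken"))
```
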
articles/hdinsight/storm/apache-troubleshoot-storm.md

Lines changed: 1 addition & 1 deletion

@@ -133,7 +133,7 @@ For more information about using Storm event hub spout .jar files with your topo
 ### Java-based topology

-[Process events from Azure Event Hubs with Apache Storm on HDInsight (Java)](https://docs.microsoft.com/azure/hdinsight/hdinsight-storm-develop-java-event-hub-topology)
+[Process events from Azure Event Hubs with Apache Storm on HDInsight (Java)](https://github.com/Azure-Samples/hdinsight-java-storm-eventhub)

 ### C#-based topology (Mono on HDInsight 3.4+ Linux Storm clusters)

articles/kinect-dk/about-azure-kinect-dk.md

Lines changed: 1 addition & 1 deletion

@@ -11,7 +11,7 @@ keywords: azure, kinect, overview, dev kit, DK, device, depth, body tracking, sp
 # About Azure Kinect DK

-Azure Kinect DK is a developer kit with advanced AI sensors that provide sophisticated computer vision and speech models. Kinect contains a depth sensor, spatial microphone array with a video camera, and orientation sensor as an all in-one small device with multiple modes, options, and software development kits (SDKs). It is available for purchase in [Microsoft online store](https://www.microsoft.com/p/azure-kinect-dk/8pp5vxmd9nhq?activetab=pivot:overviewtab).
+Azure Kinect DK is a developer kit with advanced AI sensors that provide sophisticated computer vision and speech models. Kinect contains a depth sensor, spatial microphone array with a video camera, and orientation sensor as an all in-one small device with multiple modes, options, and software development kits (SDKs). It is available for purchase in [Microsoft online store](https://www.microsoft.com/p/azure-kinect-dk/8pp5vxmd9nhq).

 The Azure Kinect DK development environment consists of the following multiple SDKs:

articles/networking/networking-partners-msp.md

Lines changed: 1 addition & 1 deletion

@@ -34,7 +34,7 @@ Use the links in this section for more information about managed cloud networkin
 [Equinix](https://www.equinix.com/)

-[InterCloud](https://www.intercloud.com/partners/cloud-service-providers/get-azure-expressroute)
+[InterCloud](https://intercloud.com/partners/get-to-azure-via-expressroute/)

 [IIJ](https://www.iij.ad.jp/biz/cloudex/)

articles/security/fundamentals/encryption-overview.md

Lines changed: 1 addition & 1 deletion

@@ -54,7 +54,7 @@ The three server-side encryption models offer different key management character
 ### Azure disk encryption

-You can protect Windows and Linux virtual machines by using [Azure disk encryption](/azure/security/azure-security-disk-encryption), which uses [Windows BitLocker](https://technet.microsoft.com/library/cc766295(v=ws.10).aspx) technology and Linux [DM-Crypt](https://en.wikipedia.org/wiki/Dm-crypt) to protect both operating system disks and data disks with full volume encryption.
+You can protect Windows and Linux virtual machines by using [Azure disk encryption](/azure/security/fundamentals/azure-disk-encryption-vms-vmss), which uses [Windows BitLocker](https://technet.microsoft.com/library/cc766295(v=ws.10).aspx) technology and Linux [DM-Crypt](https://en.wikipedia.org/wiki/Dm-crypt) to protect both operating system disks and data disks with full volume encryption.

 Encryption keys and secrets are safeguarded in your [Azure Key Vault subscription](../../key-vault/key-vault-overview.md). By using the Azure Backup service, you can back up and restore encrypted virtual machines (VMs) that use Key Encryption Key (KEK) configuration.
