
Commit e220eed

Author: craigcaseyMSFT
Message: fix broken links from CATS report
1 parent e2f2f2f · commit e220eed

10 files changed: +12 −12 lines changed

articles/active-directory/develop/active-directory-authentication-libraries.md

Lines changed: 1 addition & 1 deletion
@@ -43,7 +43,7 @@ The Azure Active Directory Authentication Library (ADAL) v1.0 enables applicatio
 | JavaScript |ADAL.js |[GitHub](https://github.com/AzureAD/azure-activedirectory-library-for-js) |[GitHub](https://github.com/AzureAD/azure-activedirectory-library-for-js) |[Single-page app](https://github.com/Azure-Samples/active-directory-javascript-singlepageapp-dotnet-webapi) | |
 | iOS, macOS |ADAL |[GitHub](https://github.com/AzureAD/azure-activedirectory-library-for-objc/releases) |[GitHub](https://github.com/AzureAD/azure-activedirectory-library-for-objc) |[iOS app](https://docs.microsoft.com/azure/active-directory/active-directory-devquickstarts-ios) | [Reference](http://cocoadocs.org/docsets/ADAL/2.5.1/)|
 | Android |ADAL |[Maven](https://search.maven.org/search?q=g:com.microsoft.aad+AND+a:adal&core=gav) |[GitHub](https://github.com/AzureAD/azure-activedirectory-library-for-android) |[Android app](https://docs.microsoft.com/azure/active-directory/active-directory-devquickstarts-android) | [JavaDocs](https://javadoc.io/doc/com.microsoft.aad/adal/)|
-| Node.js |ADAL |[npm](https://www.npmjs.com/package/adal-node) |[GitHub](https://github.com/AzureAD/azure-activedirectory-library-for-nodejs) | [Node.js web app](https://github.com/Azure-Samples/active-directory-node-webapp-openidconnect)|[Reference](https://docs.microsoft.com/javascript/api/adal-node/?view=azure-node-latest) |
+| Node.js |ADAL |[npm](https://www.npmjs.com/package/adal-node) |[GitHub](https://github.com/AzureAD/azure-activedirectory-library-for-nodejs) | [Node.js web app](https://github.com/Azure-Samples/active-directory-node-webapp-openidconnect)|[Reference](https://docs.microsoft.com/javascript/api/overview/azure/activedirectory) |
 | Java |ADAL4J |[Maven](https://search.maven.org/#search%7Cga%7C1%7Ca%3Aadal4j%20g%3Acom.microsoft.azure) |[GitHub](https://github.com/AzureAD/azure-activedirectory-library-for-java) |[Java web app](https://github.com/Azure-Samples/active-directory-java-webapp-openidconnect) |[Reference](https://javadoc.io/doc/com.microsoft.azure/adal4j) |
 | Python |ADAL |[GitHub](https://github.com/AzureAD/azure-activedirectory-library-for-python) |[GitHub](https://github.com/AzureAD/azure-activedirectory-library-for-python) |[Python web app](https://github.com/Azure-Samples/active-directory-python-webapp-graphapi) |[Reference](https://adal-python.readthedocs.io/) |
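The rows above list ADAL token-acquisition libraries per platform. As a rough illustration of what these libraries do, here is a minimal sketch using the Python ADAL package; the tenant, client ID, secret, and resource values are placeholders, not values from this commit:

```python
# Minimal sketch of client-credentials token acquisition with the Python ADAL
# library (pip install adal). All identity values below are placeholders.

AUTHORITY_HOST = "https://login.microsoftonline.com"

def authority_url(tenant: str) -> str:
    """Build the Azure AD authority URL for a tenant."""
    return f"{AUTHORITY_HOST}/{tenant}"

def acquire_token(tenant: str, client_id: str, client_secret: str,
                  resource: str = "https://graph.microsoft.com") -> dict:
    """Acquire an app-only access token (requires network and real credentials)."""
    import adal  # imported lazily so the URL helper works without the package
    context = adal.AuthenticationContext(authority_url(tenant))
    return context.acquire_token_with_client_credentials(
        resource, client_id, client_secret)

print(authority_url("contoso.onmicrosoft.com"))
# -> https://login.microsoftonline.com/contoso.onmicrosoft.com
```

The `acquire_token` call is only a sketch of the library's client-credentials flow; consult the Reference links in the table for the authoritative API of each platform.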

articles/aks/kubernetes-action.md

Lines changed: 1 addition & 1 deletion
@@ -12,7 +12,7 @@ ms.author: atulmal
 
 # GitHub Actions for deploying to Kubernetes service
 
-[GitHub Actions](https://help.github.com/en/articles/about-github-actions) gives you the flexibility to build an automated software development lifecycle workflow. The Kubernetes action [azure/aks-set-context@v1]((https://github.com/Azure/aks-set-context)) facilitate deployments to Azure Kubernetes Service clusters. The action sets the target AKS cluster context, which could be used by other actions like [azure/k8s-deploy](https://github.com/Azure/k8s-deploy/tree/master), [azure/k8s-create-secret](https://github.com/Azure/k8s-create-secret/tree/master) etc. or run any kubectl commands.
+[GitHub Actions](https://help.github.com/en/articles/about-github-actions) gives you the flexibility to build an automated software development lifecycle workflow. The Kubernetes action [azure/aks-set-context@v1](https://github.com/Azure/aks-set-context) facilitate deployments to Azure Kubernetes Service clusters. The action sets the target AKS cluster context, which could be used by other actions like [azure/k8s-deploy](https://github.com/Azure/k8s-deploy/tree/master), [azure/k8s-create-secret](https://github.com/Azure/k8s-create-secret/tree/master) etc. or run any kubectl commands.
 
 > [!IMPORTANT]
 > GitHub Actions is currently in beta. You must first [sign-up to join the preview](https://github.com/features/actions) using your GitHub account.
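The corrected link target, `azure/aks-set-context@v1`, typically appears as an early step in a workflow, followed by kubectl or deploy actions. A minimal sketch might look like the following; the workflow name, resource group, and cluster name are placeholders, not part of this commit:

```yaml
# Hypothetical workflow sketch: set AKS context, then run kubectl.
# Resource group and cluster name below are placeholders.
name: deploy-to-aks
on: [push]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: azure/aks-set-context@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
          resource-group: my-resource-group
          cluster-name: my-aks-cluster
      - run: kubectl get pods
```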

articles/azure-databricks/databricks-connect-to-data-sources.md

Lines changed: 1 addition & 1 deletion
@@ -38,7 +38,7 @@ The following list provides the data sources in Azure that you can use with Azur
 
 This link provides instructions on how to use the [Azure Cosmos DB Spark connector](https://github.com/Azure/azure-cosmosdb-spark) from Azure Databricks to access data in Azure Cosmos DB.
 
-- [Azure Event Hubs](/azure/databricks/data/data-sources/azure/eventhubs-connector)
+- [Azure Event Hubs](/azure/event-hubs/event-hubs-spark-connector)
 
 This link provides instructions on how to use the [Azure Event Hubs Spark connector](https://github.com/Azure/azure-event-hubs-spark) from Azure Databricks to access data in Azure Event Hubs.
 

articles/azure-databricks/howto-regional-disaster-recovery.md

Lines changed: 1 addition & 1 deletion
@@ -280,7 +280,7 @@ To create your own regional disaster recovery topology, follow these requirement
 
 8. **Migrate Azure blob storage and Azure Data Lake Storage mounts**
 
-   Manually remount all [Azure Blob storage](/azure/databricks/data/data-sources/azure/azure-storage.html) and [Azure Data Lake Storage (Gen 2)](/azure/databricks/data/data-sources/azure/azure-datalake-gen2.html) mount points using a notebook-based solution. The storage resources would have been mounted in the primary workspace, and that has to be repeated in the secondary workspace. There is no external API for mounts.
+   Manually remount all [Azure Blob storage](/azure/databricks/data/data-sources/azure/azure-storage) and [Azure Data Lake Storage (Gen 2)](/azure/databricks/data/data-sources/azure/azure-datalake-gen2) mount points using a notebook-based solution. The storage resources would have been mounted in the primary workspace, and that has to be repeated in the secondary workspace. There is no external API for mounts.
 
 9. **Migrate cluster init scripts**
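Since there is no external API for mounts, the remount step in the hunk above is usually scripted in a notebook. A minimal sketch follows; the mount list is a placeholder, and the injected `mount_fn` stands in for `dbutils.fs.mount`, which is only available inside a Databricks notebook:

```python
# Sketch of a notebook-based remount loop for disaster recovery. In a real
# Databricks notebook the mount_fn argument would be dbutils.fs.mount; it is
# injected here so the loop can be exercised anywhere. Names are placeholders.

def remount_all(mounts, mount_fn, existing=()):
    """Remount every (source, mount_point) pair not already mounted.

    Returns the list of mount points that were (re)created.
    """
    created = []
    for source, mount_point in mounts:
        if mount_point in existing:
            continue  # already mounted in this workspace
        mount_fn(source=source, mount_point=mount_point)
        created.append(mount_point)
    return created

# Example with a stub mount function instead of dbutils.fs.mount:
calls = []
mounts = [
    ("wasbs://data@account.blob.core.windows.net", "/mnt/blob-data"),
    ("abfss://lake@account.dfs.core.windows.net", "/mnt/lake"),
]
created = remount_all(mounts, lambda **kw: calls.append(kw))
print(created)  # -> ['/mnt/blob-data', '/mnt/lake']
```

In the secondary workspace, `existing` could be seeded from `dbutils.fs.mounts()` so already-present mount points are skipped.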

articles/azure-databricks/quickstart-create-databricks-workspace-portal.md

Lines changed: 1 addition & 1 deletion
@@ -159,7 +159,7 @@ If you do not manually terminate the cluster it will automatically stop, provide
 
 ## Next steps
 
-In this article, you created a Spark cluster in Azure Databricks and ran a Spark job using data from Azure Open Datasets. You can also look at [Spark data sources](/azure/databricks/data/data-sources/index.html) to learn how to import data from other data sources into Azure Databricks. Advance to the next article to learn how to perform an ETL operation (extract, transform, and load data) using Azure Databricks.
+In this article, you created a Spark cluster in Azure Databricks and ran a Spark job using data from Azure Open Datasets. You can also look at [Spark data sources](/azure/databricks/data/data-sources/index) to learn how to import data from other data sources into Azure Databricks. Advance to the next article to learn how to perform an ETL operation (extract, transform, and load data) using Azure Databricks.
 
 > [!div class="nextstepaction"]
 >[Extract, transform, and load data using Azure Databricks](databricks-extract-load-sql-data-warehouse.md)

articles/azure-databricks/quickstart-create-databricks-workspace-resource-manager-template.md

Lines changed: 3 additions & 3 deletions
@@ -76,7 +76,7 @@ In this section, you create an Azure Databricks workspace using the Azure Resour
 
 Select **Create cluster**. Once the cluster is running, you can attach notebooks to the cluster and run Spark jobs.
 
-For more information on creating clusters, see [Create a Spark cluster in Azure Databricks](/azure/databricks/user-guide/clusters/create).
+For more information on creating clusters, see [Create a Spark cluster in Azure Databricks](/azure/databricks/clusters/create).
 
 ## Run a Spark SQL job
 
@@ -121,7 +121,7 @@ Perform the following tasks to create a notebook in Databricks, configure the no
 For instructions on how to retrieve the storage account key, see [Manage your storage access keys](../storage/common/storage-account-manage.md#access-keys).
 
 > [!NOTE]
-> You can also use Azure Data Lake Store with a Spark cluster on Azure Databricks. For instructions, see [Use Data Lake Store with Azure Databricks](/azure/databricks/data/data-sources/azure/azure-datalake-gen2.html).
+> You can also use Azure Data Lake Store with a Spark cluster on Azure Databricks. For instructions, see [Use Data Lake Store with Azure Databricks](/azure/databricks/data/data-sources/azure/azure-datalake-gen2).
 
 4. Run a SQL statement to create a temporary table using data from the sample JSON data file, **small_radio_json.json**. In the following snippet, replace the placeholder values with your container name and storage account name. Paste the snippet in a code cell in the notebook, and then press SHIFT + ENTER. In the snippet, `path` denotes the location of the sample JSON file that you uploaded to your Azure Storage account.
 
@@ -181,7 +181,7 @@ If you do not manually terminate the cluster it will automatically stop, provide
 
 ## Next steps
 
-In this article, you created a Spark cluster in Azure Databricks and ran a Spark job using data in Azure storage. You can also look at [Spark data sources](/azure/databricks/data/data-sources/index.html) to learn how to import data from other data sources into Azure Databricks. You can also look at the Resource Manager template to [Create an Azure Databricks workspace with custom VNET address](https://github.com/Azure/azure-quickstart-templates/tree/master/101-databricks-workspace-with-custom-vnet-address). For the JSON syntax and properties to use in a template, see [Microsoft.Databricks/workspaces](/azure/templates/microsoft.databricks/workspaces) template reference.
+In this article, you created a Spark cluster in Azure Databricks and ran a Spark job using data in Azure storage. You can also look at [Spark data sources](/azure/databricks/data/data-sources/index) to learn how to import data from other data sources into Azure Databricks. You can also look at the Resource Manager template to [Create an Azure Databricks workspace with custom VNET address](https://github.com/Azure/azure-quickstart-templates/tree/master/101-databricks-workspace-with-custom-vnet-address). For the JSON syntax and properties to use in a template, see [Microsoft.Databricks/workspaces](/azure/templates/microsoft.databricks/workspaces) template reference.
 
 Advance to the next article to learn how to perform an ETL operation (extract, transform, and load data) using Azure Databricks.
 

articles/batch/large-number-tasks.md

Lines changed: 1 addition & 1 deletion
@@ -33,7 +33,7 @@ The maximum size of the task collection that you can add in a single call depend
 
 * [REST API](/rest/api/batchservice/task/addcollection)
 * [Python API](/python/api/azure-batch/azure.batch.operations.TaskOperations?view=azure-python)
-* [Node.js API](/javascript/api/azure-batch/task?view=azure-node-latest)
+* [Node.js API](/javascript/api/@azure/batch/task?view=azure-node-latest)
 
 When using these APIs, you need to provide logic to divide the number of tasks to meet the collection limit, and to handle errors and retries in case addition of tasks fails. If a task collection is too large to add, the request generates an error and should be retried again with fewer tasks.
 
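The chunk-and-retry logic that the paragraph in the hunk above asks for can be sketched in plain Python. The `add_collection` callable stands in for the service call (for example, the SDK's task add-collection operation) and is an assumption of this sketch; it is assumed to raise `ValueError` when a chunk is rejected as too large:

```python
# Sketch of client-side chunking with retry-on-oversize for adding Batch tasks.
# `add_collection` is a stand-in for the real service call; a rejected chunk
# (here signaled by ValueError) is split in half and retried.

MAX_TASKS_PER_CALL = 100  # assumed per-request limit for this sketch

def add_in_chunks(tasks, add_collection, chunk_size=MAX_TASKS_PER_CALL):
    """Add `tasks` in chunks, halving any chunk the service rejects."""
    added = 0
    pending = [tasks[i:i + chunk_size] for i in range(0, len(tasks), chunk_size)]
    while pending:
        chunk = pending.pop()
        try:
            add_collection(chunk)
            added += len(chunk)
        except ValueError:
            if len(chunk) == 1:
                raise  # a single task failed; surface the error
            mid = len(chunk) // 2
            pending.extend([chunk[:mid], chunk[mid:]])
    return added

# Example with a fake service that rejects chunks larger than 40 tasks:
def fake_add(chunk):
    if len(chunk) > 40:
        raise ValueError("request body too large")

print(add_in_chunks(list(range(250)), fake_add))  # -> 250
```

A production version would also distinguish transient failures (retry the same chunk with backoff) from oversize rejections (split and retry), as the article suggests.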
articles/cognitive-services/Bing-Web-Search/web-search-sdk-node-quickstart.md

Lines changed: 1 addition & 1 deletion
@@ -106,4 +106,4 @@ When you're done with this project, make sure to remove your subscription key fr
 
 ## See also
 
-* [Azure Node SDK reference](https://docs.microsoft.com/javascript/api/azure-cognitiveservices-websearch/)
+* [Azure Node SDK reference](https://docs.microsoft.com/javascript/api/@azure/cognitiveservices-websearch/)

articles/cognitive-services/Content-Moderator/index.yml

Lines changed: 1 addition & 1 deletion
@@ -146,7 +146,7 @@ landingContent:
 - text: Java SDK
   url: https://docs.microsoft.com/java/api/overview/azure/cognitiveservices/client/contentmoderator?view=azure-java-stable
 - text: Node.js SDK
-  url: https://docs.microsoft.com/javascript/api/azure-cognitiveservices-contentmoderator/?view=azure-node-latest
+  url: https://docs.microsoft.com/javascript/api/@azure/cognitiveservices-contentmoderator
 - text: Go SDK
   url: https://godoc.org/github.com/Azure/azure-sdk-for-go/services/cognitiveservices/v1.0/contentmoderator
 - text: Azure PowerShell

articles/cognitive-services/Face/index.yml

Lines changed: 1 addition & 1 deletion
@@ -103,7 +103,7 @@ landingContent:
 - text: Java SDK
   url: https://docs.microsoft.com/java/api/overview/azure/cognitiveservices/client/faceapi?view=azure-java-stable
 - text: Node.js SDK
-  url: https://docs.microsoft.com/javascript/api/azure-cognitiveservices-face/?view=azure-node-latest
+  url: https://docs.microsoft.com/javascript/api/@azure/cognitiveservices-face
 - text: Go SDK
   url: https://godoc.org/github.com/Azure/azure-sdk-for-go/services/cognitiveservices/v1.0/face
 - text: iOS SDK
