
Commit 9b9995a

Merge pull request #106348 from craigcaseyMSFT/vcraic0303a
fix broken links from CATS report
2 parents a5dea2d + cd1d92a commit 9b9995a

File tree

10 files changed (+11, -11 lines)


articles/azure-monitor/log-query/query-optimization.md

Lines changed: 1 addition & 1 deletion
@@ -59,7 +59,7 @@ Some of the query commands and functions are heavy in their CPU consumption. Thi
  These functions consume CPU in proportion to the number of rows they are processing. The most efficient optimization is to add where conditions early in the query that can filter out as many records as possible before the CPU intensive function is executed.

- For example, the following queries produce exactly the same result but the second one is by far the most efficient as the [where]() condition before parsing excludes many records:
+ For example, the following queries produce exactly the same result but the second one is by far the most efficient as the [where](/azure/kusto/query/whereoperator) condition before parsing excludes many records:

```Kusto
//less efficient
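// (illustrative continuation of the truncated example above -- the Event table,
//  columns, and parse pattern are assumptions, not content from this commit)
Event
| parse RenderedDescription with * "level=" Level ";" *
| where EventLog == "Application"

//more efficient: the where condition excludes records before the CPU-intensive parse runs
Event
| where EventLog == "Application"
| parse RenderedDescription with * "level=" Level ";" *
```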

articles/data-explorer/ingest-data-streaming.md

Lines changed: 2 additions & 2 deletions
@@ -32,7 +32,7 @@ Use the classic (bulk) ingestion instead of streaming ingestion when the amount
  ![streaming ingestion on](media/ingest-data-streaming/streaming-ingestion-on.png)

- 1. In the [Web UI](https://dataexplorer.azure.com/), define [streaming ingestion policy](/azure/kusto/concepts/streamingingestionpolicy) on table(s) or database(s) that will receive streaming data.
+ 1. In the [Web UI](https://dataexplorer.azure.com/), define [streaming ingestion policy](/azure/kusto/management/streamingingestionpolicy) on table(s) or database(s) that will receive streaming data.

  > [!NOTE]
  > * If the policy is defined at the database level, all tables in the database are enabled for streaming ingestion.

@@ -58,7 +58,7 @@ There are two supported streaming ingestion types:

  > [!WARNING]
  > Disabling streaming ingestion may take a few hours.

- 1. Drop [streaming ingestion policy](/azure/kusto/concepts/streamingingestionpolicy) from all relevant tables and databases. The streaming ingestion policy removal triggers streaming ingestion data movement from the initial storage to the permanent storage in the column store (extents or shards). The data movement can last between a few seconds to a few hours, depending on the amount of data in the initial storage and how the CPU and memory is used by the cluster.
+ 1. Drop [streaming ingestion policy](/azure/kusto/management/streamingingestionpolicy) from all relevant tables and databases. The streaming ingestion policy removal triggers streaming ingestion data movement from the initial storage to the permanent storage in the column store (extents or shards). The data movement can last between a few seconds to a few hours, depending on the amount of data in the initial storage and how the CPU and memory is used by the cluster.
  1. In the Azure portal, go to your Azure Data Explorer cluster. In **Settings**, select **Configurations**.
  1. In the **Configurations** pane, select **Off** to disable **Streaming ingestion**.
  1. Select **Save**.
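The define/drop policy steps in this hunk can be sketched with Kusto management commands (a sketch only; `MyTable` is a placeholder name, and the exact syntax should be verified against the streaming ingestion policy page the links now point to):

```Kusto
// Define a streaming ingestion policy on a table that will receive streaming data
.alter table MyTable policy streamingingestion enable

// Later, drop the policy from the table before disabling streaming ingestion on the cluster
.delete table MyTable policy streamingingestion
```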

articles/germany/germany-migration-integration.md

Lines changed: 1 addition & 1 deletion
@@ -93,7 +93,7 @@ Azure Logic Apps isn't available in Azure Germany, but you can create scheduling
  For more information:

  - Learn more by completing the [Azure Logic Apps tutorials](https://docs.microsoft.com/azure/logic-apps/tutorial-build-schedule-recurring-logic-app-workflow).
- - Review the [Azure Logic Apps overview](https://docs.microsoft.com/azure/logic-apps/logic-apps-overview.md).
+ - Review the [Azure Logic Apps overview](https://docs.microsoft.com/azure/logic-apps/logic-apps-overview).

  ## Next steps

articles/germany/germany-migration-management-tools.md

Lines changed: 1 addition & 1 deletion
@@ -59,7 +59,7 @@ Azure Scheduler is being retired. To create scheduling jobs, you can use [Azure
  For more information:

  - Learn more by completing the [Azure Logic Apps tutorials](https://docs.microsoft.com/azure/logic-apps/tutorial-build-schedule-recurring-logic-app-workflow).
- - Review the [Azure Logic Apps overview](https://docs.microsoft.com/azure/logic-apps/logic-apps-overview.md).
+ - Review the [Azure Logic Apps overview](https://docs.microsoft.com/azure/logic-apps/logic-apps-overview).

  ## Network Watcher

articles/machine-learning/concept-data-ingestion.md

Lines changed: 1 addition & 1 deletion
@@ -51,7 +51,7 @@ These steps and the following diagram illustrate Azure Data Factory's data inges
  ## Use the Python SDK

- With the [Python SDK](https://docs.microsoft.com/python/api/overview/azureml-sdk/?view=azure-ml-py), you can incorporate data ingestion tasks into an [Azure Machine Learning pipeline](how-to-create-your-first-pipeline.md) step.
+ With the [Python SDK](https://docs.microsoft.com/python/api/overview/azure/ml), you can incorporate data ingestion tasks into an [Azure Machine Learning pipeline](how-to-create-your-first-pipeline.md) step.

  The following table summarizes the pros and con for using the SDK and an ML pipelines step for data ingestion tasks.

articles/machine-learning/how-to-debug-pipelines.md

Lines changed: 1 addition & 1 deletion
@@ -280,7 +280,7 @@ if not (args.output_train is None):
  ### Configure ML pipeline

- To provide the Python packages needed to start PTVSD and get the run context, create an [environment]()
+ To provide the Python packages needed to start PTVSD and get the run context, create an environment
  and set `pip_packages=['ptvsd', 'azureml-sdk==1.0.83']`. Change the SDK version to match the one you are using. The following code snippet demonstrates how to create an environment:

  ```python

articles/sql-database/sql-database-copy.md

Lines changed: 1 addition & 1 deletion
@@ -147,7 +147,7 @@ You can use the steps in the [Copy a SQL database to a different server](#copy-a
  ### Monitor the progress of the copying operation

- Monitor the copying process by querying the [sys.databases](https://docs.microsoft.com/sql/relational-databases/system-catalog-views/sys-databases-transact-sql), [sys.dm_database_copies](https://docs.microsoft.com/sql/relational-databases/system-dynamic-management-views/sys-dm-database-copies-azure-sql-database.md), and [sys.dm_operation_status](https://docs.microsoft.com/sql/relational-databases/system-dynamic-management-views/sys-dm-operation-status-azure-sql-database.md) views. While the copying is in progress, the **state_desc** column of the sys.databases view for the new database is set to **COPYING**.
+ Monitor the copying process by querying the [sys.databases](https://docs.microsoft.com/sql/relational-databases/system-catalog-views/sys-databases-transact-sql), [sys.dm_database_copies](https://docs.microsoft.com/sql/relational-databases/system-dynamic-management-views/sys-dm-database-copies-azure-sql-database), and [sys.dm_operation_status](https://docs.microsoft.com/sql/relational-databases/system-dynamic-management-views/sys-dm-operation-status-azure-sql-database) views. While the copying is in progress, the **state_desc** column of the sys.databases view for the new database is set to **COPYING**.

  * If the copying fails, the **state_desc** column of the sys.databases view for the new database is set to **SUSPECT**. Execute the DROP statement on the new database, and try again later.
  * If the copying succeeds, the **state_desc** column of the sys.databases view for the new database is set to **ONLINE**. The copying is complete, and the new database is a regular database that can be changed independent of the source database.

articles/storage/blobs/storage-quickstart-blobs-nodejs-legacy.md

Lines changed: 1 addition & 1 deletion
@@ -405,4 +405,4 @@ This quickstart demonstrates how to manage blobs and containers in Azure Blob st
  > [!div class="nextstepaction"]
  > [Azure Storage v10 SDK for JavaScript repository](https://github.com/Azure/azure-storage-js)
- > [Azure Storage JavaScript API Reference](/javascript/api/overview/azure/storage?view=azure-node-legacy)
+ > [Azure Storage JavaScript API Reference](/javascript/api/overview/azure/storage-overview)

articles/storage/common/storage-introduction.md

Lines changed: 1 addition & 1 deletion
@@ -137,7 +137,7 @@ Azure Storage resources can be accessed by any language that can make HTTP/HTTPS
  - [Azure Storage REST API](https://docs.microsoft.com/rest/api/storageservices/)
  - [Azure Storage client library for .NET](https://docs.microsoft.com/dotnet/api/overview/azure/storage)
  - [Azure Storage client library for Java/Android](https://docs.microsoft.com/java/api/overview/azure/storage)
- - [Azure Storage client library for Node.js](https://docs.microsoft.com/javascript/api/overview/azure/storage)
+ - [Azure Storage client library for Node.js](https://docs.microsoft.com/javascript/api/overview/azure/storage-overview)
  - [Azure Storage client library for Python](https://github.com/Azure/azure-storage-python)
  - [Azure Storage client library for PHP](https://github.com/Azure/azure-storage-php)
  - [Azure Storage client library for Ruby](https://github.com/Azure/azure-storage-ruby)

includes/resource-manager-quickstart-introduction.md

Lines changed: 1 addition & 1 deletion
@@ -6,4 +6,4 @@ ms.date: 02/26/2020
  ms.date: 02/26/2020
  ms.author: jgao
  ---

- [Resource Manager template](/azure/azure-resource-manager/templates/overview.md) is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax, which lets you state what you intend to deploy without having to write the sequence of programming commands to create it. If you want to learn more about developing Resource Manager templates, see [Resource Manager documentation](/azure/azure-resource-manager/) and the [template reference](/azure/templates).
+ [Resource Manager template](/azure/azure-resource-manager/templates/overview) is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax, which lets you state what you intend to deploy without having to write the sequence of programming commands to create it. If you want to learn more about developing Resource Manager templates, see [Resource Manager documentation](/azure/azure-resource-manager/) and the [template reference](/azure/templates).
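As a reminder of the shape the paragraph above describes, a minimal empty Resource Manager template looks roughly like this (an illustrative config fragment, not part of this commit; the schema URL reflects a recent API version and may differ from the one a given quickstart uses):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {},
  "variables": {},
  "resources": [],
  "outputs": {}
}
```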
