Commit c162a88

author craigcaseyMSFT committed:
fix broken links from CATS report

1 parent 2da40a1, commit c162a88
10 files changed: +94 −94 lines

articles/active-directory/fundamentals/active-directory-ops-guide-auth.md

Lines changed: 24 additions & 24 deletions
Large diffs are not rendered by default.

articles/api-management/api-management-configuration-repository-git.md

Lines changed: 1 addition & 1 deletion
@@ -218,7 +218,7 @@ The final setting, `$ref-policy`, maps to the global policy statements file for
 The `apis` folder contains a folder for each API in the service instance, which contains the following items.
 
 * `apis\<api name>\configuration.json` - this is the configuration for the API and contains information about the backend service URL and the operations. This is the same information that would be returned if you were to call [Get a specific API](https://docs.microsoft.com/rest/api/apimanagement/2019-01-01/apis/get) with `export=true` in `application/json` format.
-* `apis\<api name>\api.description.html` - this is the description of the API and corresponds to the `description` property of the [API entity](https://docs.microsoft.com/java/api/com.microsoft.azure.storage.table._entity_property).
+* `apis\<api name>\api.description.html` - this is the description of the API and corresponds to the `description` property of the [API entity](https://docs.microsoft.com/java/api/com.microsoft.azure.storage.table.entityproperty).
 * `apis\<api name>\operations\` - this folder contains `<operation name>.description.html` files that map to the operations in the API. Each file contains the description of a single operation in the API, which maps to the `description` property of the [operation entity](https://docs.microsoft.com/rest/api/visualstudio/operations/list#operationproperties) in the REST API.
 
 ### groups folder
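The repository layout this hunk documents is easy to inspect programmatically. As a hedged illustration (the local clone path `apim-config` and the `serviceUrl` field are assumptions based on the exported API format), a Python sketch that walks the `apis` folder:

```python
import json
from pathlib import Path

# Assumed path of a locally cloned API Management configuration repository.
repo = Path("apim-config")

for api_dir in sorted((repo / "apis").iterdir()):
    config_file = api_dir / "configuration.json"
    if not config_file.is_file():
        continue

    config = json.loads(config_file.read_text(encoding="utf-8"))
    # "serviceUrl" is assumed to hold the backend service URL in the export.
    print(api_dir.name, "->", config.get("serviceUrl", "<no backend URL>"))

    description = api_dir / "api.description.html"
    if description.is_file():
        # First 80 characters of the API description.
        print("   description:", description.read_text(encoding="utf-8")[:80])
```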

articles/azure-databricks/databricks-connect-to-data-sources.md

Lines changed: 1 addition & 1 deletion
@@ -42,7 +42,7 @@ The following list provides the data sources in Azure that you can use with Azur
 
     This link provides instructions on how to use the [Azure Event Hubs Spark connector](https://github.com/Azure/azure-event-hubs-spark) from Azure Databricks to access data in Azure Event Hubs.
 
-- [Azure SQL Data Warehouse](/azure/databricks/data/data-sources/azure/sql-data-warehouse)
+- [Azure SQL Data Warehouse](/azure/synapse-analytics/sql-data-warehouse/)
 
     This link provides instructions on how to use the Azure SQL Data Warehouse connector to connect from Azure Databricks.
 
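As a companion to the connector link above, here is a minimal PySpark sketch of reading a table from Azure SQL Data Warehouse inside a Databricks notebook; the JDBC URL, staging directory, and table name are placeholders, so treat the block as an assumption-laden sketch rather than the article's own sample.

```python
# Runs in a Databricks notebook, where `spark` is predefined.
df = (
    spark.read.format("com.databricks.spark.sqldw")
    # Placeholder JDBC URL for the SQL Data Warehouse instance.
    .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>")
    # Blob storage staging area used by the connector for bulk transfer.
    .option("tempDir", "wasbs://<container>@<account>.blob.core.windows.net/tmp")
    .option("forwardSparkAzureStorageCredentials", "true")
    .option("dbTable", "dbo.MyTable")  # placeholder table name
    .load()
)
df.show(5)
```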

articles/azure-government/documentation-government-get-started-connect-to-storage.md

Lines changed: 59 additions & 59 deletions
@@ -18,7 +18,7 @@ ms.author: femila
 
 # Develop with Storage API on Azure Government
 
-Azure Government uses the same underlying technologies as commercial Azure, enabling you to use the development tools youre already familiar with.
+Azure Government uses the same underlying technologies as commercial Azure, enabling you to use the development tools you're already familiar with.
 To use these services in Azure Government, you must define different endpoint mappings, as shown below for the Storage service.
 
 If you don't have an Azure Government subscription, create a [free account](https://azure.microsoft.com/global-infrastructure/government/request/) before you begin.
@@ -35,7 +35,7 @@ If you don't have an Azure Government subscription, create a [free account](http
 ### Getting Started with Storage Explorer
 1. Open the Azure Storage Explorer desktop application.
 
-2. You'll be prompted to add an Azure account; in the dropdown choose the Azure US Government option:
+2. You'll be prompted to add an Azure account; in the dropdown choose the "Azure US Government" option:
 
    ![storage1](./media/documentation-government-get-started-connect-with-storage-img1.png)
 3. Sign in to your Azure Government account and you can see all of your resources. The Storage Explorer should look similar to the screenshot below. Click on your Storage Account to see the blob containers, file shares, Queues, and Tables.
@@ -52,10 +52,10 @@ If you don't have an Azure Government subscription, create a [free account](http
 * Download Visual Studio 2019
 
 ### Getting Started with Storage API
-One important difference to note when connecting with the Storage API is that the URL for storage is different than the URL for storage in commercial Azure – specifically, the domain ends with core.usgovcloudapi.net, rather than core.windows.net.
+One important difference to note when connecting with the Storage API is that the URL for storage is different than the URL for storage in commercial Azure – specifically, the domain ends with "core.usgovcloudapi.net", rather than "core.windows.net".
 
 These endpoint differences must be taken into account when you connect to storage in Azure Government with C#.
-1. Go to the [Azure Government portal](https://portal.azure.us) and select your storage account and then click the Access Keys tab:
+1. Go to the [Azure Government portal](https://portal.azure.us) and select your storage account and then click the "Access Keys" tab:
 
   ![storage4](./media/documentation-government-get-started-connect-with-storage-img4.png)
 2. Copy/paste the storage account connection string.
@@ -64,13 +64,13 @@ These endpoint differences must be taken into account when you connect to storag
 1. Open up Visual Studio and create a new project. Add a reference to the [WindowsAzure.Storage NuGet package](https://www.nuget.org/packages/WindowsAzure.Storage/). This NuGet package contains classes we will need to connect to your storage account.
 
 2. Add these two lines of C# code to connect:
-   ```cs
-   var credentials = new StorageCredentials(storageAccountName, storageAccountKey);
+   ```cs
+   var credentials = new StorageCredentials(storageAccountName, storageAccountKey);
 
    var storageAccount = new CloudStorageAccount(credentials, "core.usgovcloudapi.net", useHttps: true);
    ```
 
-   - Notice on the second line we had to use a [particular constructor for the CloudStorageAccount](https://docs.microsoft.com/java/api/com.microsoft.azure.storage._cloud_storage_account.cloudstorageaccount) – enabling us to explicitly pass in the endpoint suffix of core.usgovcloudapi.net. This constructor is the **only difference** your code requires to connect to storage in Azure Government as compared with commercial Azure.
+   - Notice on the second line we had to use a [particular constructor for the CloudStorageAccount](https://docs.microsoft.com/java/api/com.microsoft.azure.storage.cloudstorageaccount.cloudstorageaccount) – enabling us to explicitly pass in the endpoint suffix of "core.usgovcloudapi.net". This constructor is the **only difference** your code requires to connect to storage in Azure Government as compared with commercial Azure.
 
 3. At this point, we can interact with storage as we normally would. For example, if we want to retrieve a specific record from our table storage we could do it like this:
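The endpoint-suffix point in this hunk applies beyond C#: any SDK that accepts a connection string can target Azure Government by carrying the suffix explicitly. A minimal sketch, assuming placeholder credentials and the standard connection-string keys:

```python
# Assemble an Azure Government storage connection string by hand.
# "EndpointSuffix=core.usgovcloudapi.net" is the Gov-cloud difference;
# commercial Azure uses core.windows.net instead.
account_name = "mygovaccount"    # placeholder
account_key = "<account key>"    # placeholder

connection_string = (
    "DefaultEndpointsProtocol=https;"
    f"AccountName={account_name};"
    f"AccountKey={account_key};"
    "EndpointSuffix=core.usgovcloudapi.net"
)
print(connection_string)
```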

@@ -109,50 +109,50 @@ These endpoint differences must be taken into account when you connect to storag
            this.email = email;
        }
 
-   }
+   }
    ```
 3. Create a "test" class where we'll access Azure Table Storage using the Azure Storage API.
    Copy and paste the code below, and **paste** your Storage Account connection string into the storageConnectionString variable.
 
    ```java
    import com.microsoft.azure.storage.*;
    import com.microsoft.azure.storage.table.*;
-
+
    public class test {
 
-       public static final String storageConnectionString = //Paste in your Storage Account connection string
-
-       public static void main(String[] args) {
-
-           try
-           {
-               // Retrieve storage account from connection-string.
-               CloudStorageAccount storageAccount =
-                   CloudStorageAccount.parse(storageConnectionString);
-
-               // Create the table client.
-               CloudTableClient tableClient = storageAccount.createCloudTableClient();
-
-               // Create the table if it doesn't exist.
-               String tableName = "Contacts";
-               CloudTable cloudTable = tableClient.getTableReference(tableName);
-               cloudTable.createIfNotExists();
-               // Create a new customer entity.
-               CustomerEntity customer1 = new CustomerEntity("Brown", "Walter");
-               customer1.setEmail("[email protected]");
-
-               // Create an operation to add the new customer to the people table.
-               TableOperation insertCustomer1 = TableOperation.insertOrReplace(customer1);
-
-               // Submit the operation to the table service.
-               cloudTable.execute(insertCustomer1);
-           }
-           catch (Exception e)
-           {
-               // Output the stack trace.
-               e.printStackTrace();
-           }
-       }
+       public static final String storageConnectionString = //Paste in your Storage Account connection string
+
+       public static void main(String[] args) {
+
+           try
+           {
+               // Retrieve storage account from connection-string.
+               CloudStorageAccount storageAccount =
+                   CloudStorageAccount.parse(storageConnectionString);
+
+               // Create the table client.
+               CloudTableClient tableClient = storageAccount.createCloudTableClient();
+
+               // Create the table if it doesn't exist.
+               String tableName = "Contacts";
+               CloudTable cloudTable = tableClient.getTableReference(tableName);
+               cloudTable.createIfNotExists();
+               // Create a new customer entity.
+               CustomerEntity customer1 = new CustomerEntity("Brown", "Walter");
+               customer1.setEmail("[email protected]");
+
+               // Create an operation to add the new customer to the people table.
+               TableOperation insertCustomer1 = TableOperation.insertOrReplace(customer1);
+
+               // Submit the operation to the table service.
+               cloudTable.execute(insertCustomer1);
+           }
+           catch (Exception e)
+           {
+               // Output the stack trace.
+               e.printStackTrace();
+           }
+       }
    }
    ```
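For comparison with the Java sample in the hunk above, the same insert-or-replace flow looks like this in Python with the newer `azure-data-tables` package (a deliberate SDK swap, shown only as an illustrative sketch; the connection string is a placeholder and must carry the Gov-cloud `EndpointSuffix`):

```python
from azure.data.tables import TableServiceClient

# Placeholder: an Azure Government connection string ending in
# "EndpointSuffix=core.usgovcloudapi.net" (see the C# section above).
connection_string = "<your Azure Government storage connection string>"

service = TableServiceClient.from_connection_string(connection_string)
table = service.create_table_if_not_exists("Contacts")

# Mirrors the Java sample: upsert a customer keyed by last/first name.
table.upsert_entity({
    "PartitionKey": "Brown",
    "RowKey": "Walter",
    "Email": "walter@example.com",  # placeholder address
})
```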

@@ -161,29 +161,29 @@ These endpoint differences must be taken into account when you connect to storag
 2. The following code below connects to Azure Blob Storage and creates a Container using the Azure Storage API.
    **Paste** your Azure Storage account connection string into the storageConnectionString variable below.
 
-   ```javascript
-   var azure = require('azure-storage');
-   var storageConnectionString = //Paste Azure Storage connection string here
-   var blobSvc = azure.createBlobService(storageConnectionString);
-   blobSvc.createContainerIfNotExists('testing', function(error, result, response){
-       if(!error){
-           // Container exists and is private
-       }
-   });
+   ```javascript
+   var azure = require('azure-storage');
+   var storageConnectionString = //Paste Azure Storage connection string here
+   var blobSvc = azure.createBlobService(storageConnectionString);
+   blobSvc.createContainerIfNotExists('testing', function(error, result, response){
+       if(!error){
+           // Container exists and is private
+       }
+   });
    ```
 
 #### Python
 1. Download the [Azure Storage SDK for Python](https://github.com/Azure/azure-storage-python).
 2. When using the Storage SDK for Python to connect to Azure Government, you **must separately define an "endpoint_suffix" parameter**.
    **Paste** in your Azure storage account name and key in the placeholders below.
 
-   ```python
-   # Create the BlockBlockService that is used to call the Blob service for the storage account
-   block_blob_service = BlockBlobService(account_name='#your account name', account_key='#your account key', endpoint_suffix="core.usgovcloudapi.net")
-   container_name ='ml-gov-demo'
-   generator = block_blob_service.list_blobs(container_name)
-   for blob in generator:
-       print(blob.name)
+   ```python
+   # Create the BlockBlockService that is used to call the Blob service for the storage account
+   block_blob_service = BlockBlobService(account_name='#your account name', account_key='#your account key', endpoint_suffix="core.usgovcloudapi.net")
+   container_name ='ml-gov-demo'
+   generator = block_blob_service.list_blobs(container_name)
+   for blob in generator:
+       print(blob.name)
    ```
 
 #### PHP
@@ -199,7 +199,7 @@ These endpoint differences must be taken into account when you connect to storag
 > You can find these endpoints by navigating to your Storage Account from the [portal](https://portal.azure.us).
 > **Paste** in your storage account name, key, and service endpoint in the `connectionString` variable.
 >
-
+
 ```php
 <?php
 require_once "vendor/autoload.php";

articles/azure-government/documentation-government-services-storage.md

Lines changed: 1 addition & 1 deletion
@@ -68,7 +68,7 @@ These are the URLs for storage accounts in Azure Government:
 >
 >
 
-For more information on APIs, see the [Cloud Storage Account Constructor](https://docs.microsoft.com/java/api/com.microsoft.azure.storage._cloud_storage_account.cloudstorageaccount).
+For more information on APIs, see the [Cloud Storage Account Constructor](https://docs.microsoft.com/java/api/com.microsoft.azure.storage.cloudstorageaccount.cloudstorageaccount).
 
 The endpoint suffix to use in these overloads is *core.usgovcloudapi.net*.

articles/machine-learning/how-to-create-your-first-pipeline.md

Lines changed: 4 additions & 4 deletions
@@ -27,7 +27,7 @@ The ML pipelines you create are visible to the members of your Azure Machine Lea
 
 ML pipelines use remote compute targets for computation and the storage of the intermediate and final data associated with that pipeline. They can read and write data to and from supported [Azure Storage](https://docs.microsoft.com/azure/storage/) locations.
 
-If you dont have an Azure subscription, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://aka.ms/AMLFree).
+If you don't have an Azure subscription, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://aka.ms/AMLFree).
 
 ## Prerequisites

@@ -82,7 +82,7 @@ def_blob_store.upload_files(
     overwrite=True)
 ```
 
-A pipeline consists of one or more steps. A step is a unit run on a compute target. Steps might consume data sources and produce intermediate data. A step can create data such as a model, a directory with model and dependent files, or temporary data. This data is then available for other steps later in the pipeline.
+A pipeline consists of one or more steps. A step is a unit run on a compute target. Steps might consume data sources and produce "intermediate" data. A step can create data such as a model, a directory with model and dependent files, or temporary data. This data is then available for other steps later in the pipeline.
 
 To learn more about connecting your pipeline to your data, see the articles [How to Access Data](how-to-access-data.md) and [How to Register Datasets](how-to-create-register-datasets.md).
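Since the context in this hunk describes how steps compose into a pipeline, here is a minimal sketch of one step wired to a compute target; the workspace config, the "cpu-cluster" compute name, and `train.py` are assumptions for illustration, but the classes are the `azureml-pipeline` ones the article uses.

```python
from azureml.core import Workspace
from azureml.core.compute import ComputeTarget
from azureml.pipeline.core import Pipeline
from azureml.pipeline.steps import PythonScriptStep

ws = Workspace.from_config()  # assumes a local config.json
compute_target = ComputeTarget(workspace=ws, name="cpu-cluster")  # assumed name

# One step that runs a (hypothetical) training script on the remote target.
train_step = PythonScriptStep(
    name="train",
    script_name="train.py",
    source_directory=".",
    compute_target=compute_target,
)

pipeline = Pipeline(workspace=ws, steps=[train_step])
pipeline.validate()
```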

@@ -114,7 +114,7 @@ output_data1 = PipelineData(
 
 If you have tabular data stored in a file or set of files, a [TabularDataset](https://docs.microsoft.com/python/api/azureml-core/azureml.data.tabulardataset?view=azure-ml-py) is an efficient alternative to a `DataReference`. `TabularDataset` objects support versioning, diffs, and summary statistics. `TabularDataset`s are lazily evaluated (like Python generators) and it's efficient to subset them by splitting or filtering. The `FileDataset` class provides similar lazily-evaluated data representing one or more files.
 
-You create a `TabularDataset` using methods like [from_delimited_files](https://docs.microsoft.com/python/api/azureml-core/azureml.data.dataset_factory.tabulardatasetfactory?view=azure-ml-py#from-delimited-files-path--validate-true--include-path-false--infer-column-types-true--set-column-types-none--separator------header-true--partition-format-none-).
+You create a `TabularDataset` using methods like [from_delimited_files](https://docs.microsoft.com/python/api/azureml-core/azureml.data.dataset_factory.tabulardatasetfactory?view=azure-ml-py#from-delimited-files-path--validate-true--include-path-false--infer-column-types-true--set-column-types-none--separator------header-true--partition-format-none--support-multi-line-false-).
 
 ```python
 from azureml.data import TabularDataset
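To make the `from_delimited_files` reference concrete, a short sketch that builds a `TabularDataset` from CSV files on a datastore; the workspace config, the "workspaceblobstore" datastore name, and the `iris/*.csv` path are assumptions for illustration.

```python
from azureml.core import Dataset, Datastore, Workspace

ws = Workspace.from_config()  # assumes a local config.json
datastore = Datastore.get(ws, "workspaceblobstore")  # assumed datastore name

# Hypothetical CSV path on the datastore; adjust to your data.
tabular_ds = Dataset.Tabular.from_delimited_files(
    path=[(datastore, "iris/*.csv")],
    infer_column_types=True,
    header=True,
)
print(tabular_ds.take(3).to_pandas_dataframe())  # peek at the first rows
```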
@@ -385,7 +385,7 @@ When you first run a pipeline, Azure Machine Learning:
 * Downloads the Docker image for each step to the compute target from the container registry.
 * Mounts the datastore if a `DataReference` object is specified in a step. If mount is not supported, the data is instead copied to the compute target.
 * Runs the step in the compute target specified in the step definition.
-* Creates artifacts, such as logs, stdout and stderr, metrics, and output specified by the step. These artifacts are then uploaded and kept in the users default datastore.
+* Creates artifacts, such as logs, stdout and stderr, metrics, and output specified by the step. These artifacts are then uploaded and kept in the user's default datastore.
 
 ![Diagram of running an experiment as a pipeline](./media/how-to-create-your-first-pipeline/run_an_experiment_as_a_pipeline.png)

articles/site-recovery/azure-to-azure-common-questions.md

Lines changed: 1 addition & 1 deletion
@@ -88,7 +88,7 @@ By using Site Recovery, you can replicate and recover VMs between any two region
 
 ### Does Site Recovery require internet connectivity?
 
-No, Site Recovery doesn't require internet connectivity. But it does require access to Site Recovery URLs and IP ranges, as mentioned in [networking in Azure VM disaster recovery](https://docs.microsoft.com/azure/site-recovery/azure-to-azure-about-networking#outbound-connectivity-for-ip-address-ranges).
+No, Site Recovery doesn't require internet connectivity. But it does require access to Site Recovery URLs and IP ranges, as mentioned in [networking in Azure VM disaster recovery](https://docs.microsoft.com/azure/site-recovery/azure-to-azure-about-networking#outbound-connectivity-for-urls).
 
 ### Can I replicate an application that has a separate resource group for separate tiers?

articles/site-recovery/azure-to-azure-troubleshoot-replication.md

Lines changed: 1 addition & 1 deletion
@@ -76,7 +76,7 @@ We recommend creating a network service endpoint in your virtual network for "St
 
 ### Network connectivity
 
-For Site Recovery replication to work, it needs the VM to provide outbound connectivity to specific URLs or IP ranges. You might have your VM behind a firewall or use network security group (NSG) rules to control outbound connectivity. If so, you might experience issues. To make sure all the URLs are connected, see [Outbound connectivity for Site Recovery URLs](https://docs.microsoft.com/azure/site-recovery/azure-to-azure-about-networking#outbound-connectivity-for-ip-address-ranges).
+For Site Recovery replication to work, it needs the VM to provide outbound connectivity to specific URLs or IP ranges. You might have your VM behind a firewall or use network security group (NSG) rules to control outbound connectivity. If so, you might experience issues. To make sure all the URLs are connected, see [Outbound connectivity for Site Recovery URLs](https://docs.microsoft.com/azure/site-recovery/azure-to-azure-about-networking#outbound-connectivity-for-urls).
 
 ## Error ID 153006 - No app-consistent recovery point available for the VM in the past "X" minutes
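The fix in this hunk points at the URL list for outbound connectivity. A quick way to spot-check reachability from a VM is a small Python probe like the sketch below; the two endpoints are placeholders, so substitute the authoritative Site Recovery URLs from the linked networking article.

```python
import socket

# Placeholder endpoints; take the real list from the networking article.
endpoints = [
    ("login.microsoftonline.com", 443),
    ("management.azure.com", 443),
]

for host, port in endpoints:
    try:
        # A successful TCP connect suggests outbound HTTPS is allowed.
        with socket.create_connection((host, port), timeout=5):
            print(f"OK   {host}:{port}")
    except OSError as exc:
        print(f"FAIL {host}:{port} ({exc})")
```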

articles/virtual-machines/extensions/dsc-template.md

Lines changed: 1 addition & 1 deletion
@@ -22,7 +22,7 @@ as a String) and **RegistrationKey** (provided as a
 [PSCredential](/dotnet/api/system.management.automation.pscredential)) to onboard with Azure
 Automation. For details about obtaining those values, see [Onboarding machines for management by
 Azure Automation State Configuration - Secure
-registration](/azure/automation/automation-dsc-onboarding#secure-registration).
+registration](/azure/automation/automation-dsc-onboarding#onboarding-securely-using-registration).
 
 > [!NOTE]
 > You might encounter slightly different schema examples. The change in schema occurred in the October 2016 release. For details, see [Update from a previous format](#update-from-a-previous-format).
