
Commit 04a67d7

Merge branch 'master' of https://github.com/MicrosoftDocs/azure-docs-pr into unc

2 parents: 992d082 + 3b3bd39

File tree

6 files changed (+22, -18 lines)


articles/active-directory/develop/scenario-web-app-call-api-app-configuration.md

1 addition, 1 deletion

```diff
@@ -46,7 +46,7 @@ public void ConfigureServices(IServiceCollection services)
 {
     // more code here

-    services.AddSignIn(Configuration, "AzureAd");
+    services.AddSignIn(Configuration, "AzureAd")
         .AddWebAppCallsProtectedWebApi(Configuration,
             initialScopes: new string[] { "user.read" })
         .AddInMemoryTokenCaches();
```
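The one-character change in this diff (dropping the semicolon after `AddSignIn`) turns the call into the start of a fluent chain. A hedged sketch of the same builder pattern in Python, with hypothetical names standing in for the C# extension methods:

```python
# Hypothetical sketch of the fluent builder pattern the diff above enables:
# each method returns the builder itself, so further configuration chains on.
class AuthBuilder:
    def __init__(self):
        self.features = []

    def add_sign_in(self, config_section):
        self.features.append(("sign_in", config_section))
        return self  # returning self is what makes chaining possible

    def add_web_app_calls_protected_web_api(self, scopes):
        self.features.append(("calls_api", tuple(scopes)))
        return self

    def add_in_memory_token_caches(self):
        self.features.append(("token_cache", None))
        return self

builder = (AuthBuilder()
           .add_sign_in("AzureAd")
           .add_web_app_calls_protected_web_api(["user.read"])
           .add_in_memory_token_caches())
```

Ending the first call with a semicolon, as the old line did, would end the statement and make the chained calls a syntax error; that is what the commit corrects.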

articles/backup/azure-file-share-support-matrix.md

1 addition, 1 deletion

```diff
@@ -15,7 +15,7 @@ Backup for Azure file shares is available in the following GEOS:

 | GA regions | Supported regions but not GA |
 | --- | --- |
-| Australia South East (ASE), Canada Central (CNC), West Central US (WCUS), West US 2 (WUS 2), India South (INS), North Central US (NCUS), Japan East (JPE), Brazil South (BRS), South East Asia (SEA) | Australia East (AE), Canada East (CE), East Asia (EA), East US (EUS), East US 2 (EUS2), Japan West (JPW), India Central (INC), Korea Central (KRC), Korea South (KRS), North Europe (NE), South Central US (SCUS), UK South (UKS), UK West (UKW), West Europe (WE), West US (WUS), US Gov Arizona (UGA), US Gov Texas (UGT), US Gov Virginia (UGV), Australia Central (ACL), India West (INW), South Africa North (SAN), UAE North (UAN), France Central (FRC), Germany North (GN), Germany West Central (GWC), South Africa West (SAW), UAE Central (UAC), Norway East (NWE), Norway West (NWW), Switzerland North (SZN), Central US (CUS) |
+| Australia South East (ASE), Canada Central (CNC), West Central US (WCUS), West US 2 (WUS 2), India South (INS), North Central US (NCUS), Japan East (JPE), Brazil South (BRS), South East Asia (SEA), Switzerland West (SZW), UAE Central (UAC), Norway East (NWE), India West (INW), Australia Central (ACL) | Australia East (AE), Canada East (CE), East Asia (EA), East US (EUS), East US 2 (EUS2), Japan West (JPW), India Central (INC), Korea Central (KRC), Korea South (KRS), North Europe (NE), South Central US (SCUS), UK South (UKS), UK West (UKW), West Europe (WE), West US (WUS), US Gov Arizona (UGA), US Gov Texas (UGT), US Gov Virginia (UGV), South Africa North (SAN), UAE North (UAN), France Central (FRC), Germany North (GN), Germany West Central (GWC), South Africa West (SAW), Norway West (NWW), Switzerland North (SZN), Central US (CUS) |

 ## Supported storage accounts
```

articles/hdinsight/hdinsight-storage-sharedaccesssignature-permissions.md

5 additions, 5 deletions

```diff
@@ -6,8 +6,8 @@ ms.author: hrasheed
 ms.reviewer: jasonh
 ms.service: hdinsight
 ms.topic: conceptual
-ms.custom: hdinsightactive
-ms.date: 04/14/2020
+ms.custom: hdinsightactive,seoapr2020
+ms.date: 04/28/2020
 ---

 # Use Azure Storage Shared Access Signatures to restrict access to data in HDInsight
@@ -34,7 +34,7 @@ HDInsight has full access to data in the Azure Storage accounts associated with

 * If using C#, Visual Studio must be version 2013 or higher.

-* The [URI scheme](./hdinsight-hadoop-linux-information.md#URI-and-scheme) for your storage account. This scheme would be `wasb://` for Azure Storage, `abfs://` for Azure Data Lake Storage Gen2 or `adl://` for Azure Data Lake Storage Gen1. If secure transfer is enabled for Azure Storage, the URI would be `wasbs://`. See also, [secure transfer](../storage/common/storage-require-secure-transfer.md).
+* The URI scheme for your storage account. This scheme would be `wasb://` for Azure Storage, `abfs://` for Azure Data Lake Storage Gen2 or `adl://` for Azure Data Lake Storage Gen1. If secure transfer is enabled for Azure Storage, the URI would be `wasbs://`.

 * An existing HDInsight cluster to add a Shared Access Signature to. If not, you can use Azure PowerShell to create a cluster and add a Shared Access Signature during cluster creation.

@@ -430,5 +430,5 @@ Use the following steps to verify that you can only read and list items on the S

 Now that you've learned how to add limited-access storage to your HDInsight cluster, learn other ways to work with data on your cluster:

-* [Use Apache Hive with HDInsight](hadoop/hdinsight-use-hive.md)
-* [Use MapReduce with HDInsight](hadoop/hdinsight-use-mapreduce.md)
+* [Use SSH with HDInsight](hdinsight-hadoop-linux-use-ssh-unix.md)
+* [Authorize users for Apache Ambari Views](hdinsight-authorize-users-to-ambari.md)
```
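The URI-scheme prerequisite in this file can be illustrated with a short sketch. The helper name and sample values below are hypothetical, and the `container@account` layout shown applies to the `wasb://`/`wasbs://` schemes for Azure Storage (ADLS Gen1 `adl://` URIs use a different host format):

```python
# Hypothetical helper: building a wasbs:// storage URI for an HDInsight
# cluster from the scheme described in the prerequisites above.
def storage_uri(scheme, container, account, endpoint_suffix, path=""):
    # wasb/wasbs point at Azure Storage; use wasbs when secure transfer
    # is enabled on the storage account.
    return f"{scheme}://{container}@{account}.{endpoint_suffix}/{path}"

uri = storage_uri("wasbs", "mycontainer", "mystorageaccount",
                  "blob.core.windows.net", "example/data/")
```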

articles/iot-hub/quickstart-send-telemetry-node.md

13 additions, 9 deletions

````diff
@@ -79,19 +79,19 @@ A device must be registered with your IoT hub before it can connect. In this qui

    You'll use this value later in the quickstart.

-1. You also need a _service connection string_ to enable the back-end application to connect to your IoT hub and retrieve the messages. The following command retrieves the service connection string for your IoT hub:
+1. You also need the _Event Hubs-compatible endpoint_, _Event Hubs-compatible path_, and _service primary key_ from your IoT hub to enable the back-end application to connect to your IoT hub and retrieve the messages. The following commands retrieve these values for your IoT hub:

-    **YourIoTHubName**: Replace this placeholder below with the name you chose for your IoT hub.
+    **YourIoTHubName**: Replace this placeholder below with the name you choose for your IoT hub.

     ```azurecli-interactive
-    az iot hub show-connection-string --name {YourIoTHubName} --policy-name service --output table
-    ```
+    az iot hub show --query properties.eventHubEndpoints.events.endpoint --name {YourIoTHubName}

-    Make a note of the service connection string, which looks like:
+    az iot hub show --query properties.eventHubEndpoints.events.path --name {YourIoTHubName}

-    `HostName={YourIoTHubName}.azure-devices.net;SharedAccessKeyName=service;SharedAccessKey={YourSharedAccessKey}`
+    az iot hub policy show --name service --query primaryKey --hub-name {YourIoTHubName}
+    ```

-    You'll use this value later in the quickstart. This service connection string is different from the device connection string you noted in the previous step.
+    Make a note of these three values, which you'll use later in the quickstart.

 ## Send simulated telemetry

@@ -120,9 +120,13 @@ The back-end application connects to the service-side **Events** endpoint on you

 1. Open another local terminal window, navigate to the root folder of the sample Node.js project. Then navigate to the **iot-hub\Quickstarts\read-d2c-messages** folder.

-1. Open the **ReadDeviceToCloudMessages.js** file in a text editor of your choice.
+1. Open the **ReadDeviceToCloudMessages.js** file in a text editor of your choice. Update the following variables and save your changes to the file.

-    Replace the value of the `connectionString` variable with the service connection string you made a note of earlier. Then save your changes to **ReadDeviceToCloudMessages.js**.
+    | Variable | Value |
+    | -------- | ----------- |
+    | `eventHubsCompatibleEndpoint` | Replace the value of the variable with the Event Hubs-compatible endpoint you made a note of earlier. |
+    | `eventHubsCompatiblePath` | Replace the value of the variable with the Event Hubs-compatible path you made a note of earlier. |
+    | `iotHubSasKey` | Replace the value of the variable with the service primary key you made a note of earlier. |

 1. In the local terminal window, run the following commands to install the required libraries and run the back-end application:
````
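As a hedged sketch, the three values retrieved in this quickstart typically combine into an Event Hubs connection string for the back-end application. The function name is hypothetical and the variable names mirror the quickstart's table; the actual Node.js sample may assemble them differently:

```python
# Hedged sketch: one common way an Event Hubs-compatible endpoint, path, and
# service primary key combine into a connection string. Values below are
# placeholders, not real credentials.
def event_hubs_connection_string(endpoint, path, sas_key):
    return (f"Endpoint={endpoint};EntityPath={path};"
            f"SharedAccessKeyName=service;SharedAccessKey={sas_key}")

conn = event_hubs_connection_string(
    "sb://example.servicebus.windows.net/", "example-path", "EXAMPLE_KEY")
```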

articles/machine-learning/how-to-access-data.md

1 addition, 1 deletion

```diff
@@ -142,7 +142,7 @@ If your file share is in virtual network, set `skip_validation=True` using [`reg

 #### Azure Data Lake Storage Generation 2

-For an Azure Data Lake Storage Generation 2 (ADLS Gen 2) datastore, use [register_azure_data_lake_gen2()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.datastore.datastore?view=azure-ml-py#register-azure-data-lake-gen2-workspace--datastore-name--filesystem--account-name--tenant-id--client-id--client-secret--resource-url-none--authority-url-none--protocol-none--endpoint-none--overwrite-false-) to register a credential datastore connected to an Azure DataLake Gen 2 storage with [service principal permissions](https://docs.microsoft.com/azure/active-directory/develop/howto-create-service-principal-portal). In order to utilize your service principal you need to [register your application](https://docs.microsoft.com/azure/active-directory/develop/app-objects-and-service-principals) and grant the service principal with the right data access. Learn more about [access control set up for ADLS Gen 2](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-access-control).
+For an Azure Data Lake Storage Generation 2 (ADLS Gen 2) datastore, use [register_azure_data_lake_gen2()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.datastore.datastore?view=azure-ml-py#register-azure-data-lake-gen2-workspace--datastore-name--filesystem--account-name--tenant-id--client-id--client-secret--resource-url-none--authority-url-none--protocol-none--endpoint-none--overwrite-false-) to register a credential datastore connected to an Azure DataLake Gen 2 storage with [service principal permissions](https://docs.microsoft.com/azure/active-directory/develop/howto-create-service-principal-portal). In order to utilize your service principal you need to [register your application](https://docs.microsoft.com/azure/active-directory/develop/app-objects-and-service-principals) and grant the service principal with *Storage Blob Data Owner* access. Learn more about [access control set up for ADLS Gen 2](https://docs.microsoft.com/azure/storage/blobs/data-lake-storage-access-control).

 The following code creates and registers the `adlsgen2_datastore_name` datastore to the `ws` workspace. This datastore accesses the file system `test` on the `account_name` storage account, by using the provided service principal credentials.
```
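The registration call the article refers to can be sketched as follows. This is a hedged outline of the keyword arguments `register_azure_data_lake_gen2()` takes, per the signature in the linked reference; all values are placeholders, and the actual call requires `azureml-core` and a real workspace:

```python
# Hedged sketch: keyword arguments for register_azure_data_lake_gen2(),
# per the signature in the linked reference. All values are placeholders.
adlsgen2_args = dict(
    workspace=None,                      # your Workspace object, e.g. ws
    datastore_name="adlsgen2_datastore_name",
    filesystem="test",                   # the ADLS Gen 2 file system to access
    account_name="<storage-account-name>",
    tenant_id="<directory-id>",          # service principal tenant
    client_id="<application-id>",        # service principal client ID
    client_secret="<service-principal-secret>",
)

# With azureml-core installed:
# from azureml.core import Datastore
# datastore = Datastore.register_azure_data_lake_gen2(**adlsgen2_args)
```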

articles/machine-learning/reference-pipeline-yaml.md

1 addition, 1 deletion

````diff
@@ -362,7 +362,7 @@ pipeline:

 | YAML key | Description |
 | ----- | ----- |
-| `steps` | Sequence of one or more PipelineStep definitions. Note that the `destination` of one step's `outputs` become the keys to the `inputs` of the .|
+| `steps` | Sequence of one or more PipelineStep definitions. Note that the `destination` keys of one step's `outputs` become the `source` keys to the `inputs` of the next step.|

 ```yaml
 pipeline:
````
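The corrected `steps` description can be illustrated with a toy sketch in plain dictionaries (not the actual pipeline YAML schema): the name a step writes as an output `destination` is the name a later step reads as an input `source`.

```python
# Toy illustration (hypothetical step and port names, not the AzureML schema):
# the output `destination` of one step matches the input `source` of the next.
step1 = {"outputs": {"processed": {"destination": "intermediate_data"}}}
step2 = {"inputs": {"training": {"source": "intermediate_data"}}}

wired = (step1["outputs"]["processed"]["destination"]
         == step2["inputs"]["training"]["source"])
```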
