
Commit 856a3ef

Fixing some validation issues

1 parent 7cbf55e commit 856a3ef

File tree

3 files changed: +21 -24 lines changed


articles/storage/blobs/data-lake-storage-events.md

Lines changed: 3 additions & 3 deletions
````diff
@@ -51,7 +51,7 @@ First, create a csv file that describes a sales order, and then upload that file
 2. Select **Storage browser**->**Blob containers**->**Add container** and create a new container named **data**.
 
 > [!div class="mx-imgBorder"]
-> ![Screenshot of creating a folder in storage browser](./media/data-lake-storage-events/data-container.png)
+> ![Screenshot of creating a folder in storage browser](./media/data-lake-storage-events/data-container.png).
 
 
 3. In the **data** container, create a directory named **input**.
````

````diff
@@ -253,9 +253,9 @@ Create an Azure Function that runs the Job.
 
 11. Choose **Azure Event Grid Trigger**.
 
-Install the **Microsoft.Azure.WebJobs.Extensions.EventGrid** extension if you're prompted to do so. If you have to install it, then you'll have to choose **Azure Event Grid Trigger** again to create the function.
+Install the **Microsoft.Azure.WebJobs.Extensions.EventGrid** extension if you're prompted to do so. If you have to install it, then you'll have to choose **Azure Event Grid Trigger** again to create the function.
 
-The **New Function** pane appears.
+The **New Function** pane appears.
 
 12. In the **New Function** pane, name the function **UpsertOrder**, and then select the **Create** button.
````
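The Event Grid trigger created in the steps above hands the function a "Blob Created" event. A minimal sketch of parsing such a payload, where every value is hypothetical, abbreviated sample data rather than a real event:

```python
import json

# Hypothetical, abbreviated "Blob Created" event of the kind an Event Grid
# trigger delivers; the account, container, and blob names are made up.
raw = """{
  "eventType": "Microsoft.Storage.BlobCreated",
  "subject": "/blobServices/default/containers/data/blobs/input/data.csv",
  "data": {"url": "https://contosostorage.blob.core.windows.net/data/input/data.csv"}
}"""

event = json.loads(raw)
# The blob path is everything after the "/blobs/" segment of the subject.
_, _, blob_path = event["subject"].partition("/blobs/")
print(event["eventType"], blob_path)
```

The function body would branch on `eventType` and `blob_path` to decide how to process the uploaded file.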

articles/storage/blobs/data-lake-storage-tutorial-extract-transform-load-hive.md

Lines changed: 8 additions & 8 deletions
````diff
@@ -38,7 +38,7 @@ If you don't have an Azure subscription, [create a free account](https://azure.m
 
 - Azure SQL Database
 
-You'll use Azure SQL Database as a destination data store. If you don't have a database in SQL Database, see [Create a database in Azure SQL Database in the Azure portal](/azure/azure-sql/database/single-database-create-quickstart).
+You use Azure SQL Database as a destination data store. If you don't have a database in SQL Database, see [Create a database in Azure SQL Database in the Azure portal](/azure/azure-sql/database/single-database-create-quickstart).
 
 - Azure CLI
 
````
````diff
@@ -50,7 +50,7 @@ If you don't have an Azure subscription, [create a free account](https://azure.m
 
 ## Download, extract and then upload the data
 
-In this section, you'll download sample flight data. Then, you'll upload that data to your HDInsight cluster and then copy that data to your Data Lake Storage Gen2 account.
+In this section, you download sample flight data. Then, you upload that data to your HDInsight cluster and then copy that data to your Data Lake Storage Gen2 account.
 
 1. Download the [On_Time_Reporting_Carrier_On_Time_Performance_1987_present_2016_1.zip](https://github.com/Azure-Samples/AzureStorageSnippets/blob/master/blobs/tutorials/On_Time_Reporting_Carrier_On_Time_Performance_1987_present_2016_1.zip) file. This file contains the flight data.
 
````

````diff
@@ -59,10 +59,10 @@ In this section, you'll download sample flight data. Then, you'll upload that da
 ```bash
 scp On_Time_Reporting_Carrier_On_Time_Performance_1987_present_2016_1.zip <ssh-user-name>@<cluster-name>-ssh.azurehdinsight.net:
 ```
-- Replace the `<ssh-user-name>` placeholder with the SSH login for the HDInsight cluster.
+- Replace the `<ssh-user-name>` placeholder with the SSH username for the HDInsight cluster.
 - Replace the `<cluster-name>` placeholder with the name of the HDInsight cluster.
 
-If you use a password to authenticate your SSH login, you're prompted for the password.
+If you use a password to authenticate your SSH username, you're prompted for the password.
 
 If you use a public key, you might need to use the `-i` parameter and specify the path to the matching private key. For example, `scp -i ~/.ssh/id_rsa <file_name>.zip <user-name>@<cluster-name>-ssh.azurehdinsight.net:`.
 
````
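The placeholder substitution the hunk above describes amounts to simple string assembly. A sketch, where the user and cluster names are hypothetical example values and only the host-name pattern comes from the tutorial:

```python
# Sketch: assembling the scp command from the tutorial's placeholders.
# "sshuser" and "mycluster" are hypothetical; substitute your own values.
ssh_user = "sshuser"
cluster = "mycluster"
zip_file = "On_Time_Reporting_Carrier_On_Time_Performance_1987_present_2016_1.zip"

# The head-node host is always <cluster-name>-ssh.azurehdinsight.net.
cmd = f"scp {zip_file} {ssh_user}@{cluster}-ssh.azurehdinsight.net:"
print(cmd)
```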

````diff
@@ -116,7 +116,7 @@ As part of the Apache Hive job, you import the data from the .csv file into an A
 nano flightdelays.hql
 ```
 
-2. Modify the following text by replace the `<container-name>` and `<storage-account-name>` placeholders with your container and storage account name. Then copy and paste the text into the nano console by using pressing the SHIFT key along with the right-mouse click button.
+2. Modify the following text by replacing the `<container-name>` and `<storage-account-name>` placeholders with your container and storage account name. Then copy and paste the text into the nano console by using pressing the SHIFT key along with the right-mouse select button.
 
 ```hiveql
 DROP TABLE delays_raw;
````
````diff
@@ -180,7 +180,7 @@ As part of the Apache Hive job, you import the data from the .csv file into an A
 FROM delays_raw;
 ```
 
-3. Save the file by using use CTRL+X and then type `Y` when prompted.
+3. Save the file by typing CTRL+X and then typing `Y` when prompted.
 
 4. To start Hive and run the `flightdelays.hql` file, use the following command:
 
````
````diff
@@ -240,11 +240,11 @@ You need the server name from SQL Database for this operation. Complete these st
 
 - Replace the `<server-name>` placeholder with the logical SQL server name.
 
-- Replace the `<admin-login>` placeholder with the admin login for SQL Database.
+- Replace the `<admin-login>` placeholder with the admin username for SQL Database.
 
 - Replace the `<database-name>` placeholder with the database name
 
-When you're prompted, enter the password for the SQL Database admin login.
+When you're prompted, enter the password for the SQL Database admin username.
 
 You receive output similar to the following text:
 
````
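The `<container-name>`/`<storage-account-name>` edit requested earlier in this file follows the same replace-the-placeholder pattern as the hunk above. A sketch, where the ABFS location line and both names are hypothetical examples rather than the tutorial's exact script:

```python
# Sketch of the placeholder substitution used throughout this tutorial;
# the LOCATION fragment and the names "data"/"contosostorage" are examples.
template = ("LOCATION 'abfs://<container-name>@<storage-account-name>"
            ".dfs.core.windows.net/tutorials/flightdelays/data'")
location = (template
            .replace("<container-name>", "data")
            .replace("<storage-account-name>", "contosostorage"))
print(location)
```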

articles/storage/blobs/data-lake-storage-use-sql.md

Lines changed: 10 additions & 13 deletions
````diff
@@ -59,17 +59,15 @@ This tutorial uses flight data from the Bureau of Transportation Statistics. You
 
 ## Create an Azure Synapse workspace
 
-1. [Create a Synapse workspace in the Azure portal](../../synapse-analytics/get-started-create-workspace.md#create-a-synapse-workspace-in-the-azure-portal).
+[Create a Synapse workspace in the Azure portal](../../synapse-analytics/get-started-create-workspace.md#create-a-synapse-workspace-in-the-azure-portal). As you create the workspace, use these values:
 
-As you create the workspace, use these values:
-
-- **Subscription**: Select the Azure subscription associated with your storage account.
-- **Resource group**: Select the resource group where you placed your storage account.
-- **Region**: Select the region of the storage account (for example, `Central US`).
-- **Name**: Enter a name for your Synapse workspace.
-- **SQL Administrator login**: Enter the administrator username for the SQL Server.
-- **SQL Administrator password**: Enter the administrator password for the SQL Server.
-- **Tag Values**: Accept the default.
+- **Subscription**: Select the Azure subscription associated with your storage account.
+- **Resource group**: Select the resource group where you placed your storage account.
+- **Region**: Select the region of the storage account (for example, `Central US`).
+- **Name**: Enter a name for your Synapse workspace.
+- **SQL Administrator login**: Enter the administrator username for the SQL Server.
+- **SQL Administrator password**: Enter the administrator password for the SQL Server.
+- **Tag Values**: Accept the default.
 
 
 #### Find your Synapse SQL endpoint name (optional)
````

````diff
@@ -84,7 +82,7 @@ To find the fully qualified server name:
 
 ![Full server name serverless SQL pool](../../synapse-analytics/sql/media/connect-overview/server-connect-example-sqlod.png)
 
-In this tutorial, you'll use Synapse Studio to query data from the CSV file that you uploaded to the storage account.
+In this tutorial, you use Synapse Studio to query data from the CSV file that you uploaded to the storage account.
 
 ## Use Synapse Studio to explore data
 
````
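The serverless SQL endpoint that the optional section above asks you to find generally follows a fixed naming pattern. A sketch, assuming a hypothetical workspace name and that the `-ondemand` convention applies to your workspace (confirm the actual value in the portal as described):

```python
# Sketch: serverless (on-demand) SQL endpoint name of a Synapse workspace.
# "contoso-ws" is a hypothetical workspace name; verify the real endpoint
# in the workspace Overview blade rather than relying on this pattern.
workspace_name = "contoso-ws"
endpoint = f"{workspace_name}-ondemand.sql.azuresynapse.net"
print(endpoint)
```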

````diff
@@ -107,8 +105,7 @@ In this tutorial, you'll use Synapse Studio to query data from the CSV file that
 
 ## Clean up resources
 
-When they're no longer needed, delete your Synapse Analytics workspace. The workspace tat do not have some additional dedicated SQL pools or Spark pools is not charged if you are not using it, so you will get **no billing even if you keep it**.
-Do not delete the resource group if you have selected the resource group where you have placed your Azure storage account.
+When they're no longer needed, delete the resource group and all related resources. To do so, select the resource group for the storage account and workspace, and then and select **Delete**.
 
 ## Next steps
 
````
