Commit 9c6ab60

Fixing something
1 parent 84cdb33 commit 9c6ab60

File tree

1 file changed (+3 additions, −28 deletions)


articles/storage/blobs/data-lake-storage-use-databricks-spark.md

Lines changed: 3 additions & 28 deletions
@@ -21,7 +21,6 @@ This tutorial shows you how to connect your Azure Databricks cluster to data sto
 In this tutorial, you will:
 
 > [!div class="checklist"]
-> - Create a Databricks cluster
 > - Ingest unstructured data into a storage account
 > - Run analytics on your data in Blob storage
@@ -45,38 +44,14 @@ If you don't have an Azure subscription, create a [free account](https://azure.m
 
 - An Azure Databricks cluster. See [Create a cluster](/azure/databricks/getting-started/quick-start#step-1-create-a-cluster).
 
-### Download the flight data
+## Download the flight data
 
 This tutorial uses flight data from the Bureau of Transportation Statistics to demonstrate how to perform an ETL operation. You must download this data to complete the tutorial.
 
 1. Download the [On_Time_Reporting_Carrier_On_Time_Performance_1987_present_2016_1.zip](https://github.com/Azure-Samples/AzureStorageSnippets/blob/master/blobs/tutorials/On_Time_Reporting_Carrier_On_Time_Performance_1987_present_2016_1.zip) file. This file contains the flight data.
 
 2. Unzip the contents of the zipped file and make a note of the file name and the path of the file. You need this information in a later step.
 
-## Create an Azure Databricks service
-
-In this section, you create an Azure Databricks service by using the Azure portal.
-
-1. In the Azure portal, select **Create a resource** > **Analytics** > **Azure Databricks**.
-
-    ![Databricks on Azure portal](./media/data-lake-storage-use-databricks-spark/azure-databricks-on-portal.png "Databricks on Azure portal")
-
-2. Under **Azure Databricks Service**, provide the following values to create a Databricks service:
-
-    |Property |Description |
-    |---------|---------|
-    |**Workspace name** | Provide a name for your Databricks workspace. |
-    |**Subscription** | From the drop-down, select your Azure subscription. |
-    |**Resource group** | Specify whether you want to create a new resource group or use an existing one. A resource group is a container that holds related resources for an Azure solution. For more information, see [Azure Resource Group overview](../../azure-resource-manager/management/overview.md). |
-    |**Location** | Select **West US 2**. For other available regions, see [Azure services available by region](https://azure.microsoft.com/regions/services/). |
-    |**Pricing Tier** | Select **Standard**. |
-
-    ![Create an Azure Databricks workspace](./media/data-lake-storage-use-databricks-spark/create-databricks-workspace.png "Create an Azure Databricks service")
-
-3. The account creation takes a few minutes. To monitor the operation status, view the progress bar at the top.
-
-4. Select **Pin to dashboard** and then select **Create**.
-
 ## Ingest data
 
 ### Copy source data into the storage account
@@ -113,8 +88,8 @@ In this section, you'll create a container and a folder in your storage account.
 
 3. In the Workspace folder, select **Create > Notebook**.
 
-   > [!div class="mx-imgBorder"]
-   > ![Screenshot of create notebook option](./media/data-lake-storage-use-databricks-spark/create-notebook.png)
+    > [!div class="mx-imgBorder"]
+    > ![Screenshot of create notebook option](./media/data-lake-storage-use-databricks-spark/create-notebook.png)
 
 4. In the **Create Notebook** dialog, enter a name and then select **Python** in the **Default Language** drop-down list. This selection determines the default language of the notebook.

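Step 2 of the patched "Download the flight data" section (unzip the archive, then note the extracted file's name and path for later steps) can be scripted instead of done by hand. The sketch below uses the standard-library `zipfile` module; the archive and CSV names are placeholders standing in for the real flight-data zip, not files from the tutorial.

```python
import os
import tempfile
import zipfile

work_dir = tempfile.mkdtemp()

# Stand-in for the downloaded archive (the tutorial's real file is
# On_Time_Reporting_Carrier_On_Time_Performance_1987_present_2016_1.zip).
archive_path = os.path.join(work_dir, "flight_data.zip")
with zipfile.ZipFile(archive_path, "w") as zf:
    zf.writestr("flights_2016_1.csv", "Year,Carrier\n2016,AA\n")

# Unzip the contents and record the file name, as the tutorial instructs.
with zipfile.ZipFile(archive_path) as zf:
    zf.extractall(work_dir)
    extracted_names = zf.namelist()

# Note the full path of the extracted file; you need it in a later step.
extracted_path = os.path.join(work_dir, extracted_names[0])
print(extracted_path)
```

Keeping the extracted path in a variable like this mirrors the tutorial's "make a note of the file name and the path" instruction, since that path is what gets uploaded to the storage account in the ingest step.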
0 commit comments
