
Commit a69dd32

Merge pull request #102639 from DennisLee-DennisLee/v-dele-1666440-2
1666440: Updated the Databricks quickstart.
2 parents 1eb7c4a + b8f12d0 commit a69dd32

File tree

1 file changed

+7
-17
lines changed


articles/storage/blobs/data-lake-storage-quickstart-create-databricks-account.md

Lines changed: 7 additions & 17 deletions
```diff
@@ -6,34 +6,24 @@ ms.author: normesta
 ms.subservice: data-lake-storage-gen2
 ms.service: storage
 ms.topic: quickstart
-ms.date: 02/15/2019
+ms.date: 01/28/2020
 ms.reviewer: jeking
 ---
 
-# Quickstart: Analyze data in Azure Data Lake Storage Gen2 by using Azure Databricks
+# Quickstart: Analyze data with Databricks
 
-This quickstart shows you how to run an Apache Spark job using Azure Databricks to perform analytics on data stored in a storage account that has Azure Data Lake Storage Gen2 enabled.
-
-As part of the Spark job, you'll analyze a radio channel subscription data to gain insights into free/paid usage based on demographics.
-
-If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/) before you begin.
+In this quickstart, you run an Apache Spark job using Azure Databricks to perform analytics on data stored in a storage account. As part of the Spark job, you'll analyze radio channel subscription data to gain insights into free/paid usage based on demographics.
 
 ## Prerequisites
 
-* Create a Data Lake Gen2 storage account. See [Quickstart: Create an Azure Data Lake Storage Gen2 storage account](data-lake-storage-quickstart-create-account.md)
-
-  Paste the name of the storage account into a text file. You'll need it soon.
+* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?ref=microsoft.com&utm_source=microsoft.com&utm_medium=docs&utm_campaign=visualstudio).
 
-* Create a service principal. See [How to: Use the portal to create an Azure AD application and service principal that can access resources](https://docs.microsoft.com/azure/active-directory/develop/howto-create-service-principal-portal).
+* The name of your Azure Data Lake Gen2 storage account. [Create an Azure Data Lake Storage Gen2 storage account](data-lake-storage-quickstart-create-account.md).
 
-  There's a couple of specific things that you'll have to do as you perform the steps in that article.
-
-  :heavy_check_mark: When performing the steps in the [Assign the application to a role](https://docs.microsoft.com/azure/active-directory/develop/howto-create-service-principal-portal#assign-the-application-to-a-role) section of the article, make sure to assign the **Storage Blob Data Contributor** role to the service principal.
+* The tenant ID, app ID, and password of an Azure service principal with an assigned role of **Storage Blob Data Contributor**. [Create a service principal](../../active-directory/develop/howto-create-service-principal-portal.md).
 
 > [!IMPORTANT]
-> Make sure to assign the role in the scope of the Data Lake Storage Gen2 storage account. You can assign a role to the parent resource group or subscription, but you'll receive permissions-related errors until those role assignments propagate to the storage account.
-
-  :heavy_check_mark: When performing the steps in the [Get values for signing in](https://docs.microsoft.com/azure/active-directory/develop/howto-create-service-principal-portal#get-values-for-signing-in) section of the article, paste the tenant ID, app ID, and password values into a text file. You'll need those soon.
+> Assign the role in the scope of the Data Lake Storage Gen2 storage account. You can assign a role to the parent resource group or subscription, but you'll receive permissions-related errors until those role assignments propagate to the storage account.
 
 ## Create an Azure Databricks workspace
```
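The revised prerequisites collect a service principal's tenant ID, app ID, and password; in a Databricks notebook those values typically feed the ABFS driver's OAuth client-credential settings before any `abfss://` path is read. A minimal sketch of building that configuration (the function name is mine, the credential values are placeholders, and in practice you'd pull the secret from a Databricks secret scope rather than hard-coding it):

```python
def adls_oauth_conf(app_id: str, client_secret: str, tenant_id: str) -> dict:
    """Build the Hadoop/Spark settings for OAuth client-credential access
    to an Azure Data Lake Storage Gen2 account through the ABFS driver."""
    return {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": app_id,
        "fs.azure.account.oauth2.client.secret": client_secret,
        "fs.azure.account.oauth2.client.endpoint":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# In a notebook you would apply these before reading data, e.g.:
# for key, value in adls_oauth_conf(app_id, secret, tenant).items():
#     spark.conf.set(key, value)
```

This is why the quickstart asks you to note the tenant ID, app ID, and password up front: all three end up in the session configuration.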

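The `[!IMPORTANT]` note in the diff stresses assigning **Storage Blob Data Contributor** at the storage-account scope rather than at the resource group or subscription. With the Azure CLI, that assignment might look like the following sketch (every angle-bracketed value is a placeholder for your own IDs):

```shell
# Sketch: grant the service principal data access scoped to one storage account.
az role assignment create \
  --role "Storage Blob Data Contributor" \
  --assignee <app-id> \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```

Scoping to the account avoids the propagation delay the note warns about when the role is granted higher up.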
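The quickstart's Spark job groups radio channel subscriptions by demographic attributes and free/paid level. The dataset and schema aren't shown in this diff, so here is a plain-Python miniature of that kind of aggregation (the field names `gender` and `level` are assumptions; the real quickstart does this with Spark DataFrames over a much larger sample):

```python
from collections import Counter

# Hypothetical miniature of the radio-channel subscription data.
subscriptions = [
    {"gender": "F", "level": "free"},
    {"gender": "F", "level": "paid"},
    {"gender": "M", "level": "free"},
    {"gender": "M", "level": "free"},
]

# Count subscribers per (gender, level) pair, mirroring a
# "GROUP BY gender, level" aggregation in Spark SQL.
usage = Counter((row["gender"], row["level"]) for row in subscriptions)

for (gender, level), count in sorted(usage.items()):
    print(gender, level, count)
```

The same shape of result is what the notebook in the quickstart visualizes to compare free versus paid usage across demographics.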