
Commit 423a543

Merge pull request #223422 from linda33wj/purview

Add Azure Databricks connector support

2 parents 73a5a45 + 8177244

11 files changed: +192 −7 lines changed

articles/purview/catalog-lineage-user-guide.md

Lines changed: 2 additions & 1 deletion
@@ -5,7 +5,7 @@ author: linda33wj
 ms.author: jingwang
 ms.service: purview
 ms.topic: conceptual
-ms.date: 09/20/2022
+ms.date: 01/09/2023
 ---
 # Microsoft Purview Data Catalog lineage user guide
 

@@ -44,6 +44,7 @@ Databases & storage solutions such as Oracle, Teradata, and SAP have query engines
 
 |**Category**| **Data source** |
 |---|---|
+|Azure| [Azure Databricks](register-scan-azure-databricks.md)|
 |Database| [Cassandra](register-scan-cassandra-source.md)|
 || [Db2](register-scan-db2.md) |
 || [Google BigQuery](register-scan-google-bigquery-source.md)|

articles/purview/microsoft-purview-connector-overview.md

Lines changed: 2 additions & 1 deletion
@@ -6,7 +6,7 @@ ms.author: jingwang
 ms.service: purview
 ms.subservice: purview-data-map
 ms.topic: conceptual
-ms.date: 10/10/2022
+ms.date: 01/09/2023
 ms.custom: ignite-2022
 ---
 

@@ -30,6 +30,7 @@ The table below shows the supported capabilities for each data source. Select th
 || [Azure Data Share](how-to-link-azure-data-share.md) | [Yes](how-to-link-azure-data-share.md) | No | [Yes](how-to-link-azure-data-share.md) | No | No|
 || [Azure Database for MySQL](register-scan-azure-mysql-database.md) | [Yes](register-scan-azure-mysql-database.md#register) | [Yes](register-scan-azure-mysql-database.md#scan) | No* | No | No |
 || [Azure Database for PostgreSQL](register-scan-azure-postgresql.md) | [Yes](register-scan-azure-postgresql.md#register) | [Yes](register-scan-azure-postgresql.md#scan) | No* | No | No |
+|| [Azure Databricks](register-scan-azure-databricks.md) | [Yes](register-scan-azure-databricks.md#register) | [Yes](register-scan-azure-databricks.md#scan) | [Yes](register-scan-azure-databricks.md#lineage) | No | No |
 || [Azure Dedicated SQL pool (formerly SQL DW)](register-scan-azure-synapse-analytics.md)| [Yes](register-scan-azure-synapse-analytics.md#register) | [Yes](register-scan-azure-synapse-analytics.md#scan)| No* | No | No |
 || [Azure Files](register-scan-azure-files-storage-source.md)|[Yes](register-scan-azure-files-storage-source.md#register) | [Yes](register-scan-azure-files-storage-source.md#scan) | Limited* | No | No |
 || [Azure SQL Database](register-scan-azure-sql-database.md)| [Yes](register-scan-azure-sql-database.md#register-the-data-source) |[Yes](register-scan-azure-sql-database.md#scope-and-run-the-scan)| [Yes (Preview)](register-scan-azure-sql-database.md#extract-lineage-preview) | [Yes](register-scan-azure-sql-database.md#set-up-access-policies) (Preview) | No |
articles/purview/register-scan-azure-databricks.md

Lines changed: 182 additions & 0 deletions
@@ -0,0 +1,182 @@
---
title: Connect to and manage Azure Databricks
description: This guide describes how to connect to Azure Databricks in Microsoft Purview, and how to use Microsoft Purview to scan and manage your Azure Databricks source.
author: linda33wj
ms.author: jingwang
ms.service: purview
ms.subservice: purview-data-map
ms.topic: how-to
ms.date: 01/09/2023
ms.custom: template-how-to
---

# Connect to and manage Azure Databricks in Microsoft Purview (Preview)

This article outlines how to register Azure Databricks, and how to authenticate and interact with Azure Databricks in Microsoft Purview. For more information about Microsoft Purview, read the [introductory article](overview.md).

[!INCLUDE [feature-in-preview](includes/feature-in-preview.md)]

## Supported capabilities

|**Metadata Extraction**| **Full Scan** |**Incremental Scan**|**Scoped Scan**|**Classification**|**Access Policy**|**Lineage**|**Data Sharing**|
|---|---|---|---|---|---|---|---|
| [Yes](#register)| [Yes](#scan)| No | No | No | No| [Yes](#lineage) | No |

When scanning the Azure Databricks source, Microsoft Purview supports:

- Extracting technical metadata, including:

  - Azure Databricks workspace
  - Hive server
  - Databases
  - Tables including the columns, foreign keys, unique constraints, and storage description
  - Views including the columns and storage description

- Fetching relationships between external tables and Azure Data Lake Storage Gen2/Azure Blob assets.
- Fetching static lineage on asset relationships among tables and views.

This connector brings in metadata from the Databricks metastore. Compared with scanning via the [Hive Metastore connector](register-scan-hive-metastore-source.md), which you may have used to scan Azure Databricks previously:

- You can set up a scan for Azure Databricks workspaces directly, without requiring direct HMS access. The connector uses a Databricks personal access token for authentication and connects to a cluster to perform the scan.
- The Databricks workspace info is captured.
- The relationship between tables and storage assets is captured.

## Prerequisites

* You must have an Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free/?WT.mc_id=A261C142F).

* You must have an active [Microsoft Purview account](create-catalog-portal.md).

* You need Data Source Administrator and Data Reader permissions to register a source and manage it in the Microsoft Purview governance portal. For more information about permissions, see [Access control in Microsoft Purview](catalog-permissions.md).

* Set up the latest [self-hosted integration runtime](https://www.microsoft.com/download/details.aspx?id=39717). For more information, see [Create and configure a self-hosted integration runtime](manage-integration-runtimes.md). The minimal supported self-hosted integration runtime version is 5.20.8227.2.

* Ensure [JDK 11](https://www.oracle.com/java/technologies/downloads/#java11) is installed on the machine where the self-hosted integration runtime is installed. Restart the machine after you newly install the JDK for it to take effect.

* Ensure that Visual C++ Redistributable for Visual Studio 2012 Update 4 is installed on the machine where the self-hosted integration runtime is running. If you don't have this update installed, [download it now](https://www.microsoft.com/download/details.aspx?id=30679).

* In your Azure Databricks workspace:

  * [Generate a personal access token](/azure/databricks/dev-tools/auth#--azure-databricks-personal-access-tokens), and store it as a secret in Azure Key Vault; a Python sketch for storing the secret follows this list.
  * [Create a cluster](/azure/databricks/clusters/create-cluster). Note down the cluster ID - you can find it in the Azure Databricks workspace -> Compute -> your cluster -> Tags -> Automatically added tags -> `ClusterId`.

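You can store the token from the portal, the Azure CLI, or a short script. The following is a minimal Python sketch, not part of the official setup; it assumes the `azure-identity` and `azure-keyvault-secrets` packages are installed, and the vault URL, secret name (`databricks-pat`), and token value are placeholders to replace with your own.

```python
# Minimal sketch: store a Databricks personal access token as an Azure Key Vault secret.
# Assumes you're already signed in (for example, via `az login`) so DefaultAzureCredential
# can authenticate, and that you have secret set permissions on the vault.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

vault_url = "https://<your-key-vault-name>.vault.azure.net"  # placeholder vault URL
client = SecretClient(vault_url=vault_url, credential=DefaultAzureCredential())

# Paste the token generated in the Azure Databricks workspace UI.
client.set_secret("databricks-pat", "<personal-access-token>")  # placeholder name and value
```

Whatever secret name you choose here is the one you'll provide later when you set up the scan credential.
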
## Register

This section describes how to register an Azure Databricks workspace in Microsoft Purview by using [the Microsoft Purview governance portal](https://web.purview.azure.com/).

1. Go to your Microsoft Purview account.

1. Select **Data Map** on the left pane.

1. Select **Register**.

1. In **Register sources**, select **Azure Databricks** > **Continue**.

1. On the **Register sources (Azure Databricks)** screen, do the following:

   1. For **Name**, enter a name that Microsoft Purview will list as the data source.

   1. For **Azure subscription** and **Databricks workspace name**, select the subscription and workspace that you want to scan from the dropdown. The Databricks workspace URL will be automatically populated.

   1. For **Select a collection**, choose a collection from the list or create a new one. This step is optional.

   :::image type="content" source="media/register-scan-azure-databricks/configure-sources.png" alt-text="Screenshot of registering Azure Databricks source." border="true":::

1. Select **Finish**.

## Scan

> [!TIP]
> To troubleshoot any issues with scanning:
> 1. Confirm you have followed all [**prerequisites**](#prerequisites).
> 1. Review our [**scan troubleshooting documentation**](troubleshoot-connections.md).

Use the following steps to scan Azure Databricks to automatically identify assets. For more information about scanning in general, see [Scans and ingestion in Microsoft Purview](concept-scans-and-ingestion.md).

1. In the Management Center, select integration runtimes. Make sure that a self-hosted integration runtime is set up. If it isn't set up, use the steps in [Create and manage a self-hosted integration runtime](./manage-integration-runtimes.md).

1. Go to **Sources**.

1. Select the registered Azure Databricks.

1. Select **+ New scan**.

1. Provide the following details:

   1. **Name**: Enter a name for the scan.

   1. **Connect via integration runtime**: Select the configured self-hosted integration runtime.

   1. **Credential**: Select the credential to connect to your data source. Make sure to:

      * Select **Access Token Authentication** while creating a credential.
      * Provide the secret name of the personal access token that you created in [Prerequisites](#prerequisites) in the appropriate box.

      For more information, see [Credentials for source authentication in Microsoft Purview](manage-credentials.md).

   1. **Cluster ID**: Specify the cluster ID that Microsoft Purview will connect to in order to perform the scan. You can find it in the Azure Databricks workspace -> Compute -> your cluster -> Tags -> Automatically added tags -> `ClusterId`.

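      If you'd rather retrieve the cluster ID from a notebook attached to that cluster, the following sketch is one option; it relies on the `spark.databricks.clusterUsageTags.clusterId` Spark configuration tag, which isn't mentioned in this guide, so treat it as an assumption and verify it in your workspace.

      ```python
      # Sketch: print the ID of the cluster this notebook is attached to.
      # Relies on a cluster usage tag that Azure Databricks typically populates.
      cluster_id = spark.conf.get("spark.databricks.clusterUsageTags.clusterId")
      print(cluster_id)
      ```
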
   1. **Mount points**: Provide the mount point and Azure Storage source location string when you have external storage manually mounted to Databricks. Use the format `/mnt/<path>=abfss://<container>@<adls_gen2_storage_account>.dfs.core.windows.net/;/mnt/<path>=wasbs://<container>@<blob_storage_account>.blob.core.windows.net`. It's used to capture the relationship between tables and the corresponding storage assets in Microsoft Purview. This setting is optional; if it's not specified, such relationships won't be retrieved.

      You can get the list of mount points in your Databricks workspace by running the following Python command in a notebook:

      ```python
      dbutils.fs.mounts()
      ```

      It will print all the mount points, like below:

      ```
      [MountInfo(mountPoint='/databricks-datasets', source='databricks-datasets', encryptionType=''),
      MountInfo(mountPoint='/mnt/ADLS2', source='abfss://<container>@<adls_gen2_storage_account>.dfs.core.windows.net/', encryptionType=''),
      MountInfo(mountPoint='/databricks/mlflow-tracking', source='databricks/mlflow-tracking', encryptionType=''),
      MountInfo(mountPoint='/mnt/Blob', source='wasbs://<container>@<blob_storage_account>.blob.core.windows.net', encryptionType=''),
      MountInfo(mountPoint='/databricks-results', source='databricks-results', encryptionType=''),
      MountInfo(mountPoint='/databricks/mlflow-registry', source='databricks/mlflow-registry', encryptionType=''),
      MountInfo(mountPoint='/', source='DatabricksRoot', encryptionType='')]
      ```

      In this example, specify the following as mount points:

      `/mnt/ADLS2=abfss://<container>@<adls_gen2_storage_account>.dfs.core.windows.net/;/mnt/Blob=wasbs://<container>@<blob_storage_account>.blob.core.windows.net`

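      If the workspace has many mounts, the setting string can be tedious to assemble by hand. The following notebook sketch is a convenience, not part of the product; it assumes you only want `/mnt/` entries backed by `abfss://` or `wasbs://` sources.

      ```python
      # Sketch: build the Mount points setting string from dbutils.fs.mounts().
      # Keeps only /mnt/ entries whose source is ADLS Gen2 (abfss) or Blob storage (wasbs).
      pairs = [
          f"{m.mountPoint}={m.source}"
          for m in dbutils.fs.mounts()
          if m.mountPoint.startswith("/mnt/")
          and m.source.startswith(("abfss://", "wasbs://"))
      ]
      print(";".join(pairs))
      ```
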
   1. **Maximum memory available**: Maximum memory (in gigabytes) available on the customer's machine for the scanning processes to use. This value depends on the size of the Hive Metastore database to be scanned.

   :::image type="content" source="media/register-scan-azure-databricks/scan.png" alt-text="Screenshot of setting up Azure Databricks scan." border="true":::

1. Select **Continue**.

1. For **Scan trigger**, choose whether to set up a schedule or run the scan once.

1. Review your scan and select **Save and Run**.

Once the scan successfully completes, see how to [browse and search Azure Databricks assets](#browse-and-search-assets).

[!INCLUDE [create and manage scans](includes/view-and-manage-scans.md)]

## Browse and search assets

After scanning your Azure Databricks source, you can [browse the data catalog](how-to-browse-catalog.md) or [search the data catalog](how-to-search-catalog.md) to view the asset details.

From the Databricks workspace asset, you can find the associated Hive Metastore and the tables/views; the reverse applies too.

:::image type="content" source="media/register-scan-azure-databricks/browse-by-source-type.png" alt-text="Screenshot of browsing assets by source type." border="true":::

:::image type="content" source="media/register-scan-azure-databricks/switch-to-source-asset.png" alt-text="Screenshot of navigating to Azure Databricks source asset details." border="true":::

:::image type="content" source="media/register-scan-azure-databricks/associated-hive-metastore.png" alt-text="Screenshot of finding the associated Hive Metastore with Azure Databricks source." border="true":::

## Lineage

Refer to the [supported capabilities](#supported-capabilities) section for the supported Azure Databricks scenarios. For more information about lineage in general, see [data lineage](concept-data-lineage.md) and the [lineage user guide](catalog-lineage-user-guide.md).

Go to the Hive table/view asset and select the lineage tab to see the asset relationships when applicable. For the relationship between a table and external storage assets, you'll see the Hive table asset and the storage asset connected directly and bi-directionally, as they mutually impact each other.

:::image type="content" source="media/register-scan-azure-databricks/lineage.png" alt-text="Screenshot that shows Azure Databricks lineage example." border="true":::

## Next steps

Now that you've registered your source, use the following guides to learn more about Microsoft Purview and your data:

- [Data Estate Insights in Microsoft Purview](concept-insights.md)
- [Lineage in Microsoft Purview](catalog-lineage-user-guide.md)
- [Search the data catalog](how-to-search-catalog.md)

articles/purview/register-scan-hive-metastore-source.md

Lines changed: 0 additions & 1 deletion
@@ -32,7 +32,6 @@ When scanning Hive metastore source, Microsoft Purview supports:
 - Databases
 - Tables including the columns, foreign keys, unique constraints, and storage description
 - Views including the columns and storage description
-- Processes
 
 - Fetching static lineage on assets relationships among tables and views.
 
