With the Databricks Delta Lake Destination, you can ingest event data from Segment into the bronze layer of your Databricks Delta Lake.

This page will help you use the Databricks Delta Lake Destination to sync Segment events into your Databricks Delta Lake built on S3.
> info "Databricks Delta Lake Destination in public beta"
> The Databricks Delta Lake Destination is in public beta, and Segment is actively working on this integration. [Contact Segment](https://segment.com/help/contact/){:target="_blank"} with any feedback or questions.
## Overview

Before getting started, use the overview below to familiarize yourself with Segment's Databricks Delta Lake Destination.
1. Segment writes directly to your Delta Lake in cloud storage (S3).
   - Segment manages the creation and evolution of Delta tables.
   - Segment uses IAM role assumption to write Delta to AWS S3 (see the sketch after this list).
2. Segment supports both OAuth and personal access tokens (PAT) for API authentication.
3. Segment creates and updates the table's metadata in Unity Catalog by running queries on a small, single-node Databricks SQL warehouse in your environment.
4. If a table already exists and no new columns are introduced, Segment appends data to the table (no SQL required).
5. For new data types and columns, Segment reads the table's current schema from Unity Catalog and uses the SQL warehouse to update the schema accordingly.
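To make the role-assumption model concrete, here is a minimal sketch (using boto3) of the kind of IAM role a workspace admin might create for Segment to assume. The principal ARN, external ID, and role name are hypothetical placeholders, not values from this documentation; use the exact values Segment provides during setup. The role also needs a separate permissions policy granting write access to your S3 bucket, which is omitted here.

```python
# Hypothetical sketch only: the Segment principal ARN and external ID below
# are placeholders. Use the values shown in the Segment app during setup.
import json

import boto3

iam = boto3.client("iam")

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            # Placeholder for the Segment-owned principal allowed to assume the role.
            "Principal": {"AWS": "arn:aws:iam::<SEGMENT_ACCOUNT_ID>:root"},
            "Action": "sts:AssumeRole",
            # The external ID condition protects against the confused-deputy problem.
            "Condition": {"StringEquals": {"sts:ExternalId": "<EXTERNAL_ID>"}},
        }
    ],
}

# A permissions policy granting s3:PutObject (and related actions) on the
# Delta Lake bucket must also be attached to this role; that part is omitted.
iam.create_role(
    RoleName="segment-delta-lake-writer",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
```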
## Prerequisites

As you set up Databricks, keep the following key terms in mind.
- **Target Unity Catalog**: The catalog where Segment lands your data.
- **Workspace Admin Token** (*PAT only*): The access token you'll generate for your Databricks workspace admin.
## Setup for Databricks Delta Lake (S3)
### Step 1: Find your Databricks Workspace URL
This catalog is the target catalog where Segment lands your schemas/tables.

### Step 9: Set up the Databricks Delta Lake destination in Segment
This step links a Segment events source to your Databricks workspace/catalog.
1. From the Segment app, navigate to **Connections > Catalog**, then click **Destinations**.
2. Search for and select the "Databricks Delta Lake" destination.
3. Click **Add Destination**, select a source, then click **Next**.
4. Enter the name for your destination, then click **Create destination**.
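Once the destination is created and enabled, one way to exercise the pipeline end to end is to send a test event from the connected source. Below is a minimal sketch using Segment's `segment-analytics-python` library; the write key, user ID, and event name are placeholders, not values from this guide. After the next sync, the event should surface as a table in your target catalog.

```python
# Minimal smoke test: send one track event through the connected source.
# The write key and identifiers below are placeholders.
import segment.analytics as analytics

analytics.write_key = "<SOURCE_WRITE_KEY>"

analytics.track(
    user_id="test-user-1",
    event="Delta Lake Smoke Test",
    properties={"destination": "databricks-delta-lake"},
)
analytics.flush()  # deliver queued events before the script exits
```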
With the Databricks Delta Lake Destination, you can ingest event data from Segment into the bronze layer of your Databricks Delta Lake.

This page will help you use the Databricks Delta Lake Destination to sync Segment events into your Databricks Delta Lake built on Azure (ADLS Gen 2).
> info "Databricks Delta Lake Destination in public beta"
> The Databricks Delta Lake Destination is in public beta, and Segment is actively working on this integration. [Contact Segment](https://segment.com/help/contact/){:target="_blank"} with any feedback or questions.
## Overview

Before getting started, use the overview below to familiarize yourself with Segment's Databricks Delta Lake Destination.
1. Segment writes directly to your Delta Lake in cloud storage (Azure).
   - Segment manages the creation and evolution of Delta tables.
   - Segment uses a cross-tenant service principal to write Delta to ADLS Gen2.
2. Segment supports both OAuth and personal access tokens (PAT) for API authentication.
3. Segment creates and updates the table's metadata in Unity Catalog by running queries on a small, single-node Databricks SQL warehouse in your environment.
4. If a table already exists and no new columns are introduced, Segment appends data to the table (no SQL required).
5. For new data types and columns, Segment reads the table's current schema from Unity Catalog and uses the SQL warehouse to update the schema accordingly (see the sketch after this list).
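As an illustration of steps 3 through 5, here is a rough sketch of the kind of metadata work the SQL warehouse performs, written with the `databricks-sql-connector` package. The hostname, HTTP path, token, and table and column names are hypothetical placeholders, and Segment's actual queries may differ.

```python
# Hypothetical sketch of schema evolution against a Databricks SQL warehouse.
# All connection values and identifiers below are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="<WORKSPACE_HOST>",              # placeholder
    http_path="/sql/1.0/warehouses/<WAREHOUSE_ID>",  # placeholder
    access_token="<PERSONAL_ACCESS_TOKEN>",          # placeholder
) as conn:
    with conn.cursor() as cur:
        # Read the table's current schema from Unity Catalog.
        cur.execute("DESCRIBE TABLE <TARGET_CATALOG>.<SOURCE_SCHEMA>.pages")
        existing_columns = {row[0] for row in cur.fetchall()}

        # If an incoming event introduces a new field, add a matching column.
        if "utm_campaign" not in existing_columns:
            cur.execute(
                "ALTER TABLE <TARGET_CATALOG>.<SOURCE_SCHEMA>.pages "
                "ADD COLUMNS (utm_campaign STRING)"
            )
```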
## Prerequisites

As you set up Databricks, keep the following key terms in mind.
- **Databricks Workspace URL**: The base URL for your Databricks workspace.
- **Target Unity Catalog**: The catalog where Segment lands your data.
## Set up Databricks Delta Lake (Azure)
### Step 1: Find your Databricks Workspace URL
This catalog is the target catalog where Segment lands your schemas/tables.

### Step 8: Set up the Databricks Delta Lake destination in Segment
This step links a Segment source to your Databricks workspace/catalog.
1. From the Segment app, navigate to **Connections > Catalog**, then click **Destinations**.
2. Search for and select the "Databricks Delta Lake" destination.
3. Click **Add Destination**, select a source, then click **Next**.
4. Enter the name for your destination, then click **Create destination**.
5. Enter the connection settings using the values noted above (leave the Service Principal fields blank).
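After the destination starts syncing, a quick query against the target catalog confirms that events are landing in the bronze layer. The sketch below assumes a `tracks` table under a source schema; all identifiers are placeholders, and the actual catalog, schema, and table names depend on your target Unity Catalog and connected source.

```python
# Quick row-count check on a landed table. All identifiers are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="<WORKSPACE_HOST>",
    http_path="/sql/1.0/warehouses/<WAREHOUSE_ID>",
    access_token="<ACCESS_TOKEN>",
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT COUNT(*) FROM <TARGET_CATALOG>.<SOURCE_SCHEMA>.tracks")
        print("rows landed:", cur.fetchone()[0])
```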