Commit cc31b37

More PM feedback [netlify-build]
1 parent 6d640b5 commit cc31b37

2 files changed: +6, -6 lines

src/connections/storage/databricks-delta-lake/databricks-delta-lake-aws.md

Lines changed: 3 additions & 3 deletions
@@ -5,9 +5,9 @@ beta: true
 
 With the Databricks Delta Lake Destination, you can ingest event data from Segment into the bronze layer of your Databricks Delta Lake.
 
-This page will help you use the Databricks Destination to sync Segment events into your Databricks Delta Lake built on S3.
+This page will help you use the Databricks Delta Lake Destination to sync Segment events into your Databricks Delta Lake built on S3.
 
-> info "Databricks Delta Lake Destination in public beta"
+> info "Databricks Delta Lake Destination in Public Beta"
 > The Databricks Delta Lake Destination is in public beta, and Segment is actively working on this integration. [Contact Segment](https://segment.com/help/contact/){:target="_blank"} with any feedback or questions.
 
 ## Overview
@@ -16,7 +16,7 @@ Before getting started, use the overview below to get up to familiarize yourself
 
 1. Segment writes directly to your Delta Lake in the cloud storage (S3)
    - Segment manages the creation and evolution of Delta tables.
-   - Segment uses IAM role assumption to write Delta to AWS S3.
+   - Segment uses IAM role assumption to write Delta tables to AWS S3.
 2. Segment supports both OAuth and personal access tokens (PAT) for API authentication.
 3. Segment creates and updates the table's metadeta in Unity Catalog by running queries on a small, single node Databricks SQL warehouse in your environment.
 4. If a table already exists and no new columns are introduced, Segment appends data to the table (no SQL required).
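
For context on the bullet this commit rewords: "IAM role assumption" refers to the standard AWS STS cross-account pattern. The sketch below is a minimal illustration of that pattern in Python with boto3, not Segment's actual implementation; the role ARN, external ID, session name, bucket, and key are hypothetical placeholders.

```python
# Illustrative sketch only -- not Segment's implementation.
# Shows the general shape of cross-account IAM role assumption for S3 writes.
import boto3

sts = boto3.client("sts")

# The writer assumes a customer-provided IAM role; the ExternalId guards
# against the confused-deputy problem in cross-account access.
creds = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/delta-writer",  # hypothetical
    RoleSessionName="delta-lake-sync",                      # hypothetical
    ExternalId="example-external-id",                       # hypothetical
)["Credentials"]

# The temporary credentials scope writes to the customer's bucket.
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
s3.put_object(Bucket="example-delta-bucket", Key="bronze/example-object", Body=b"hello")
```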

src/connections/storage/databricks-delta-lake/databricks-delta-lake-azure.md

Lines changed: 3 additions & 3 deletions
@@ -5,10 +5,10 @@ beta: true
 
 With the Databricks Delta Lake Destination, you can ingest event data from Segment into the bronze layer of your Databricks Delta Lake.
 
-This page will help you use the Databricks Destination to sync Segment events into your Databricks Delta Lake built on Azure (ADLS Gen 2).
+This page will help you use the Databricks Delta Lake Destination to sync Segment events into your Databricks Delta Lake built on Azure (ADLS Gen 2).
 
 
-> info "Databricks Delta Lake Destination in public beta"
+> info "Databricks Delta Lake Destination in Public Beta"
 > The Databricks Delta Lake Destination is in public beta, and Segment is actively working on this integration. [Contact Segment](https://segment.com/help/contact/){:target="_blank"} with any feedback or questions.
 
 ## Overview
@@ -17,7 +17,7 @@ Before getting started, use the overview below to get up to familiarize yourself
 
 1. Segment writes directly to your Delta Lake in the cloud storage (Azure)
    - Segment manages the creation and evolution of Delta tables.
-   - Segment uses a cross-tenant service principal to write Delta to ADLS Gen2.
+   - Segment uses a cross-tenant service principal to write Delta tables to ADLS Gen2.
 2. Segment supports both OAuth and personal access tokens (PAT) for API authentication.
 3. Segment creates and updates the table's metadeta in Unity Catalog by running queries on a small, single node Databricks SQL warehouse in your environment.
 4. If a table already exists and no new columns are introduced, Segment appends data to the table (no SQL required).
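
Similarly, the "cross-tenant service principal" in the Azure bullet refers to an app registration that the customer's tenant authorizes against their storage account. A minimal sketch of that authentication flow with the Azure SDK for Python follows; it is illustrative only, and the tenant ID, client ID, secret, account name, and paths are hypothetical placeholders.

```python
# Illustrative sketch only -- not Segment's implementation.
# Shows a service principal authenticating to ADLS Gen2 via the Azure SDK.
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

# A multi-tenant app registration authenticates against the customer's tenant.
credential = ClientSecretCredential(
    tenant_id="customer-tenant-id",     # hypothetical
    client_id="writer-app-client-id",   # hypothetical
    client_secret="app-client-secret",  # hypothetical
)

# The credential authorizes writes to the customer's ADLS Gen2 account.
service = DataLakeServiceClient(
    account_url="https://exampleaccount.dfs.core.windows.net",  # hypothetical
    credential=credential,
)
filesystem = service.get_file_system_client("bronze")           # hypothetical
filesystem.get_file_client("delta/example-object").upload_data(
    b"hello", overwrite=True
)
```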
