Commit 2c11d67

committed: capitalization fixes
1 parent 86e2535 commit 2c11d67

File tree

2 files changed: +3 lines, -3 lines


src/connections/storage/databricks-delta-lake/databricks-delta-lake-aws.md

Lines changed: 1 addition & 1 deletion
@@ -54,7 +54,7 @@ This step allows the Segment service principal to create and use a small SQL war
 
 ### Step 4: Create an external location and storage credentials
 
-This step creates the storage location where Segment lands your delta lake and the associated credentials Segment uses to access the storage.
+This step creates the storage location where Segment lands your Delta Lake and the associated credentials Segment uses to access the storage.
 
 1. Follow the Databricks guide for [managing external locations and storage credentials](https://docs.databricks.com/en/data-governance/unity-catalog/manage-external-locations-and-credentials.html){:target="_blank"}. This guide assumes the target S3 bucket already exists. If not, follow the [AWS guide](https://docs.aws.amazon.com/AmazonS3/latest/userguide/create-bucket-overview.html){:target="_blank"} for creating a bucket.
 2. Once the external location and storage credentials are created in your Databricks workspace, update the permissions to allow access to the Segment service principal.
    1. In your workspace, navigate to **Data > External Data > Storage Credentials**.
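The AWS guide linked in step 1 covers bucket creation in the console. As a rough CLI equivalent, here is a minimal sketch; the bucket name, region, and profile-dependent defaults are hypothetical placeholders, not values from this page:

```shell
# Sketch only: bucket name and region below are hypothetical placeholders.
BUCKET="segment-delta-lake-demo"
REGION="us-west-2"

# Create the target S3 bucket (outside us-east-1, a LocationConstraint is required).
aws s3api create-bucket \
  --bucket "$BUCKET" \
  --region "$REGION" \
  --create-bucket-configuration "LocationConstraint=$REGION"

# Keep the bucket private; Segment reads and writes through the storage
# credential registered in Databricks, not through public access.
aws s3api put-public-access-block \
  --bucket "$BUCKET" \
  --public-access-block-configuration \
    "BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true"
```

The resulting `s3://` URL for this bucket is what the external location in your Databricks workspace points at.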

src/connections/storage/databricks-delta-lake/databricks-delta-lake-azure.md

Lines changed: 2 additions & 2 deletions
@@ -59,7 +59,7 @@ az ad sp create --id fffa5b05-1da5-4599-8360-cc2684bcdefb
 
 ### Step 3: Update or create an ADLS Gen2 storage container
 
-The ADLS Gen2 storage container is where Segment lands your delta lake files.
+The ADLS Gen2 storage container is where Segment lands your Delta Lake files.
 
 1. In the Azure console, navigate to **Storage accounts** and locate or create a new storage account to use for your Segment data.
 2. Select the account, then select **Containers**.
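The portal steps above can also be sketched with the Azure CLI. This is a hedged sketch, not taken from the doc: the resource group, account, and container names are hypothetical, and the essential requirement is that the account is ADLS Gen2, meaning the hierarchical namespace is enabled:

```shell
# Sketch only: resource group, account, and container names are hypothetical.
RG="segment-demo-rg"
ACCOUNT="segmentdeltalakedemo"   # storage account names must be lowercase alphanumeric
CONTAINER="segment-data"

# ADLS Gen2 requires the hierarchical namespace (--hns true).
az storage account create \
  --name "$ACCOUNT" \
  --resource-group "$RG" \
  --location eastus \
  --sku Standard_LRS \
  --kind StorageV2 \
  --hns true

# Create the container where Segment lands your Delta Lake files.
az storage container create \
  --account-name "$ACCOUNT" \
  --name "$CONTAINER" \
  --auth-mode login
```

The container (and the storage account it lives in) is what you reference again in step 6 when creating the external location.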
@@ -90,7 +90,7 @@ This step allows the Segment service principal to create a small SQL warehouse f
 
 ### Step 6: Create an external location and storage credentials
 
-This step creates the storage location where Segment lands your delta lake and the associated credentials Segment uses to access the storage.
+This step creates the storage location where Segment lands your Delta Lake and the associated credentials Segment uses to access the storage.
 
 1. Follow the Databricks guide for [managing external locations and storage credentials](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-external-locations-and-credentials){:target="_blank"}.
    - Use the storage container you updated in step 3.
    - For storage credentials, you can use a service principal or managed identity.
