Commit 5961b34

Missing prereq, language updates [netlify-build]
1 parent bdb1760 commit 5961b34

2 files changed: 6 additions, 7 deletions


src/connections/storage/catalog/data-lakes/index.md

Lines changed: 5 additions & 6 deletions
@@ -90,6 +90,7 @@ To set up Azure Data Lakes, create your Azure resources and then enable the Data
 
 Before you can configure your Azure resources, you must complete the following prerequisites:
 - [Create an Azure subscription](https://azure.microsoft.com/en-us/free/){:target="_blank"}
+- [Create an Azure resource group](https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/manage-resource-groups-portal#create-resource-groups){:target="_blank"}
 - Create an account with `Microsoft.Authorization/roleAssignments/write` permissions
 - Configure the [Azure Command Line Interface (Azure CLI)](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli){:target="_blank"}
 
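For reference, a minimal Azure CLI sketch that covers the resource-group prerequisite added above; the subscription ID, group name, and region below are placeholders rather than values from the docs:

```bash
# Sign in and select the subscription that will hold the Data Lakes resources
az login
az account set --subscription "<subscription-id>"

# Create the resource group referenced by the new prerequisite
az group create --name "segment-data-lakes-rg" --location "westus2"
```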
@@ -115,7 +116,7 @@ Before you can configure your Azure resources, you must complete the following p
 14. Select **Container**. Give your container a name, and select the **Private** level of public access. Click **Create**.
 
 > warning " "
-> Before continuing, note the Location, Storage account name, and the Azure storage container name: you'll need these variables when configuring the Azure Data Lakes destination in the Segment app.
+> Before continuing, note the Location, Storage account name, and the Azure storage container name: you'll need this information when configuring the Azure Data Lakes destination in the Segment app.
 
 ### Step 2 - Set up Key Vault
 
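As a rough CLI equivalent of the container step above, assuming the storage account already exists; the account and container names are placeholders:

```bash
# Create a private container (no public access) in an existing storage account
az storage container create \
  --account-name "<storage-account-name>" \
  --name "<container-name>" \
  --public-access off \
  --auth-mode login
```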
@@ -161,7 +162,7 @@ Before you can configure your Azure resources, you must complete the following p
 ```
 
 > warning " "
-> Before continuing, note the MySQL server URL, username and password for the admin account, and your database name: you'll need these variables when configuring the Azure Data Lakes destination in the Segment app.
+> Before continuing, note the MySQL server URL, username and password for the admin account, and your database name: you'll need this information when configuring the Azure Data Lakes destination in the Segment app.
 
 
 ### Step 4 - Set up Databricks
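Before entering the MySQL details noted above into the Segment app, a quick connection test can confirm the server URL, admin credentials, and database name. This sketch assumes the `<admin-user>@<server-name>` login format used by Azure Database for MySQL single server; all values are placeholders:

```bash
# Verify that the noted MySQL server URL, admin account, and database name work
mysql \
  --host="<server-name>.mysql.database.azure.com" \
  --user="<admin-user>@<server-name>" \
  --password \
  --database="<database-name>" \
  --execute="SELECT 1;"
```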
@@ -191,7 +192,7 @@ Before you can configure your Azure resources, you must complete the following p
 16. When you've entered all of your information, click **Create**.
 
 > warning " "
-> Before continuing, note the Cluster ID, Workspace name, Workspace URL, and the Azure Resource Group for your Databricks Workspace: you'll need these variables when configuring the Azure Data Lakes destination in the Segment app.
+> Before continuing, note the Cluster ID, Workspace name, Workspace URL, and the Azure Resource Group for your Databricks Workspace: you'll need this information when configuring the Azure Data Lakes destination in the Segment app.
 
 ### Step 5 - Set up a Service Principal
 
@@ -248,7 +249,7 @@ curl -X POST 'https://<per-workspace-url>/api/2.0/preview/scim/v2/ServicePrincip
 30. In the permissions menu, grant your service principal **Can Manage** permissions.
 
 > warning " "
-> Before continuing, note the Client ID and Client Secret for your Service Principal: you'll need these variables when configuring the Azure Data Lakes destination in the Segment app.
+> Before continuing, note the Client ID and Client Secret for your Service Principal: you'll need this information when configuring the Azure Data Lakes destination in the Segment app.
 
 ### Step 6 - Configure Databricks Cluster
 
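For context on where the Client ID and Client Secret come from, one possible CLI route for creating the service principal (the display name is a placeholder, and this is a sketch rather than the exact procedure in the docs):

```bash
# Create a service principal; in the JSON output, "appId" is the Client ID
# and "password" is the Client Secret to note for the Segment app
az ad sp create-for-rbac --name "segment-data-lakes-sp"
```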
@@ -501,6 +502,4 @@ spark.sql.hive.metastore.schema.verification.record.version false
 Check your Spark configs to ensure that the information you entered about the database is correct, then restart the cluster. The Databricks cluster automatically initializes the Hive Metastore, so an issue with your config file will stop the table from being created. If you continue to encounter errors, [contact Segment Support](https://segment.com/help/contact/){:target="_blank"}.
 {% endfaqitem %}
 
-
-
 {% endfaq %}
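The troubleshooting answer above assumes the cluster's Spark config points at the external Hive metastore database. A hedged sketch of the kind of properties involved follows; the key names follow Databricks' external-metastore pattern, the values are placeholders, and the exact block in the docs may differ:

```
spark.hadoop.javax.jdo.option.ConnectionDriverName org.mariadb.jdbc.Driver
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:mysql://<server-name>.mysql.database.azure.com:3306/<database-name>
spark.hadoop.javax.jdo.option.ConnectionUserName <admin-user>@<server-name>
spark.hadoop.javax.jdo.option.ConnectionPassword <admin-password>
spark.sql.hive.metastore.version 2.3.7
spark.sql.hive.metastore.jars builtin
spark.sql.hive.metastore.schema.verification.record.version false
```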

src/connections/storage/data-lakes/lake-formation.md

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ title: Lake Formation
 
 {% include content/plan-grid.md name="data-lakes" %}
 
-Lake Formation is a fully managed service built on top of the AWS Glue Data Catalog that provides one central set of tools to build and manage a Data Lake. These tools help import, catalog, transform, and deduplicate data, as well as provide strategies to optimize data storage and security. To learn more about Lake Formation features, refer to the [Amazon Web Services documentation](https://aws.amazon.com/lake-formation/features/){:target="_blank"}.
+Lake Formation is a fully managed service built on top of the AWS Glue Data Catalog that provides one central set of tools to build and manage a Data Lake. These tools help import, catalog, transform, and deduplicate data, as well as provide strategies to optimize data storage and security. To learn more about Lake Formation features, see [Amazon Web Services documentation](https://aws.amazon.com/lake-formation/features/){:target="_blank"}.
 
 > note "This feature is not supported in the Azure Data Lakes public beta"
 > Lake Formation is only supported for Segment Data Lakes. For more information about Azure Data Lakes, see the [Data Lakes overview documentation](/docs/connections/storage/data-lakes/index/#how-azure-data-lakes-works).
