> info ""
> Segment's PrivateLink integration is currently in public beta and is governed by Segment’s [First Access and Beta Preview Terms](https://www.twilio.com/en-us/legal/tos){:target="_blank”}. Only warehouses located in region `us-east-1` are eligible for PrivateLink. You might incur additional networking costs while using AWS PrivateLink.

You can set up AWS PrivateLink for [Databricks](#databricks), [RDS Postgres](#rds-postgres), and [Redshift](#redshift).

## Databricks
> info "Segment recommends reviewing the Databricks documentation before attempting AWS PrivateLink setup"
> The Databricks PrivateLink integration requires both front-end and back-end PrivateLink configuration. Review the [Databricks documentation on AWS PrivateLink](https://docs.databricks.com/en/security/network/classic/privatelink.html){:target="_blank”} to ensure you have everything required to set up this configuration before continuing.

### Prerequisites

Before you can configure AWS PrivateLink for Databricks, complete the following prerequisites in your Databricks workspace:
- Your Databricks account must be on the [Enterprise pricing tier](https://www.databricks.com/product/pricing/platform-addons){:target="_blank”} and use the [E2 version](https://docs.databricks.com/en/archive/aws/end-of-life-legacy-workspaces.html#e2-architecture){:target="_blank”} of the platform.
- Your Databricks workspace must use a [Customer-managed VPC](https://docs.databricks.com/en/security/network/classic/customer-managed-vpc.html){:target="_blank”} and [Secure cluster connectivity](https://docs.databricks.com/en/security/network/classic/secure-cluster-connectivity.html){:target="_blank”}.
- Configure your [VPC](https://docs.databricks.com/en/security/network/classic/customer-managed-vpc.html){:target="_blank”} with DNS hostnames and DNS resolution enabled.
- Configure a [security group](https://docs.databricks.com/en/security/network/classic/customer-managed-vpc.html#security-groups){:target="_blank”} that allows bidirectional access to 0.0.0.0/0 on ports 443, 3306, 6666, 2443, and 8443-8451.
- You must have the AWS permissions required to [set up a new Databricks workspace](https://docs.databricks.com/en/admin/workspace/create-workspace.html#before-you-begin){:target="_blank”} and [create a VPC](https://docs.aws.amazon.com/vpc/latest/privatelink/getting-started.html#create-vpc-subnets){:target="_blank”}.
- You must have a technical partner in your organization to support the PrivateLink integration.
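
If you manage your customer-managed VPC with the AWS SDK, the sketch below shows one way to satisfy the DNS and security group prerequisites above. It's a minimal boto3 example, not part of Segment's or Databricks' tooling; the VPC ID and security group name are placeholders.

```python
# Minimal sketch (assumed values, not an official script): enable DNS attributes on a
# customer-managed VPC and create a security group allowing bidirectional traffic on
# the ports listed above. Replace the placeholder IDs with your own.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # PrivateLink beta is us-east-1 only

VPC_ID = "vpc-0123456789abcdef0"  # placeholder: your customer-managed VPC

# DNS hostnames and DNS resolution must both be enabled (one attribute per call).
ec2.modify_vpc_attribute(VpcId=VPC_ID, EnableDnsSupport={"Value": True})
ec2.modify_vpc_attribute(VpcId=VPC_ID, EnableDnsHostnames={"Value": True})

# Security group with bidirectional access to 0.0.0.0/0 on the required ports.
sg_id = ec2.create_security_group(
    GroupName="databricks-privatelink-sg",  # placeholder name
    Description="Databricks PrivateLink ports",
    VpcId=VPC_ID,
)["GroupId"]

port_ranges = [(443, 443), (3306, 3306), (6666, 6666), (2443, 2443), (8443, 8451)]
permissions = [
    {
        "IpProtocol": "tcp",
        "FromPort": low,
        "ToPort": high,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }
    for low, high in port_ranges
]

ec2.authorize_security_group_ingress(GroupId=sg_id, IpPermissions=permissions)
ec2.authorize_security_group_egress(GroupId=sg_id, IpPermissions=permissions)
```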

> warning ""
> Only resources in the `us-east-1` region support Segment's PrivateLink integration.

### Configure PrivateLink for Databricks

To configure PrivateLink for Databricks:
1. Follow the instructions in Databricks' [Enable private connectivity using AWS PrivateLink](https://docs.databricks.com/en/security/network/classic/privatelink.html){:target="_blank”} documentation. You must create a [back-end](https://docs.databricks.com/en/security/network/classic/privatelink.html#private-connectivity-overview){:target="_blank”} connection to integrate with Segment's front-end connection.
2. After you've configured a back-end connection for Databricks, request access to Segment's PrivateLink integration by reaching out to your Customer Success Manager (CSM).
3. Your CSM shares information with you about Segment's AWS Principal.
4. Add Segment's AWS Principal as an Allowed Principal to use the
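
Step 1 is carried out in your own AWS account by following the Databricks documentation. As a rough, hypothetical illustration only: a back-end connection involves interface VPC endpoints in your customer-managed VPC. The boto3 sketch below assumes placeholder IDs, and the Databricks endpoint service names are placeholders you must take from the Databricks PrivateLink documentation linked above.

```python
# Hypothetical sketch of the AWS half of a Databricks back-end connection: two
# interface VPC endpoints in your customer-managed VPC. All IDs and the Databricks
# endpoint-service names are placeholders -- use the real per-region values from
# the Databricks PrivateLink documentation.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

COMMON = dict(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",               # placeholder: customer-managed VPC
    SubnetIds=["subnet-0aaaaaaaaaaaaaaaa"],      # placeholder: PrivateLink subnet(s)
    SecurityGroupIds=["sg-0123456789abcdef0"],   # the security group from the prerequisites
    PrivateDnsEnabled=False,  # see the Databricks docs for whether to enable private DNS
)

# Workspace (REST API) endpoint and secure cluster connectivity (SCC) relay endpoint.
for service_name in [
    "com.amazonaws.vpce.us-east-1.vpce-svc-WORKSPACE-PLACEHOLDER",
    "com.amazonaws.vpce.us-east-1.vpce-svc-SCC-RELAY-PLACEHOLDER",
]:
    resp = ec2.create_vpc_endpoint(ServiceName=service_name, **COMMON)
    # Register these endpoint IDs with Databricks per its PrivateLink instructions.
    print(resp["VpcEndpoint"]["VpcEndpointId"])
```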

## RDS Postgres

> warning ""
> Only resources in the `us-east-1` region support Segment's PrivateLink integration.

1. Create a Network Load Balancer VPC endpoint service using the instructions in the [Create a service powered by AWS PrivateLink](https://docs.aws.amazon.com/vpc/latest/privatelink/create-endpoint-service.html){:target="_blank”} documentation.
2. Reach out to your Customer Success Manager (CSM) for more details about Segment's AWS principal.
3. Add the Segment AWS principal as an “Allowed Principal” to consume the Network Load Balancer VPC endpoint service you created in step 1.
4. Reach out to your CSM and provide them with the name of the service that you created above. Segment's engineering team provisions a VPC endpoint for the service in the Segment Edge VPC.
5. After the VPC endpoint is created, Segment either provides you with private DNS so you can configure the feature in the Segment app or creates an RDS Postgres integration in the Segment app on your behalf. This integration is already configured with the connection settings you need to power AWS PrivateLink. <br> The following RDS Postgres integrations support PrivateLink:
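
Steps 1 and 3 can also be scripted against the EC2 API. The following is a minimal boto3 sketch under assumed placeholder values: the Network Load Balancer ARN is your own, and the allowed principal ARN stands in for the Segment AWS principal your CSM shares in step 2.

```python
# Minimal sketch of RDS Postgres steps 1 and 3 with boto3. The NLB ARN and the Segment
# AWS principal below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Step 1: expose the Network Load Balancer in front of your RDS Postgres instance
# as a VPC endpoint service. AcceptanceRequired is a policy choice; adjust as needed.
service = ec2.create_vpc_endpoint_service_configuration(
    NetworkLoadBalancerArns=[
        "arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/net/rds-nlb/0123456789abcdef"
    ],
    AcceptanceRequired=False,
)["ServiceConfiguration"]

# Step 3: allow Segment's principal to create endpoints against the service.
ec2.modify_vpc_endpoint_service_permissions(
    ServiceId=service["ServiceId"],
    AddAllowedPrincipals=["arn:aws:iam::999999999999:root"],  # placeholder Segment principal
)

# Share this service name (and its region) with your CSM in step 4.
print(service["ServiceName"])
```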

## Redshift

> warning ""
> Only resources in the `us-east-1` region support Segment's PrivateLink integration.

### Prerequisites

- **You're using the RA3 node type**: <br>To access Segment's PrivateLink integration, use one of the following RA3 instance types:
- ra3.16xlarge
- ra3.4xlarge
- ra3.xlplus
- **You've enabled cluster relocation**: Cluster relocation migrates your cluster behind a proxy and keeps the cluster endpoint unchanged, even if your cluster needs to be migrated to a new Availability Zone. A consistent cluster endpoint makes it possible for Segment's Edge account and VPC to remain connected to your cluster. To enable cluster relocation, follow the instructions in the AWS [Relocating your cluster](https://docs.aws.amazon.com/redshift/latest/mgmt/managing-cluster-recovery.html){:target="_blank”} documentation.
- **Your warehouse uses a port in the range 5431-5455 or 8191-8215**: Clusters with cluster relocation enabled [might encounter an error if updated to use a port outside of these ranges](https://docs.aws.amazon.com/redshift/latest/mgmt/managing-cluster-recovery.html#:~:text=You%20can%20change%20to%20another%20port%20from%20the%20port%20range%20of%205431%2D5455%20or%208191%2D8215.%20(Don%27t%20change%20to%20a%20port%20outside%20the%20ranges.%20It%20results%20in%20an%20error.)){:target="_blank”}.
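
You can also verify these prerequisites programmatically. The sketch below is a hypothetical boto3 example with a placeholder cluster identifier; it checks the node type and port, and enables cluster relocation if it isn't already on.

```python
# Hypothetical boto3 sketch that checks the Redshift prerequisites above:
# RA3 node type, a port inside 5431-5455 or 8191-8215, and cluster relocation.
import boto3

redshift = boto3.client("redshift", region_name="us-east-1")
CLUSTER_ID = "my-warehouse"  # placeholder

cluster = redshift.describe_clusters(ClusterIdentifier=CLUSTER_ID)["Clusters"][0]

# RA3 node type check.
assert cluster["NodeType"] in {"ra3.16xlarge", "ra3.4xlarge", "ra3.xlplus"}, cluster["NodeType"]

# Port range check.
port = cluster["Endpoint"]["Port"]
assert 5431 <= port <= 5455 or 8191 <= port <= 8215, port

# Enable cluster relocation if needed.
if cluster.get("AvailabilityZoneRelocationStatus") != "enabled":
    redshift.modify_cluster(ClusterIdentifier=CLUSTER_ID, AvailabilityZoneRelocation=True)
```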

### Configure PrivateLink for Redshift

Implement Segment's PrivateLink integration by taking the following steps:
1. Let your Customer Success Manager (CSM) know that you're interested in PrivateLink. They will share information with you about Segment’s Edge account and VPC.
2. After you receive the details about the Edge account and VPC, [grant cluster access to Segment's Edge account and VPC](https://docs.aws.amazon.com/redshift/latest/gsg/rs-gsg-connect-to-cluster.html){:target="_blank”}.
3. Segment creates a Redshift managed VPC endpoint within a Redshift subnet on your behalf, which creates an internal PrivateLink Endpoint URL. Segment then provides you with this URL.
4. After Segment provides you with the URL, use it to update existing Redshift integrations or create new ones. The following integrations support PrivateLink: