
Commit af8663c

Modify warehouse setup pages to include steps for event tracking (#587)
* redshift event tracking setup instructions
* bigquery event tracking setup instructions
* Event Tracking -> Track API
1 parent 1a889aa commit af8663c

File tree

2 files changed: +29 −2 lines changed

- docs/data-management/connecting-dwh/bigquery.md
- docs/data-management/connecting-dwh/redshift.md


docs/data-management/connecting-dwh/bigquery.md

Lines changed: 8 additions & 1 deletion
@@ -14,7 +14,13 @@ Additionally, you will need to create a data environment for Eppo to write inter
 4. Click **CREATE SERVICE ACCOUNT** in the service accounts header.
 5. Under **Service account details**, add an _account name_, _ID_, and optional _description_.
 6. Click **CREATE**.
-7. Under **Service account permissions**, add the following role: `BigQuery Job User (roles/bigquery.jobUser)`
+7. Under **Service account permissions**, add the following roles:
+   - `BigQuery Job User (roles/bigquery.jobUser)`
+     - Required
+   - `Storage Admin (roles/storage.admin)`
+     - Optional; required for using Eppo's [Track API](/sdks/event-logging/event-tracking)
+     - Scoped to the Storage bucket to use for temporary storage of events before loading into BigQuery
+
 8. Click **CONTINUE**.
 9. (optional) Under **Grant users access** you may choose to grant other users access to your new service account.
 10. Click **CREATE KEY** to create a json [private key](https://cloud.google.com/iam/docs/creating-managing-service-account-keys).
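If it helps to confirm the roles added above before continuing, here is a minimal Python sketch (not part of the commit) that runs a trivial query with the `google-cloud-bigquery` client and writes/deletes a test object with `google-cloud-storage`; the key path and bucket name are hypothetical placeholders.

```python
# Hedged sketch: sanity-check the service account key after granting the roles.
from google.cloud import bigquery, storage

KEY_PATH = "eppo-service-account.json"     # hypothetical path to the downloaded JSON key
STAGING_BUCKET = "my-eppo-staging-bucket"  # hypothetical bucket used for Track API staging

# BigQuery Job User: the account should be able to run a trivial query.
bq = bigquery.Client.from_service_account_json(KEY_PATH)
print(list(bq.query("SELECT 1").result()))

# Storage Admin (scoped to the staging bucket): the account should be able to
# write and delete objects used for temporary event staging.
gcs = storage.Client.from_service_account_json(KEY_PATH)
blob = gcs.bucket(STAGING_BUCKET).blob("eppo-connection-check.txt")
blob.upload_from_string("ok")
blob.delete()
```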
@@ -62,6 +68,7 @@ Now that you have a proper Service Account created for Eppo with adequate privil
 - **BigQuery Dataset** - `eppo_output`
 - **BigQuery Project** - Name of the BQ project to which `eppo_output` belongs
 - **BigQuery Region** - The region in which you created the `eppo_output` dataset
+- **Storage Bucket (Optional)** - Cloud Storage bucket to use for staging of events logged with Eppo's [Track API](/sdks/event-logging/event-tracking) before inserting into BigQuery. Files will be automatically deleted from this bucket after insertion into BigQuery.

 5. Enter the values into the form (which should look like the screenshot below), then click `Test Connection`. Once this test succeeds, save your settings by clicking `Test and Save Connection`.
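Since the form expects the `eppo_output` dataset to exist in the region you enter, here is one hedged way it could be created with the `google-cloud-bigquery` client; this is an illustration only, and the project ID and location are placeholders.

```python
# Hypothetical sketch: create the `eppo_output` dataset referenced in the form.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")   # assumes default credentials; placeholder project
dataset = bigquery.Dataset("my-gcp-project.eppo_output")
dataset.location = "US"                              # placeholder; must match the BigQuery Region entered
client.create_dataset(dataset, exists_ok=True)       # no-op if the dataset already exists
```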

docs/data-management/connecting-dwh/redshift.md

Lines changed: 21 additions & 1 deletion
@@ -16,7 +16,7 @@ For Eppo to connect to your Redshift database, you’ll need to allow our inboun
 6. Click **Add Rule** to add a new Inbound Rule.
    a. Set the Type to **Redshift**.
    b. Adjust the Port, if needed.
-   c. Enter the following into the Source field:
+   c. Enter the following into the Source field:

 | IP Address |
 | --- |
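For teams that manage security groups in code rather than the console, the inbound rule above could look roughly like the boto3 sketch below; this is not part of the commit, and the region, security group ID, and CIDR are placeholders (substitute the Eppo IP address from the table).

```python
# Hedged sketch: add the Redshift inbound rule for Eppo with boto3.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")   # placeholder region
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",                   # placeholder security group
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 5439,                             # default Redshift port; adjust if needed
        "ToPort": 5439,
        "IpRanges": [{
            "CidrIp": "203.0.113.10/32",              # placeholder for Eppo's published IP
            "Description": "Eppo inbound",
        }],
    }],
)
```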
@@ -105,6 +105,25 @@ You'll want to gather the following connection details from Redshift:

 You can also find your **Database Name** under the **Database configurations** section of the Properties tab.

+#### (Optional) Event tracking
+
+To use Eppo's [Event Tracking](/sdks/event-logging/event-tracking) with Redshift, additional configuration is required:
+
+- **S3 Bucket**: Eppo will write events to this bucket before bulk inserting into Redshift.
+
+  Files will be automatically deleted from this bucket after insertion into Redshift.
+
+- **AWS Region**: The region the Redshift cluster resides in
+
+- **Access Key ID**: Credentials of the service account Eppo can use to upload files to the S3 bucket
+
+- **Secret Access Key**: Credentials of the service account Eppo can use to upload files to the S3 bucket
+
+- **AWS IAM Role**: IAM role to use when running `COPY` operations to load data from S3 into the Redshift instance.
+
+  This role needs permissions to `LIST` the contents of the above S3 bucket as well as `GET` objects within the bucket.
+
 #### (Optional) SSH Tunnel

 Eppo supports connecting to a Redshift cluster over an SSH tunnel.
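The `LIST`/`GET` requirement on the IAM role described in the event-tracking additions above corresponds roughly to an S3 policy like the following. This is a hedged sketch, not from the docs being changed; the role name, policy name, and bucket are placeholders.

```python
# Hedged sketch: attach an inline policy allowing the COPY role to list the
# staging bucket and read objects from it.
import json
import boto3

BUCKET = "my-eppo-event-staging"   # placeholder S3 bucket used for event staging

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "s3:ListBucket",
         "Resource": f"arn:aws:s3:::{BUCKET}"},
        {"Effect": "Allow", "Action": "s3:GetObject",
         "Resource": f"arn:aws:s3:::{BUCKET}/*"},
    ],
}

iam = boto3.client("iam")
iam.put_role_policy(
    RoleName="eppo-redshift-copy-role",       # placeholder role attached to the Redshift cluster
    PolicyName="eppo-event-staging-read",
    PolicyDocument=json.dumps(policy),
)
```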
@@ -139,6 +158,7 @@ Now that you have a proper User created for Eppo with adequate privileges, you c
 - **Database name** - **Database name** from [previous section](#gather-redshift-connection-details)
 - **Schema name** - `eppo_output`
 - **Port** - **Database port** from [previous section](#gather-redshift-connection-details)
+- **[Optional] Event Tracking Configuration** - values from [previous section](#gather-redshift-connection-details)

 4. Enter the values into the form (which should look like the screenshot below), then click `Test Connection`. Once this test succeeds, save your settings by clicking `Test and Save Connection`.
