docs/data-management/connecting-dwh/bigquery.md (+8, -1)
@@ -14,7 +14,13 @@ Additionally, you will need to create a data environment for Eppo to write inter
4. Click **CREATE SERVICE ACCOUNT** in the service accounts header.
5. Under **Service account details**, add an _account name_, _ID_, and optional _description_.
6. Click **CREATE**.
-7. Under **Service account permissions**, add the following role: `BigQuery Job User (roles/bigquery.jobUser)`
+7. Under **Service account permissions**, add the following roles:
+   - `BigQuery Job User (roles/bigquery.jobUser)`
+     - Required
+   - `Storage Admin (roles/storage.admin)`
+     - Optional; required for using Eppo's [Track API](/sdks/event-logging/event-tracking)
+     - Scoped to the Storage bucket to use for temporary storage of events before loading into BigQuery
+
8. Click **CONTINUE**.
9. (optional) Under **Grant users access** you may choose to grant other users access to your new service account.
10. Click **CREATE KEY** to create a json [private key](https://cloud.google.com/iam/docs/creating-managing-service-account-keys).
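
For teams that prefer the command line over the Cloud console, the steps above map roughly onto the following `gcloud` sketch. The project ID (`my-project`), service account name (`eppo-service-account`), staging bucket (`gs://my-eppo-staging`), and key file name are placeholders rather than values from the docs; adjust them to your environment.

```bash
# Create the service account Eppo will use (names are placeholders).
gcloud iam service-accounts create eppo-service-account \
  --project=my-project \
  --display-name="Eppo"

# Required: let the account run BigQuery jobs in the project.
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:eppo-service-account@my-project.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"

# Optional (Track API only): Storage Admin scoped to the staging bucket, not the project.
gcloud storage buckets add-iam-policy-binding gs://my-eppo-staging \
  --member="serviceAccount:eppo-service-account@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.admin"

# Create the JSON private key to upload in Eppo's connection form.
gcloud iam service-accounts keys create eppo-key.json \
  --iam-account=eppo-service-account@my-project.iam.gserviceaccount.com
```

Granting `roles/storage.admin` on the bucket rather than on the project keeps the optional Track API permission scoped as narrowly as the added docs line describes.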
@@ -62,6 +68,7 @@ Now that you have a proper Service Account created for Eppo with adequate privil
- **BigQuery Dataset** - `eppo_output`
- **BigQuery Project** - Name of the BQ project to which `eppo_output` belongs
- **BigQuery Region** - The region in which you created the `eppo_output` dataset
+- **Storage Bucket (Optional)** - Cloud Storage bucket to use for staging of events logged with Eppo's [Track API](/sdks/event-logging/event-tracking) before inserting into BigQuery. Files will be automatically deleted from this bucket after insertion into BigQuery.

5. Enter the values into the form (which should look like the screenshot below), then click `Test Connection`. Once this test succeeds, save your settings by clicking `Test and Save Connection`.
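
If the `eppo_output` dataset and the optional staging bucket do not exist yet, one way to create them is sketched below. The project ID, location (`US`), and bucket name are assumptions; use the same region you plan to enter in the connection form.

```bash
# Create the eppo_output dataset in the region you will enter in the form.
bq --location=US mk --dataset my-project:eppo_output

# Optional: create the Cloud Storage bucket used to stage Track API events.
gcloud storage buckets create gs://my-eppo-staging \
  --project=my-project \
  --location=US
```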
docs/data-management/connecting-dwh/redshift.md (+21, -1)
@@ -16,7 +16,7 @@ For Eppo to connect to your Redshift database, you’ll need to allow our inboun
6. Click **Add Rule** to add a new Inbound Rule.
   a. Set the Type to **Redshift**.
   b. Adjust the Port, if needed.
-  c. Enter the following into the Source field:
+  c. Enter the following into the Source field:

| IP Address |
| --- |
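
The inbound rule from step 6 can also be added with the AWS CLI. This is a minimal sketch, assuming the default Redshift port `5439` and a placeholder security group ID; substitute each IP address from the table for the placeholder and repeat the command per address.

```bash
# sg-0123456789abcdef0 is a placeholder; EPPO_IP is one address from the table above.
EPPO_IP="<address from table>"
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 5439 \
  --cidr "${EPPO_IP}/32"
```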
@@ -105,6 +105,25 @@ You'll want to gather the following connection details from Redshift:

You can also find your **Database Name** under the **Database configurations** section of the Properties tab.

+#### (Optional) Event tracking
+
+To use Eppo's [Event Tracking](/sdks/event-logging/event-tracking) with Redshift, additional configuration is required:
+
+- **S3 Bucket**: Eppo will write events to this bucket before bulk inserting into Redshift.
+
+  Files will be automatically deleted from this bucket after insertion into Redshift.
+
+- **AWS Region**: The region the Redshift cluster resides in
+
+- **Access Key ID**: Credentials of the service account Eppo can use to upload files to the S3 bucket
+
+- **Secret Access Key**: Credentials of the service account Eppo can use to upload files to the S3 bucket
+
+- **AWS IAM Role**: IAM role to use when running `COPY INTO` operations to load data from S3 into the Redshift instance.
+
+  This role needs permissions to `LIST` the contents of the above S3 bucket as well as `GET` objects within the S3 bucket.
+
+
#### (Optional) SSH Tunnel
Eppo supports connecting to a Redshift cluster over an SSH tunnel.
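
Returning to the event tracking requirements above: one way to express the IAM role's S3 permissions (list the staging bucket, read its objects) is an inline policy attached to the role. This is a sketch with placeholder role, policy, and bucket names, not a prescribed configuration; the role also needs to be associated with the Redshift cluster so it can be referenced when loading staged files.

```bash
# Attach an inline policy granting LIST on the bucket and GET on its objects.
# Role, policy, and bucket names are placeholders.
aws iam put-role-policy \
  --role-name eppo-redshift-copy-role \
  --policy-name eppo-s3-staging-read \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": ["s3:ListBucket"],
        "Resource": "arn:aws:s3:::my-eppo-events-bucket"
      },
      {
        "Effect": "Allow",
        "Action": ["s3:GetObject"],
        "Resource": "arn:aws:s3:::my-eppo-events-bucket/*"
      }
    ]
  }'
```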
@@ -139,6 +158,7 @@ Now that you have a proper User created for Eppo with adequate privileges, you c
- **Database name** - **Database name** from [previous section](#gather-redshift-connection-details)
- **Schema name** - `eppo_output`
- **Port** - **Database port** from [previous section](#gather-redshift-connection-details)
+- **[Optional] Event Tracking Configuration** - values from [previous section](#gather-redshift-connection-details)

4. Enter the values into the form (which should look like the screenshot below), then click `Test Connection`. Once this test succeeds, save your settings by clicking `Test and Save Connection`.