Commit b995901

Merge pull request #6221 from segmentio/master: `master` back to `develop`
2 parents ffd4bab + 5475c8b

File tree: 6 files changed, +185 −10 lines changed


src/_data/sidenav/main.yml

Lines changed: 4 additions & 0 deletions

@@ -185,6 +185,8 @@ sections:
 - section_title: Reverse ETL Source Setup Guides
   slug: connections/reverse-etl/reverse-etl-source-setup-guides
   section:
+  - path: /connections/reverse-etl/reverse-etl-source-setup-guides/azure-setup
+    title: Azure Reverse ETL Setup
   - path: /connections/reverse-etl/reverse-etl-source-setup-guides/bigquery-setup
     title: BigQuery Reverse ETL Setup
   - path: /connections/reverse-etl/reverse-etl-source-setup-guides/databricks-setup
@@ -358,6 +360,8 @@ sections:
   title: Profile Debugger
 - path: /unify/insights
   title: Profiles Insights
+- path: /unify/csv-upload
+  title: CSV Upload
 - path: /unify/unify-gdpr
   title: Unify and GDPR
 - path: /unify/faqs

src/connections/reverse-etl/index.md

Lines changed: 9 additions & 6 deletions

@@ -38,9 +38,10 @@ To add your warehouse as a source:
 > You need to be a user that has both read and write access to the warehouse.

 1. Navigate to **Connections > Sources** and select the **Reverse ETL** tab in the Segment app.
-2. Click **Add Reverse ETL source**.
+2. Click **+ Add Reverse ETL source**.
 3. Select the source you want to add.
 4. Follow the corresponding setup guide for your Reverse ETL source.
+   * [Azure Reverse ETL setup guide](/docs/connections/reverse-etl/reverse-etl-source-setup-guides/azure-setup/)
    * [BigQuery Reverse ETL setup guide](/docs/connections/reverse-etl/reverse-etl-source-setup-guides/bigquery-setup/)
    * [Databricks Reverse ETL setup guide](/docs/connections/reverse-etl/reverse-etl-source-setup-guides/databricks-setup/)
    * [Postgres Reverse ETL setup guide](/docs/connections/reverse-etl/reverse-etl-source-setup-guides/postgres-setup/)
@@ -49,7 +50,7 @@ To add your warehouse as a source:
 5. Add the account information for your source.
    * For Snowflake users: Learn more about the Snowflake Account ID [here](https://docs.snowflake.com/en/user-guide/admin-account-identifier.html){:target="_blank"}.
 5. Click **Test Connection** to test to see if the connection works.
-6. Click **Create Source** if the test connection is successful.
+6. Click **Add source** if the test connection is successful.

 After you add your data warehouse as a source, you can [add a model](#step-2-add-a-model) to your source.

@@ -89,9 +90,6 @@ To add your first destination:
 ### Step 4: Create mappings
 After you’ve added a destination, you can create mappings from your warehouse to the destination. Mappings enable you to map the data you extract from your warehouse to the fields in your destination.

-> info ""
-> When you add new mappings to an existing model, Segment only syncs changes that have transpired since the last sync, not the entire dataset. For a comprehensive data synchronization, Segment recommends that you first recreate the model, then establish a one-to-one mapping with the new model. This ensures that all data syncs effectively.
-
 To create a mapping:
 1. Navigate to **Connections > Destinations** and select the **Reverse ETL** tab.
 2. Select the destination that you want to create a mapping for.
@@ -151,7 +149,12 @@ You can opt in to receive email alerts regarding notifications for Reverse ETL.
 To subscribe to email alerts:
 1. Navigate to **Settings > User Preferences**.
 2. Select **Reverse ETL** in the **Activity Notifications** section.
-3. Click the toggle for **Reverse ETL Sync Failed** to receive notifications when your Reverse ETL sync fails.
+3. Click the toggle on for the notifications you want to receive. You can choose from:
+
+   Notification | Details
+   ------ | -------
+   Reverse ETL Sync Failed | Set the toggle on to receive a notification when your Reverse ETL sync fails.
+   Reverse ETL Sync Partial Success | Set the toggle on to receive a notification when your Reverse ETL sync is partially successful.

 ### Edit your model

src/connections/reverse-etl/reverse-etl-source-setup-guides/azure-setup.md

Lines changed: 79 additions & 0 deletions

@@ -0,0 +1,79 @@
---
title: Azure Reverse ETL Setup
---

> info "Public beta"
> The Azure source for Reverse ETL is in public beta and is governed by Segment's [First Access and Beta Preview Terms](https://www.twilio.com/en-us/legal/tos){:target="_blank"}.

Set up Azure as your Reverse ETL source.

At a high level, when you set up Azure dedicated SQL pools for Reverse ETL, the configured user needs read permissions for any resources (databases, schemas, tables) the query needs to access. Segment keeps track of changes to your query results with a managed schema (`__SEGMENT_REVERSE_ETL`), which requires the configured user to have write permissions for that schema.

## Required permissions
Make sure the user you use to connect to Segment has permissions to use that warehouse. You can follow the process below to set up a new user with sufficient permissions for Segment’s use.
* To create a login in your master database, run:

```sql
CREATE LOGIN <login name of your choice> WITH PASSWORD = 'Str0ng_password'; -- password of your choice
```

> info ""
> Execute the commands below in the database where your data resides.

* To create a user for Segment, run:

```sql
CREATE USER <user name of your choice> FOR LOGIN <login name of your choice>;
```

* To grant the user access to read data from all schemas in the database, run:

```sql
EXEC sp_addrolemember 'db_datareader', '<user name of your choice>';
```

* To grant Segment access to read from certain schemas only, run:

```sql
CREATE ROLE <role name of your choice>;
GRANT SELECT ON SCHEMA::[schema_name] TO <role name of your choice>;
EXEC sp_addrolemember '<role name of your choice>', '<user name of your choice>';
```

* To grant Segment access to create a schema to keep track of the running syncs, run:

```sql
GRANT CREATE SCHEMA TO <user name of your choice>;
```

* If you want to create the schema yourself and then give Segment access to it, run:

```sql
CREATE SCHEMA __segment_reverse_etl;
GRANT CONTROL ON SCHEMA::__segment_reverse_etl TO <user name of your choice>;
GRANT CREATE TABLE ON DATABASE::[database_name] TO <user name of your choice>;
```
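Taken together, the steps above create one login, one user, read access (database-wide or scoped to specific schemas), and permission to create the sync-tracking schema. As a rough sketch of that sequence — the helper function and all names below are hypothetical, not Segment tooling — the statements can be templated like this:

```python
def segment_retl_grants(login: str, user: str, role: str, schemas: list[str]) -> list[str]:
    """Template the T-SQL statements from the setup steps above.

    Assumes the scoped-read variant: a custom role granted SELECT per
    schema, plus permission to create the sync-tracking schema.
    """
    stmts = [
        f"CREATE LOGIN {login} WITH PASSWORD = '<strong password>';",
        f"CREATE USER {user} FOR LOGIN {login};",
        f"CREATE ROLE {role};",
    ]
    # One SELECT grant per schema the Reverse ETL query reads from.
    for schema in schemas:
        stmts.append(f"GRANT SELECT ON SCHEMA::[{schema}] TO {role};")
    stmts += [
        f"EXEC sp_addrolemember '{role}', '{user}';",
        f"GRANT CREATE SCHEMA TO {user};",
    ]
    return stmts

# Example: scoped read access to two schemas for a hypothetical user.
for stmt in segment_retl_grants("segment_login", "segment_user", "segment_reader", ["dbo", "analytics"]):
    print(stmt)
```

Run the emitted statements in the order shown; the `CREATE LOGIN` belongs in the master database and the rest in the database where your data resides, per the note above.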
## Set up guide
To set up Azure as your Reverse ETL source:
1. Log in to your Azure account.
2. Navigate to your **dedicated SQL pool**. Segment supports both dedicated SQL pool (formerly SQL DW) and dedicated SQL pool in Synapse workspace.
3. Navigate to **Settings > Connection strings** and select the **JDBC** tab to find the server, port, and database name.
4. Open [your Segment workspace](https://app.segment.com/workspaces){:target="_blank"}.
5. Navigate to **Connections > Sources** and select the **Reverse ETL** tab.
6. Click **+ Add Reverse ETL source**.
7. Select **Azure** and click **Add Source**.
8. Enter the configuration settings for your Azure source based on the information from step 3.
    * Hostname:
        * Use `xxxxxxx.sql.azuresynapse.net` if you’re connecting to a dedicated SQL pool in Synapse workspace.
        * Use `xxxxxxx.database.windows.net` if you’re connecting to a dedicated SQL pool (formerly SQL DW).
    * Port: `1433` (default)
    * Database name: The name of your dedicated SQL pool.
    * Username: The login name you created with `CREATE LOGIN` in the [required permissions](#required-permissions) section.
    * Password: The password that's associated with the login name.
9. Click **Test Connection** to see if the connection works. If the connection fails, make sure you have the right permissions and credentials, then try again.
10. Click **Add source** if the test connection is successful.

After you've successfully added your Azure source, [add a model](/docs/connections/reverse-etl/#step-2-add-a-model) and follow the rest of the steps in the Reverse ETL setup guide.
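The two hostname patterns in step 8 can be told apart mechanically. A small sketch (the function is hypothetical, not part of any Segment or Azure SDK):

```python
import re

def classify_azure_host(hostname: str) -> str:
    """Classify an Azure SQL hostname per the two patterns in step 8.

    Only distinguishes the Synapse-workspace suffix from the standalone
    dedicated pool suffix; anything else is rejected.
    """
    if re.fullmatch(r"[\w-]+\.sql\.azuresynapse\.net", hostname):
        return "dedicated SQL pool in Synapse workspace"
    if re.fullmatch(r"[\w-]+\.database\.windows\.net", hostname):
        return "dedicated SQL pool (formerly SQL DW)"
    raise ValueError(f"unrecognized Azure SQL hostname: {hostname}")

print(classify_azure_host("mypool.sql.azuresynapse.net"))
# → dedicated SQL pool in Synapse workspace
```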

src/connections/reverse-etl/reverse-etl-source-setup-guides/bigquery-setup.md

Lines changed: 2 additions & 2 deletions

@@ -26,9 +26,9 @@ To set up the Segment BigQuery connector:
 14. Navigate to the Segment UI and paste all the credentials you copied from step 13 into the **Enter your credentials** section.
 19. Enter your **Data Location**.
 20. Click **Test Connection** to test to see if the connection works. If the connection fails, make sure you have the right permissions and credentials and try again.
-6. Click **Create Source** if the test connection is successful.
+6. Click **Add source** if the test connection is successful.

-Once you've added BigQuery as a source, you can [add a model](/docs/connections/reverse-etl#step-2-add-a-model).
+After you've added BigQuery as a source, you can [add a model](/docs/connections/reverse-etl#step-2-add-a-model).

 ## Constructing your own role or policy
 When you construct your own role or policy, Segment needs the following permissions:

src/connections/reverse-etl/reverse-etl-source-setup-guides/databricks-setup.md

Lines changed: 2 additions & 2 deletions

@@ -39,7 +39,7 @@ To set up Databricks as your Reverse ETL source:
 4. Select **SQL Warehouses** and select the warehouse you want to use. Note that Segment doesn't support the `Compute` connection parameters.
 5. Go to the **Connection details** tab and **keep** this page open.
 6. Open [your Segment workspace](https://app.segment.com/workspaces){:target="_blank"}.
-7. Navigate to **Connections > Sources > Reverse ETL**.
+7. Navigate to **Connections > Sources** and select the **Reverse ETL** tab.
 8. Click **+ Add Reverse ETL source**.
 9. Select **Databricks** and click **Add Source**.
 10. Enter the configuration settings for your Databricks source based on the information from step 5.
@@ -49,7 +49,7 @@ To set up Databricks as your Reverse ETL source:
    * Token: `<your-token>`
    * Catalog [optional]: `hive_metastore` (default)
 11. Click **Test Connection** to see if the connection works. If the connection fails, make sure you have the right permissions and credentials, then try again.
-12. Click **Create Source** if the test connection is successful.
+12. Click **Add source** if the test connection is successful.

 > info ""
 > To generate a token, follow the steps listed in the [Databricks docs](https://docs.databricks.com/dev-tools/auth.html#pat){:target="_blank"}. Segment recommends you create a token with no expiration date by leaving the lifetime field empty when creating it. If you already have a token with an expiration date, be sure to keep track of the date and renew it on time.

src/unify/csv-upload.md

Lines changed: 89 additions & 0 deletions

@@ -0,0 +1,89 @@
---
title: Add or Update Profiles and Traits with a CSV
plan: unify
---
You can use the Profiles CSV Uploader to add or update user profiles and traits. This page contains guidelines for your CSV upload and explains how to upload a CSV file to Unify.

## CSV file upload guidelines

Keep the following guidelines in mind as you upload CSV files:

- You can only upload `.csv` files.
- Files can't be empty and must have at least one header and one row.
- You can't have multiple columns with the same header.
- CSV files cannot exceed 1 million rows (plus one header row), 299 columns, or 100 MB in file size.
- You can only upload one file at a time.
- Add an identifier column or `anonymous_id` in your identity resolution configuration.
- Leave any unknown values blank to avoid bad data. Segment can create a user profile from a single identifier in your CSV.
- The template won't include duplicate custom traits; traits with trailing, leading, or multiple consecutive spaces between characters; or [unallowed characters](#allowed-csv-file-characters).
- Custom trait column headers are case-sensitive. For example, `first Name`, `FIRST Name`, and `First Name` would all be different traits in the template.
- Trailing, leading, or multiple consecutive spaces between characters are not allowed.
- The CSV uploader shares [Unify product limits](/docs/unify/product-limits/).
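Several of these structural limits can be checked locally before uploading. A minimal sketch, using only the limits listed above; `validate_csv` is our name, not a Segment API:

```python
import csv
import io

MAX_ROWS = 1_000_000   # data rows, excluding the header row
MAX_COLUMNS = 299

def validate_csv(text: str) -> list[str]:
    """Check CSV text against the structural upload guidelines above."""
    errors = []
    rows = list(csv.reader(io.StringIO(text)))
    if not rows or not any(cell.strip() for row in rows for cell in row):
        return ["file is empty"]
    headers = rows[0]
    if len(headers) > MAX_COLUMNS:
        errors.append(f"too many columns: {len(headers)} > {MAX_COLUMNS}")
    if len(rows) - 1 > MAX_ROWS:
        errors.append(f"too many rows: {len(rows) - 1} > {MAX_ROWS}")
    if len(rows) < 2:
        errors.append("need at least one header row and one data row")
    if len(set(headers)) != len(headers):
        errors.append("duplicate headers")
    for h in headers:
        # No leading, trailing, or consecutive spaces in headers.
        if h != h.strip() or "  " in h:
            errors.append(f"bad spacing in header: {h!r}")
    return errors

print(validate_csv("email,first Name\na@example.com,Ann\n"))  # prints []
```

This covers only the size and header rules; it does not check the 100 MB limit or the allowed-character set.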
## Upload a CSV file

Use the **Upload CSV** page to upload a CSV file in your Segment space:

1. Navigate to **Unify > Profile explorer** or **Engage > Audiences > Profile explorer**.
2. Click **+Add Profiles**.
3. Download and fill out the CSV template.
4. Upload your CSV file.

### 1. Download your CSV template

Click **Download Template** to download a CSV template with identifier columns from your identity resolution configuration.

### 2. Fill out your CSV file

Enter values for the identifiers in your CSV file.

### 3. Upload your CSV file

You can upload a CSV file in two ways:
- Drag and drop the CSV file in the dropzone.
- Click **Browse** to locate the CSV file.

## Work with the CSV template

Keep the following in mind as you fill out your CSV template.

### Allowed CSV file characters

You can use these characters in your CSV file:

- Alphabetic English characters in both upper and lower case
- The numerals 0-9
- These special characters: ```!@#$%^&*()_+-=[]{}:\\|.`~<>\/?```
- The following non-English characters:

```àáâäǎæãåāçćčċďðḍèéêëěẽēėęğġgg͟hħḥh̤ìíîïǐĩīıįķk͟hłļľl̥ṁm̐òóôöǒœøõōřṛr̥ɽßşșśšṣs̤s̱sțťþṭt̤ʈùúûüǔũūűůŵýŷÿźžżẓz̤ÀÁ
ÄǍÆÃÅĀÇĆČĊĎÐḌÈÉÊËĚẼĒĖĘĞĠGG͟HĦḤH̤ÌÍÎÏǏĨĪIĮĶK͟HŁĻĽL̥ṀM̐ÒÓÔÖǑŒØÕŌŘṚR̥ɌSẞŚŠŞȘṢS̤S̱ȚŤÞṬT̤ƮÙÚÛÜǓŨŪŰŮŴÝŶŸŹŽŻẒZ```
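The allow-list above can be turned into a local pre-check. This sketch covers only the English alphanumerics and the listed special characters, deliberately omitting the long non-English allow-list, so it under-approximates what the uploader accepts:

```python
import string

# Partial allow-list: English letters, digits, space, and the special
# characters listed above. The non-English characters are omitted here,
# so this check may flag values the uploader would actually accept.
ALLOWED = set(string.ascii_letters + string.digits + "!@#$%^&*()_+-=[]{}:\\|.`~<>/? ")

def disallowed_chars(value: str) -> set[str]:
    """Return the characters in `value` outside the partial allow-list."""
    return set(value) - ALLOWED

print(sorted(disallowed_chars("first Name;café")))  # → [';', 'é']
```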
## View Update History

Use the Update History page to view CSV file uploads in your workspace over the last 30 days.

To view the Update History page:

1. Navigate to **Unify > Profile explorer** or **Engage > Audiences > Profile explorer**.
2. Click **View update history**.

### Validation errors

The following table lists validation errors you may run into with your profiles and traits CSV upload:

| Error | Error Message |
| ----- | ------------- |
| Invalid file types | You can upload only .csv files. Change your file format, then try again. |
| Empty files | This file contains no data. Add data to your CSV, then try again. |
| CSV parsing error | We encountered an issue while parsing your CSV file. Validate the CSV file and try again. |
| Unexpected/fallback | Something went wrong. Try again later. |
| Empty header row | This file contains empty header(s). Remove the empty header(s), then try again. |
| File exceeds one million rows | Too many rows. You can upload up to 1000000 rows. |
| File exceeds 299 columns | Your CSV file is exceeding the limit of 299 columns. |
| File exceeds 100 MB | Files can be up to 100 MB. |
| File contains a header with unallowed spaces | This file contains leading, trailing or consecutive spaces. Remove leading, trailing or consecutive spaces, then try again. |
| File contains duplicate headers | This file contains duplicate header(s). Remove duplicate header(s), then try again. |
| File contains invalid characters | This file contains invalid character(s). Remove invalid character(s), then try again. |
| Unconfigured `anonymous_id` or missing identifier column | This file is missing an identifier column and does not have `anonymous_id` configured. Add an identifier column or add `anonymous_id` in your identity resolution configuration, then try again. |
