**`src/connections/functions/environment.md`** — 13 additions, 0 deletions
@@ -35,6 +35,19 @@
If you're editing an existing function, you can **Save** changes without changing…
You can also choose **Save & Deploy** to push changes to all deployed functions in your workspace, or only to specific ones. You might need additional permissions to deploy these changes.
## Testing a function
You can test your function's code either with a sample event or by loading a default event that you customize yourself.
* **Sample event**: When you click **Test with custom event**, you can select a sample event from any of your workspace sources to test this function.
* **Customize the event yourself**: When you click **customize the event yourself**, a default event payload loads, which you can modify with the data you want. You can paste in a JSON event, or click **Manual Mode** and type in the fields manually. To locate a recent event from a source that isn't available through the sample event flow:
1. Navigate to the source debugger.
2. Click the event you want to test and copy the raw JSON payload.
3. Paste the raw JSON payload into your Function Editor, as in the sketch below.
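
For example, a minimal Track event payload you could paste into the Function Editor looks like the following. All of the values here are illustrative; substitute the fields your function actually reads:

```json
{
  "type": "track",
  "event": "Order Completed",
  "userId": "user-1234",
  "properties": {
    "orderId": "order-5678",
    "total": 42.5,
    "currency": "USD"
  },
  "timestamp": "2024-01-01T00:00:00.000Z"
}
```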
Once the payload you want to test is ready, click **Run**.
> info ""
> If you create settings in your function, you need to fill in the setting values before clicking **Run**.
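
As a minimal sketch of why those values matter, here is a destination function that reads a setting before sending data. The `apiKey` setting and the endpoint URL are hypothetical; a test run with an empty `apiKey` would send an invalid header:

```js
// Sketch of a destination function that depends on a setting.
// "apiKey" is a hypothetical setting created in the Function Editor;
// fill in its value before clicking Run, or the request will fail.
async function onTrack(event, settings) {
  const response = await fetch('https://example.com/track', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${settings.apiKey}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify(event)
  });
  if (!response.ok) {
    // Ask Segment to retry transient failures.
    throw new RetryError(`Request failed with status ${response.status}`);
  }
}
```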
**`src/connections/reverse-etl/index.md`** — 5 additions, 0 deletions
@@ -298,3 +298,8 @@
Column count | The maximum number of columns a single sync will process. | 512 columns
Column name length | The maximum length of a record column name. | 128 characters
Record JSON size | The maximum size for a record when converted to JSON (some of this limit is used by Segment). | 512 KiB
Column JSON size | The maximum size of any single column value. | 128 KiB
## FAQs
#### Why do my sync results show *No records extracted* when I select *Updated records* after I enable the mapping?
This is expected behavior: when you select **Updated records**, no records qualify until values actually change after the first sync. During the first sync, the Reverse ETL system takes a snapshot of all the query results and creates records in the `_segment_reverse_etl` schema; at that point, every record counts as an *Added record* rather than an *Updated record*. Records only meet the *Updated records* condition when their underlying values change after the first sync completes.
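
Conceptually, the updated-records check behaves like a diff against the snapshot from the previous sync. The SQL below is only an illustration of that idea, not Segment's actual implementation, and the table and column names are hypothetical:

```sql
-- Illustrative only: a record counts as "updated" when a previously
-- snapshotted row's checksum no longer matches the current query results.
SELECT cur.record_id
FROM   model_results     AS cur
JOIN   previous_snapshot AS prev
  ON   prev.record_id = cur.record_id
WHERE  prev.checksum <> cur.checksum;
-- On the first sync there is no prior snapshot, so no row can satisfy
-- this condition and every row is treated as "added" instead.
```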
**`src/connections/reverse-etl/reverse-etl-source-setup-guides/databricks-setup.md`** — 13 additions, 8 deletions
@@ -36,17 +36,22 @@
To set up Databricks as your Reverse ETL source:
1. Log in to your Databricks account.
2. Navigate to **Workspaces** and select the workspace you want to use.
3. Select **SQL** in the main navigation.
4. Select **SQL Warehouses**, then choose the warehouse you want to use. Note that Segment doesn't support the `Compute` connection parameters.
5. Go to the **Connection details** tab and **keep** this page open.
6. Open [your Segment workspace](https://app.segment.com/workspaces){:target="_blank"}.
7. Navigate to **Connections > Sources > Reverse ETL**.
8. Click **+ Add Reverse ETL source**.
9. Select **Databricks** and click **Add Source**.
10. Enter the configuration settings for your Databricks source, based on the information from step 5:
* Hostname: `adb-xxxxxxx.azuredatabricks.net`
* Http Path: `/sql/1.0/warehouses/xxxxxxxxx`
* Port: `443` (default)
* Token: `<your-token>`
* Catalog [optional]: `hive_metastore` (default)
11. Click **Test Connection** to see if the connection works. If the connection fails, make sure you have the right permissions and credentials, then try again.
12. Click **Create Source** if the test connection is successful.
> info ""
> To generate a token, follow the steps listed in the [Databricks docs](https://docs.databricks.com/dev-tools/auth.html#pat){:target="_blank"}. Segment recommends you create a token with no expiration date by leaving the lifetime field empty when creating it. If you already have a token with an expiration date, be sure to keep track of the date and renew it on time.
Once you've successfully added your Databricks source, [add a model](/docs/connections/reverse-etl/#step-2-add-a-model) and follow the rest of the steps in the Reverse ETL setup guide.
|`enrichment`| Executes as the first level of event processing. These plugins modify an event. |
|`destination`| Executes as events begin to pass off to destinations. <br><br> This doesn't modify the event outside of the specific destination, and failure doesn't halt the execution. |
|`after`| Executes after all event processing completes. You can use this to perform cleanup operations. <br><br>An example of this is the [Segment.io Plugin](https://github.com/segmentio/analytics-next/blob/master/packages/browser/src/plugins/segmentio/index.ts){:target="_blank"} which waits for destinations to succeed or fail so it can send observability metrics. |
|`utility`| Executes once during bootstrap, giving you an outlet to modify how Analytics.js works internally. This allows you to augment Analytics.js functionality. |
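
As a sketch of the shape an `enrichment` plugin takes, the following registers a plugin that stamps every track event with an extra property before destinations see it. The plugin name, property, and version are made up; the structure follows the Analytics.js plugin interface:

```js
// Minimal enrichment plugin sketch: adds a property to every track event.
const appVersionEnrichment = {
  name: 'App Version Enrichment', // hypothetical plugin name
  type: 'enrichment',
  version: '1.0.0',

  isLoaded: () => true,
  load: () => Promise.resolve(),

  // Enrichment plugins receive the event context and must return it.
  track: (ctx) => {
    ctx.updateEvent('properties.appVersion', '1.2.3');
    return ctx;
  }
};

// Register the plugin on your Analytics.js instance.
await analytics.register(appVersionEnrichment);
```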