`src/connections/destinations/catalog/actions-kafka/index.md` (1 addition, 1 deletion)
```diff
@@ -100,4 +100,4 @@ The **Send** Action provides multiple ways to specify which Partition an event s
 
 ### What is the "SSL - Reject Unauthorized Certificate Authority" field for?
 
-This field specifies if Segment should reject server connections when a certificate is notsigned by a trusted Certificate Authority (CA). This can be useful for testing purposes or when using a self-signed certificate.
+This field specifies if Segment should reject server connections when a certificate is not signed by a trusted Certificate Authority (CA). This can be useful for testing purposes or when using a self-signed certificate.
```
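For context on what this field toggles on the client side, here's a minimal sketch. In Node's TLS layer (which JavaScript Kafka clients such as kafkajs build on), CA verification is controlled by the `rejectUnauthorized` flag; the kafkajs usage shown in the comment is an assumption for illustration, not Segment's documented internals.

```javascript
// `rejectUnauthorized: true` rejects certificates not signed by a trusted CA;
// `false` accepts self-signed chains (useful for testing, as the doc notes).
const productionSsl = { rejectUnauthorized: true };
const testingSsl = { rejectUnauthorized: false };

// Hypothetical kafkajs usage (assumption, not from the source doc):
// const kafka = new Kafka({ brokers: ['broker:9093'], ssl: testingSsl });

function describeSsl(ssl) {
  return ssl.rejectUnauthorized
    ? 'connections with untrusted CAs are rejected'
    : 'self-signed certificates are accepted';
}

console.log(describeSsl(productionSsl));
console.log(describeSsl(testingSsl));
```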
`src/connections/functions/insert-functions.md` (26 additions, 36 deletions)
````diff
@@ -371,50 +371,40 @@ The editor displays logs and request traces from the batch handler.
 
 The [Public API](/docs/api/public-api) Functions/Preview endpoint also supports testing batch handlers. The payload must be a batch of events as a JSON array.
 
-### Handling batching errors
-
-Standard [function error types](/docs/connections/functions/destination-functions/#destination-functions-error-types) apply to batch handlers. Segment attempts to retry the batch in the case of Timeout or Retry errors. For all other error types, Segment discards the batch. It's also possible to report a partial failure by returning status of each event in the batch. Segment retries only the failed events in a batch until those events are successful or until they result in a permanent error.
-
-```json
-[
-  {
-    "status": 200
-  },
-  {
-    "status": 400,
-    "errormessage": "Bad Request"
-  },
-  {
-    "status": 200
-  },
-  {
-    "status": 500,
-    "errormessage": "Error processing request"
-  },
-  {
-    "status": 500,
-    "errormessage": "Error processing request"
-  },
-  {
-    "status": 200
-  }
-]
-```
-
-For example, after receiving the responses above from the `onBatch` handler, Segment only retries **event_4** and **event_5**.
-
-| Error Type             | Result  |
-| ---------------------- | ------- |
-| Bad Request            | Discard |
-| Invalid Settings       | Discard |
-| Message Rejected       | Discard |
-| RetryError             | Retry   |
-| Timeout                | Retry   |
-| Unsupported Event Type | Discard |
+### Handling filtering in a batch
+
+Events in a batch can be filtered out using custom logic. The filtered events are surfaced in the [Event Delivery](/docs/connections/event-delivery/) page with the reason `Filtered at insert function`.
+
+```js
+async function onBatch(events, settings) {
+  let response = [];
+  try {
+    for (const event of events) {
+      // some business logic to filter event. Here filtering out all the events with name `drop`
+      if (event.properties.name === 'drop') {
+        continue;
+      }
+
+      // some enrichments if needed
+      event.properties.message = "Enriched from insert function";
+
+      // Enriched events are pushed to response
+      response.push(event);
+    }
+  } catch (error) {
+    console.log(error);
+    throw new RetryError('Failed function', error);
+  }
+
+  // return a subset of transformed events
+  return response;
+}
+```
+
+### Handling batching errors
+
+Standard [function error types](/docs/connections/functions/destination-functions/#destination-functions-error-types) apply to batch handlers. Segment attempts to retry the batch in the case of Timeout or Retry errors. For all other error types, Segment discards the batch.
````
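To see the filtering behavior from the added `onBatch` handler concretely, here's a runnable sketch. The `RetryError` stub is an assumption standing in for the class the Functions runtime provides; the sample batch and its shape are illustrative only.

```javascript
// Stub of the RetryError class the Functions runtime normally provides.
class RetryError extends Error {}

async function onBatch(events, settings) {
  let response = [];
  try {
    for (const event of events) {
      // Filter out all events named `drop`.
      if (event.properties.name === 'drop') {
        continue;
      }
      // Enrich the events that pass the filter.
      event.properties.message = 'Enriched from insert function';
      response.push(event);
    }
  } catch (error) {
    console.log(error);
    throw new RetryError('Failed function', error);
  }
  // Return only the kept, enriched events.
  return response;
}

// Example batch: the second event is filtered out, the first is enriched.
const batch = [
  { properties: { name: 'keep-me' } },
  { properties: { name: 'drop' } },
];

onBatch(batch, {}).then((kept) => {
  console.log(kept.length); // 1
  console.log(kept[0].properties.message); // 'Enriched from insert function'
});
```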
Databricks Reverse ETL setup doc (file path not preserved in this extract):

````diff
-At a high level, when you set up Databricks for Reverse ETL, the configured user needs read permissions for any resources (databases, schemas, tables) the query needs to access. Segment keeps track of changes to your query results with a managed schema (`__SEGMENT_REVERSE_ETL`), which requires the configured user to have write permissions for that schema.
+At a high level, when you set up Databricks for Reverse ETL, the configured service principal needs read permissions for any resources (databases, schemas, tables) the query needs to access. Segment keeps track of changes to your query results with a managed schema (`__SEGMENT_REVERSE_ETL`), which requires the configured service principal to have write permissions for that schema.
+
+> info ""
+> Segment supports only OAuth (M2M) authentication. To generate a client ID and secret, follow the steps listed in Databricks' [OAuth machine-to-machine (M2M) authentication](https://docs.databricks.com/en/dev-tools/auth/oauth-m2m.html){:target="_blank"} documentation.
 
 ## Required permissions
 
-* Make sure the user or the service principal you use to connect to Segment has permissions to use that warehouse. In the Databricks console go to **SQL warehouses** and select the warehouse you're using. Navigate to **Overview > Permissions** and make sure the user or the service principal you use to connect to Segment has *can use* permissions.
+* Make sure the service principal you use to connect to Segment has permissions to use that warehouse. In the Databricks console go to **SQL warehouses** and select the warehouse you're using. Navigate to **Overview > Permissions** and make sure the service principal you use to connect to Segment has *can use* permissions.
 
 * To grant access to read data from the tables used in the model query, run:
 
 ```
-GRANT USAGE ON SCHEMA <schema_name> TO `<user or service principal you are using to connect to Segment>`;
-GRANT SELECT, READ_METADATA ON SCHEMA <schema_name> TO `<user or service principal you are using to connect to Segment>`;
+GRANT USAGE ON SCHEMA <schema_name> TO `<service principal you are using to connect to Segment>`;
+GRANT SELECT, READ_METADATA ON SCHEMA <schema_name> TO `<service principal you are using to connect to Segment>`;
 ```
 
 * To grant Segment access to create a schema to keep track of the running syncs, run:
 
 ```
-GRANT CREATE on catalog <name of the catalog, usually hive_metastore or main if using unity-catalog> TO `<user or service principal you are using to connect to Segment>`;
+GRANT CREATE on catalog <name of the catalog, usually hive_metastore or main if using unity-catalog> TO `<service principal you are using to connect to Segment>`;
 ```
 
 * If you want to create the schema yourself instead and then give Segment access to it, run:
 
 ```
 CREATE SCHEMA IF NOT EXISTS __segment_reverse_etl;
-GRANT ALL PRIVILEGES ON SCHEMA __segment_reverse_etl TO `<user or service principal you are using to connect to Segment>`;
+GRANT ALL PRIVILEGES ON SCHEMA __segment_reverse_etl TO `<service principal you are using to connect to Segment>`;
 ```
 
 ## Set up guide
````
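As background on the OAuth (M2M) flow the doc now requires: per Databricks' documentation, a service principal exchanges its client ID and secret for a short-lived access token at the workspace's `/oidc/v1/token` endpoint using the standard `client_credentials` grant. The sketch below builds that request without sending it; the endpoint path and `all-apis` scope are taken from Databricks' docs as I understand them, so treat the details as assumptions and defer to the linked documentation.

```javascript
// Build (but don't send) a Databricks OAuth M2M token request.
// Endpoint and scope are assumptions based on Databricks' M2M docs.
function buildM2mTokenRequest(workspaceHost, clientId, clientSecret) {
  const credentials = Buffer.from(`${clientId}:${clientSecret}`).toString('base64');
  return {
    url: `https://${workspaceHost}/oidc/v1/token`,
    method: 'POST',
    headers: {
      Authorization: `Basic ${credentials}`,
      'Content-Type': 'application/x-www-form-urlencoded',
    },
    body: 'grant_type=client_credentials&scope=all-apis',
  };
}

const req = buildM2mTokenRequest(
  'adb-xxxxxxx.azuredatabricks.net', // hostname from the setup guide below
  'my-client-id',                    // hypothetical service principal client ID
  'my-secret'                        // hypothetical OAuth secret
);
console.log(req.url); // https://adb-xxxxxxx.azuredatabricks.net/oidc/v1/token
```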
```diff
@@ -46,12 +50,14 @@ To set up Databricks as your Reverse ETL source:
     * Hostname: `adb-xxxxxxx.azuredatabricks.net`
     * Http Path: `/sql/1.0/warehouses/xxxxxxxxx`
     * Port: `443` (default)
-    * Token: `<your-token>`
-    * Catalog [optional]: `hive_metastore` (default)
+    * Service principal client ID: `<your client ID>`
+    * OAuth secret: `<OAuth secret used during connection>`
+    * Catalog [optional]: If not specified, Segment will use the default catalog
 
 11. Click **Test Connection** to see if the connection works. If the connection fails, make sure you have the right permissions and credentials, then try again.
 12. Click **Add source** if the test connection is successful.
 
-> info ""
-> To generate a token, follow the steps listed in the [Databricks docs](https://docs.databricks.com/dev-tools/auth.html#pat){:target="_blank"}. Segment recommends you create a token with no expiration date by leaving the lifetime field empty when creating it. If you already have a token with an expiration date, be sure to keep track of the date and renew it on time.
+> warning ""
+> Segment previously supported token-based authentication, but now uses OAuth (M2M) authentication at the recommendation of Databricks.
+> If you previously set up your source using token-based authentication, Segment will continue to support it. If you want to create a new source or update the connection settings of an existing source, Segment only supports [OAuth machine-to-machine (M2M) authentication](https://docs.databricks.com/en/dev-tools/auth/oauth-m2m.html){:target="_blank"}.
 
 Once you've successfully added your Databricks source, [add a model](/docs/connections/reverse-etl/#step-2-add-a-model) and follow the rest of the steps in the Reverse ETL setup guide.
```