
Commit c19c8e7

docs: Remove .envrc and make a note about ENV
Signed-off-by: Edmund Miller <[email protected]>
1 parent a0df808 commit c19c8e7

File tree

2 files changed: 25 additions, 6 deletions


.envrc

Lines changed: 0 additions & 6 deletions
This file was deleted.

README.md

Lines changed: 25 additions & 0 deletions
@@ -239,3 +239,28 @@ The `CSVREAD` function provided by the H2 database engine allows you to query an
Like all dataflow operators in Nextflow, the operators provided by this plugin are executed asynchronously.

In particular, data inserted using the `sqlInsert` operator is _not_ guaranteed to be available to any subsequent queries using the `fromQuery` operator, as it is not possible to make a channel factory operation dependent on some upstream operation.
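As an illustrative sketch of this ordering hazard (the `foo` database alias and `SAMPLE` table are assumptions, not part of the plugin):

```nextflow
include { fromQuery; sqlInsert } from 'plugin/nf-sqldb'

// Insert rows into the SAMPLE table (the 'foo' db alias is assumed to be configured).
channel
    .of(['alpha', 1], ['beta', 2])
    .sqlInsert(into: 'SAMPLE', columns: 'NAME, COUNT', db: 'foo')

// This channel factory starts querying immediately; it is NOT guaranteed
// to see the rows inserted above, because the query cannot be made to
// wait for the upstream sqlInsert to complete.
channel
    .fromQuery('SELECT NAME, COUNT FROM SAMPLE', db: 'foo')
    .view()
```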

## Running Integration Tests

To run the integration tests, you'll need to set up the following environment variables:

### Databricks Integration Tests

For Databricks integration tests, you need to set:

```bash
export DATABRICKS_JDBC_URL="jdbc:databricks://<workspace-url>:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/<org-id>/<workspace-id>"
export DATABRICKS_TOKEN="<your-databricks-token>"
```

You can get these values from your Databricks workspace:

1. The JDBC URL can be found in the Databricks SQL endpoint connection details.
2. The token can be generated from your Databricks user settings.
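A quick shell guard can catch missing configuration before the tests start; a minimal sketch (the function name is ours, not part of the project):

```bash
# Sketch: fail fast when the required Databricks variables are missing.
check_databricks_env() {
  local var missing=0
  for var in DATABRICKS_JDBC_URL DATABRICKS_TOKEN; do
    if [ -z "${!var:-}" ]; then
      echo "error: $var is not set" >&2
      missing=1
    fi
  done
  return "$missing"
}
```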

After setting up the required environment variables, you can run the integration tests using:

```bash
./gradlew test
```
<!-- Note: Integration tests are skipped by default when running in smoke test mode (when `NXF_SMOKE` environment variable is set). -->
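The skip behaviour described in the note above could be mirrored in a small wrapper; a hedged sketch (the wrapper function is ours, not part of the project):

```bash
# Sketch: only run the integration test task when not in smoke-test mode.
run_integration_tests() {
  if [ -n "${NXF_SMOKE:-}" ]; then
    echo "NXF_SMOKE is set: skipping integration tests"
  else
    echo "running integration tests"
    # ./gradlew test   # real invocation, commented out in this sketch
  fi
}
```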
