
Commit c877be5

refactor: Clean up integration test structure
Signed-off-by: Edmund Miller <[email protected]>
1 parent c19c8e7

File tree

5 files changed: +28 -26 lines

README.md

Lines changed: 0 additions & 25 deletions

````diff
@@ -239,28 +239,3 @@ The `CSVREAD` function provided by the H2 database engine allows you to query an
 Like all dataflow operators in Nextflow, the operators provided by this plugin are executed asynchronously.
 
 In particular, data inserted using the `sqlInsert` operator is _not_ guaranteed to be available to any subsequent queries using the `fromQuery` operator, as it is not possible to make a channel factory operation dependent on some upstream operation.
-
-## Running Integration Tests
-
-To run the integration tests, you'll need to set up the following environment variables:
-
-### Databricks Integration Tests
-
-For Databricks integration tests, you need to set:
-
-```bash
-export DATABRICKS_JDBC_URL="jdbc:databricks://<workspace-url>:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/<org-id>/<workspace-id>"
-export DATABRICKS_TOKEN="<your-databricks-token>"
-```
-
-You can get these values from your Databricks workspace:
-1. The JDBC URL can be found in the Databricks SQL endpoint connection details
-2. The token can be generated from your Databricks user settings
-
-After setting up the required environment variables, you can run the integration tests using:
-
-```bash
-./gradlew test
-```
-
-<!-- Note: Integration tests are skipped by default when running in smoke test mode (when `NXF_SMOKE` environment variable is set). -->
````
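The context lines retained in the README describe the plugin's asynchronous semantics: `fromQuery` is a channel factory, so it cannot be ordered after an upstream `sqlInsert`. A minimal sketch of the pattern being warned against, assuming a `demo` data source configured in `nextflow.config` (the channel contents and table name are illustrative, not part of this commit):

```nextflow
include { fromQuery; sqlInsert } from 'plugin/nf-sqldb'

// Insert a few rows; the operator runs asynchronously.
channel
    .of(['alpha', 1], ['beta', 2])
    .sqlInsert(into: 'SAMPLE', columns: 'NAME, LEN', db: 'demo')

// This query may execute before the inserts complete, so the
// rows above may or may not appear in the result channel.
channel
    .fromQuery('SELECT NAME, LEN FROM SAMPLE', db: 'demo')
    .view()
```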

docs/databricks.md

Lines changed: 27 additions & 0 deletions

````diff
@@ -0,0 +1,27 @@
+# Databricks integration
+
+## Running Integration Tests
+
+To run the integration tests, you'll need to set up the following environment variables:
+
+### Databricks Integration Tests
+
+For Databricks integration tests, you need to set:
+
+```bash
+export DATABRICKS_JDBC_URL="jdbc:databricks://<workspace-url>:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/<org-id>/<workspace-id>"
+export DATABRICKS_TOKEN="<your-databricks-token>"
+```
+
+You can get these values from your Databricks workspace:
+
+1. The JDBC URL can be found in the Databricks SQL endpoint connection details
+2. The token can be generated from your Databricks user settings
+
+After setting up the required environment variables, you can run the integration tests using:
+
+```bash
+./gradlew test
+```
+
+<!-- Note: Integration tests are skipped by default when running in smoke test mode (when `NXF_SMOKE` environment variable is set). -->
````
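Since both variables must be present for the Databricks tests to connect, a small wrapper can fail fast before invoking Gradle. A minimal sketch, assuming the two variables from the document above (the wrapper script itself is illustrative and not part of the commit):

```bash
#!/usr/bin/env bash
# Abort on unset variables or failed commands.
set -euo pipefail

# Fail with a clear message if either credential is missing.
: "${DATABRICKS_JDBC_URL:?set DATABRICKS_JDBC_URL to the SQL endpoint JDBC URL}"
: "${DATABRICKS_TOKEN:?set DATABRICKS_TOKEN to a personal access token}"

./gradlew test
```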

plugins/nf-sqldb/src/test/groovy/nextflow/sql/SqlPluginIntegrationTest.groovy renamed to plugins/nf-sqldb/src/test/nextflow/sql/SqlPluginIntegrationTest.groovy

Lines changed: 1 addition & 1 deletion

```diff
@@ -29,7 +29,7 @@ class SqlPluginIntegrationTest extends Specification {
         given:
         // Ensure test resources directory exists
         def testDir = Paths.get('plugins/nf-sqldb/src/testResources/testDir').toAbsolutePath()
-        def scriptPath = testDir.resolve('test_sql_db.nf')
+        def scriptPath = testDir.resolve('main.nf')
         def configPath = testDir.resolve('nextflow.config')
 
         // Check if required files exist
```
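The script is renamed to `main.nf`, the conventional entry-point name for a Nextflow pipeline. For context, a sketch of how an integration test might launch the renamed script; the `ProcessBuilder` invocation is an assumption for illustration, as the test's actual launch code is not shown in this diff:

```groovy
import java.nio.file.Paths

// Resolve the test fixtures, mirroring the lines in the diff above.
def testDir    = Paths.get('plugins/nf-sqldb/src/testResources/testDir').toAbsolutePath()
def scriptPath = testDir.resolve('main.nf')
def configPath = testDir.resolve('nextflow.config')

// Hypothetical launch: run the pipeline and require a clean exit.
def proc = new ProcessBuilder('nextflow', 'run', scriptPath.toString(), '-c', configPath.toString())
    .redirectErrorStream(true)
    .start()
proc.waitFor()
assert proc.exitValue() == 0
```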
2 files renamed without changes.
