
Commit 1efe73f

added support for serverless
1 parent d735d6d commit 1efe73f

4 files changed, +18 −1 lines


README.md

Lines changed: 3 additions & 0 deletions
@@ -375,6 +375,9 @@ See also [`log_account_link`](#log_account_link-fixture), [`make_acc_group`](#ma
 ### `spark` fixture
 Get Databricks Connect Spark session. Requires `databricks-connect` package to be installed.
 
+To enable serverless, set the local environment variable `DATABRICKS_SERVERLESS_COMPUTE_ID` to `auto`.
+If this environment variable is set, Databricks Connect ignores the `cluster_id`.
+
 Usage:
 ```python
 def test_databricks_connect(spark):
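
For illustration only (not part of this commit), a minimal sketch of how a test module could opt into serverless locally, assuming the `spark` fixture above and pytest's built-in `monkeypatch` fixture; autouse fixtures are set up before the other fixtures in their scope, so the variable is already in place when the session is created:

```python
import pytest


@pytest.fixture(autouse=True)
def _serverless(monkeypatch):
    # Illustrative fixture name: with DATABRICKS_SERVERLESS_COMPUTE_ID=auto,
    # Databricks Connect ignores cluster_id and builds a serverless session.
    monkeypatch.setenv("DATABRICKS_SERVERLESS_COMPUTE_ID", "auto")


def test_databricks_connect(spark):
    rows = spark.sql("SELECT 1").collect()
    assert rows[0][0] == 1
```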

src/databricks/labs/pytester/fixtures/connect.py

Lines changed: 2 additions & 1 deletion
@@ -33,7 +33,8 @@ def test_databricks_connect(spark):
         )
 
         if ws.config.serverless_compute_id:
-            return DatabricksSession.builder.sdkConfig(ws.config).serverless(True).getOrCreate()
+            logging.debug("Using serverless compute")
+            return DatabricksSession.builder.serverless(True).getOrCreate()
         return DatabricksSession.builder.sdkConfig(ws.config).getOrCreate()
     except ImportError:
         skip("Please run `pip install databricks-connect`")

tests/integration/conftest.py

Lines changed: 8 additions & 0 deletions
@@ -1,3 +1,4 @@
+import os
 import logging
 
 from pytest import fixture
@@ -18,3 +19,10 @@ def debug_env_name():
 @fixture
 def product_info():
     return 'pytester', __version__
+
+
+@fixture
+def serverless_env():
+    os.environ['DATABRICKS_SERVERLESS_COMPUTE_ID'] = 'auto'
+    yield
+    os.environ.pop('DATABRICKS_SERVERLESS_COMPUTE_ID')
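
Because `serverless_env` is a `yield` fixture, the `os.environ.pop` cleanup runs after the test regardless of whether it passed or failed. One caveat worth noting: `pop` discards any value the variable already had before the test, whereas pytest's built-in `monkeypatch.setenv` would restore the original value on teardown.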
Lines changed: 5 additions & 0 deletions
@@ -1,3 +1,8 @@
 def test_databricks_connect(spark):
     rows = spark.sql("SELECT 1").collect()
     assert rows[0][0] == 1
+
+
+def test_databricks_connect_serverless(serverless_env, spark):
+    rows = spark.sql("SELECT 1").collect()
+    assert rows[0][0] == 1
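
Note that `serverless_env` is listed before `spark` in the new test's signature; with equal scopes and no other dependencies, pytest sets fixtures up in that order, so `DATABRICKS_SERVERLESS_COMPUTE_ID` is already `auto` when the Spark session is built.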
