Commit 1a8df6c
Add default-minimal template (#3885)

## Changes

Adds a new `default-minimal` template for advanced users who want a clean slate without sample code:

* Added `default-minimal`, which inherits from the `default` template, just like `default-python`
* Enhanced the template renderer to preserve empty directories (necessary to avoid `.gitkeep` files in `src/` and `resources/`)
* Fixed the `uv run pytest` error shown for projects that don't have a `src/<package>` directory

## Why

Advanced customers and non-ETL customers (like @fjakobs) indicated they want an "empty" or "minimal" project template so they don't need to delete sample code. Rather than answering "no" to all questions in `default-python`, this gives them something more intentional.

## Tests

* Template acceptance tests
* Manual `uv run pytest`, `databricks bundle deploy`, and `databricks bundle run` on a generated project

**Example:**

```bash
$ databricks bundle init default-minimal

Welcome to the minimal Databricks Asset Bundle template!

This template creates a minimal project structure without sample code, ideal for advanced users.
(For getting started with Python or SQL code, use the default-python or default-sql templates instead.)

Your workspace at https://e2-dogfood.staging.cloud.databricks.com is used for initialization.
(See https://docs.databricks.com/dev-tools/cli/profiles.html for how to change your profile.)

Unique name for this project [my_project]: empty
Default catalog for any tables created by this project [main]:
Use a personal schema for each user working on this project. (This is recommended. Your personal schema will be 'main.lennart_kats'.): yes

✨ Your new project has been created in the 'empty' directory!

To get started, refer to the project README.md file and the documentation at https://docs.databricks.com/dev-tools/bundles/index.html.
```

```bash
$ find empty
empty
empty/resources
empty/pyproject.toml
empty/tests
empty/tests/conftest.py
empty/README.md
empty/.gitignore
empty/fixtures
empty/fixtures/.gitkeep
empty/.vscode
empty/.vscode/__builtins__.pyi
empty/.vscode/settings.json
empty/.vscode/extensions.json
empty/databricks.yml
empty/src
```

---------

Co-authored-by: Claude <[email protected]>
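The "preserve empty directories" change can be illustrated with a small sketch. This is Python rather than the CLI's actual Go implementation, and `render_template` is a hypothetical name used purely for illustration: a file-by-file copy would drop empty directories such as `src/` and `resources/`, while walking the tree and creating every directory first keeps them without `.gitkeep` placeholders.

```python
import os
import shutil


def render_template(src: str, dst: str) -> None:
    """Copy a template tree, preserving empty directories (toy sketch).

    A file-only copy would skip 'resources/' and 'src/' when they contain
    no files; creating each directory before copying its files keeps empty
    directories in the output without needing .gitkeep markers.
    """
    for root, dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target = dst if rel == "." else os.path.join(dst, rel)
        os.makedirs(target, exist_ok=True)  # created even when empty
        for name in files:
            shutil.copy2(os.path.join(root, name), os.path.join(target, name))
```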
Parent commit: 6d9bfa3

File tree: 26 files changed (+541, −11 lines)

NEXT_CHANGELOG.md

Lines changed: 1 addition & 0 deletions

    @@ -9,6 +9,7 @@
     ### Dependency updates

     ### Bundles
    +* Add `default-minimal` template for users who want a clean slate without sample code ([#3885](https://github.com/databricks/cli/pull/3885))
     * Add validation that served_models and served_entities are not used at the same time. Add client side translation logic. ([#3880](https://github.com/databricks/cli/pull/3880))

     ### API Changes

acceptance/bundle/help/bundle-init/output.txt

Lines changed: 1 addition & 0 deletions

    @@ -5,6 +5,7 @@ Initialize using a bundle template to get started quickly.
     TEMPLATE_PATH optionally specifies which template to use. It can be one of the following:
     - default-python: The default Python template for Notebooks and Lakeflow
     - default-sql: The default SQL template for .sql files that run with Databricks SQL
    +- default-minimal: The minimal template, for advanced users
     - dbt-sql: The dbt SQL template (databricks.com/blog/delivering-cost-effective-data-real-time-dbt-and-databricks)
     - mlops-stacks: The Databricks MLOps Stacks template (github.com/databricks/mlops-stacks)
     - pydabs: A variant of the 'default-python' template that defines resources in Python instead of YAML
Lines changed: 6 additions & 0 deletions

    {
      "project_name": "my_default_minimal",
      "include_job": "no",
      "include_pipeline": "no",
      "include_python": "no"
    }
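The acceptance test drives `bundle init` non-interactively via `--config-file`, with each key pre-answering one template prompt. Conceptually this merge looks like the sketch below; it is a simplification in Python, not the CLI's Go implementation, and the `defaults` values here are assumed for illustration (the real prompt schema lives in the template and may differ).

```python
import json

# Hypothetical prompt defaults, assumed for illustration only.
DEFAULTS = {
    "project_name": "my_project",
    "include_job": "yes",
    "include_pipeline": "yes",
    "include_python": "yes",
}


def resolve_answers(config_text: str) -> dict:
    """Merge --config-file answers over prompt defaults (toy version)."""
    answers = dict(DEFAULTS)
    answers.update(json.loads(config_text))
    return answers


config = (
    '{"project_name": "my_default_minimal", "include_job": "no", '
    '"include_pipeline": "no", "include_python": "no"}'
)
answers = resolve_answers(config)
```

With the config above, every prompt is answered from the file, so the init run produces no interactive questions in the recorded output.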

acceptance/bundle/templates/default-minimal/out.test.toml

Lines changed: 5 additions & 0 deletions (file contents not rendered)
Lines changed: 33 additions & 0 deletions

    >>> [CLI] bundle init default-minimal --config-file ./input.json --output-dir output
    Welcome to the minimal Databricks Asset Bundle template!

    This template creates a minimal project structure without sample code, ideal for advanced users.
    (For getting started with Python or SQL code, use the default-python or default-sql templates instead.)

    Your workspace at [DATABRICKS_URL] is used for initialization.
    (See https://docs.databricks.com/dev-tools/cli/profiles.html for how to change your profile.)

    ✨ Your new project has been created in the 'my_default_minimal' directory!

    To get started, refer to the project README.md file and the documentation at https://docs.databricks.com/dev-tools/bundles/index.html.

    >>> [CLI] bundle validate -t dev
    Name: my_default_minimal
    Target: dev
    Workspace:
      Host: [DATABRICKS_URL]
      User: [USERNAME]
      Path: /Workspace/Users/[USERNAME]/.bundle/my_default_minimal/dev

    Validation OK!

    >>> [CLI] bundle validate -t prod
    Name: my_default_minimal
    Target: prod
    Workspace:
      Host: [DATABRICKS_URL]
      User: [USERNAME]
      Path: /Workspace/Users/[USERNAME]/.bundle/my_default_minimal/prod

    Validation OK!
Lines changed: 3 additions & 0 deletions

    # Typings for Pylance in Visual Studio Code
    # see https://github.com/microsoft/pyright/blob/main/docs/builtins.md
    from databricks.sdk.runtime import *
Lines changed: 7 additions & 0 deletions

    {
      "recommendations": [
        "databricks.databricks",
        "redhat.vscode-yaml",
        "ms-python.black-formatter"
      ]
    }
Lines changed: 39 additions & 0 deletions

    {
      "jupyter.interactiveWindow.cellMarker.codeRegex": "^# COMMAND ----------|^# Databricks notebook source|^(#\\s*%%|#\\s*\\<codecell\\>|#\\s*In\\[\\d*?\\]|#\\s*In\\[ \\])",
      "jupyter.interactiveWindow.cellMarker.default": "# COMMAND ----------",
      "python.testing.pytestArgs": [
        "."
      ],
      "files.exclude": {
        "**/*.egg-info": true,
        "**/__pycache__": true,
        ".pytest_cache": true,
        "dist": true,
      },
      "files.associations": {
        "**/.gitkeep": "markdown"
      },

      // Pylance settings (VS Code)
      // Set typeCheckingMode to "basic" to enable type checking!
      "python.analysis.typeCheckingMode": "off",
      "python.analysis.extraPaths": ["src", "lib", "resources"],
      "python.analysis.diagnosticMode": "workspace",
      "python.analysis.stubPath": ".vscode",

      // Pyright settings (Cursor)
      // Set typeCheckingMode to "basic" to enable type checking!
      "cursorpyright.analysis.typeCheckingMode": "off",
      "cursorpyright.analysis.extraPaths": ["src", "lib", "resources"],
      "cursorpyright.analysis.diagnosticMode": "workspace",
      "cursorpyright.analysis.stubPath": ".vscode",

      // General Python settings
      "python.defaultInterpreterPath": "./.venv/bin/python",
      "python.testing.unittestEnabled": false,
      "python.testing.pytestEnabled": true,
      "[python]": {
        "editor.defaultFormatter": "ms-python.black-formatter",
        "editor.formatOnSave": true,
      },
    }
Lines changed: 62 additions & 0 deletions

    # my_default_minimal

    The 'my_default_minimal' project was generated by using the default template.

    * `src/`: Python source code for this project.
    * `resources/`: Resource configurations (jobs, pipelines, etc.)
    * `tests/`: Unit tests for the shared Python code.
    * `fixtures/`: Fixtures for data sets (primarily used for testing).

    ## Getting started

    Choose how you want to work on this project:

    (a) Directly in your Databricks workspace, see
        https://docs.databricks.com/dev-tools/bundles/workspace.

    (b) Locally with an IDE like Cursor or VS Code, see
        https://docs.databricks.com/dev-tools/vscode-ext.html.

    (c) With command line tools, see https://docs.databricks.com/dev-tools/cli/databricks-cli.html

    If you're developing with an IDE, dependencies for this project should be installed using uv:

    * Make sure you have the UV package manager installed.
      It's an alternative to tools like pip: https://docs.astral.sh/uv/getting-started/installation/.
    * Run `uv sync --dev` to install the project's dependencies.

    # Using this project using the CLI

    The Databricks workspace and IDE extensions provide a graphical interface for working
    with this project. It's also possible to interact with it directly using the CLI:

    1. Authenticate to your Databricks workspace, if you have not done so already:
       ```
       $ databricks configure
       ```

    2. To deploy a development copy of this project, type:
       ```
       $ databricks bundle deploy --target dev
       ```
       (Note that "dev" is the default target, so the `--target` parameter
       is optional here.)

       This deploys everything that's defined for this project.

    3. Similarly, to deploy a production copy, type:
       ```
       $ databricks bundle deploy --target prod
       ```

    4. To run a job or pipeline, use the "run" command:
       ```
       $ databricks bundle run
       ```

    5. Finally, to run tests locally, use `pytest`:
       ```
       $ uv run pytest
       ```
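The generated project ships a `tests/conftest.py` (visible in the `find` output above), but its contents are not shown in this diff. One common pattern for making `uv run pytest` work even before a `src/<package>` directory exists is to put `src/` on `sys.path` from conftest; the sketch below is an assumption for illustration, not necessarily the template's actual file.

```python
# tests/conftest.py (illustrative sketch; the template's real file may differ)
import os
import sys

# Make code under src/ importable from tests without installing the package,
# even when src/ is still empty or has no src/<package> directory yet.
SRC = os.path.join(os.path.dirname(os.path.abspath(__file__)), "..", "src")
if SRC not in sys.path:
    sys.path.insert(0, SRC)
```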
Lines changed: 42 additions & 0 deletions

    # This is a Databricks asset bundle definition for my_default_minimal.
    # See https://docs.databricks.com/dev-tools/bundles/index.html for documentation.
    bundle:
      name: my_default_minimal
      uuid: [UUID]

    include:
      - resources/*.yml
      - resources/*/*.yml

    # Variable declarations. These variables are assigned in the dev/prod targets below.
    variables:
      catalog:
        description: The catalog to use
      schema:
        description: The schema to use

    targets:
      dev:
        # The default target uses 'mode: development' to create a development copy.
        # - Deployed resources get prefixed with '[dev my_user_name]'
        # - Any job schedules and triggers are paused by default.
        # See also https://docs.databricks.com/dev-tools/bundles/deployment-modes.html.
        mode: development
        default: true
        workspace:
          host: [DATABRICKS_URL]
        variables:
          catalog: hive_metastore
          schema: ${workspace.current_user.short_name}
      prod:
        mode: production
        workspace:
          host: [DATABRICKS_URL]
          # We explicitly deploy to /Workspace/Users/[USERNAME] to make sure we only have a single copy.
          root_path: /Workspace/Users/[USERNAME]/.bundle/${bundle.name}/${bundle.target}
        variables:
          catalog: hive_metastore
          schema: prod
        permissions:
          - user_name: [USERNAME]
            level: CAN_MANAGE
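The `${...}` references in the bundle definition above (such as `${bundle.name}` and `${workspace.current_user.short_name}`) are resolved by the bundle engine at deploy time. A toy resolver shows the substitution idea; this is a deliberate simplification, not the CLI's actual interpolation logic, and the flat lookup table is an assumption for illustration.

```python
import re


def interpolate(value: str, ctx: dict) -> str:
    """Replace ${dotted.path} references using a flat lookup table (toy version)."""
    return re.sub(r"\$\{([^}]+)\}", lambda m: ctx[m.group(1)], value)


# Example context mirroring values used in the bundle definition above.
ctx = {
    "bundle.name": "my_default_minimal",
    "bundle.target": "prod",
    "workspace.current_user.short_name": "lennart_kats",
}

path = interpolate(
    "/Workspace/Users/[USERNAME]/.bundle/${bundle.name}/${bundle.target}", ctx
)
# path == "/Workspace/Users/[USERNAME]/.bundle/my_default_minimal/prod"
```

This is why the dev target's `schema: ${workspace.current_user.short_name}` gives each developer a personal schema, while prod pins `schema: prod`.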
