From 83648e31092931f47602f31df03be9c6028c2954 Mon Sep 17 00:00:00 2001 From: Lennart Kats Date: Sun, 26 Oct 2025 18:42:44 -0700 Subject: [PATCH 1/5] Add default-minimal template for advanced users MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - Add minimal template positioned after default-sql in CLI selection - Provides only essential infrastructure: databricks.yml, pyproject.toml, README, empty src/ and resources/ directories, and basic test setup - Welcome message emphasizes this is for advanced users and recommends default-python/default-sql for those getting started - Template reuses default template structure via "template_dir": "../default" - Equivalent to default-python when answering "no" to all sample content prompts - Renderer improvements: preserve empty directories without needing .gitkeep files Users need a clean starting point for building bundles from scratch without sample code getting in the way. Advanced users know what they want to build and don't need the training wheels of sample notebooks and pipelines. - All template acceptance tests pass, including new default-minimal tests - Verified template instantiation creates correct directory structure 🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude --- .../templates/default-minimal/input.json | 6 ++ .../templates/default-minimal/out.test.toml | 5 + .../templates/default-minimal/output.txt | 35 +++++++ .../.vscode/__builtins__.pyi | 3 + .../.vscode/extensions.json | 7 ++ .../my_default_minimal/.vscode/settings.json | 39 ++++++++ .../output/my_default_minimal/README.md | 63 ++++++++++++ .../output/my_default_minimal/databricks.yml | 42 ++++++++ .../my_default_minimal/fixtures/.gitkeep | 9 ++ .../output/my_default_minimal/out.gitignore | 10 ++ .../output/my_default_minimal/pyproject.toml | 29 ++++++ .../my_default_minimal/tests/conftest.py | 98 +++++++++++++++++++ .../bundle/templates/default-minimal/script | 11 +++ libs/template/renderer.go | 50 ++++++++++ libs/template/renderer_test.go | 3 +- libs/template/template.go | 7 ++ libs/template/template_test.go | 3 + .../databricks_template_schema.json | 98 +++++++++++++++++++ .../default/template/__preamble.tmpl | 11 +-- .../{{.project_name}}/resources/.gitkeep | 1 - .../template/{{.project_name}}/src/.gitkeep | 1 - 21 files changed, 520 insertions(+), 11 deletions(-) create mode 100644 acceptance/bundle/templates/default-minimal/input.json create mode 100644 acceptance/bundle/templates/default-minimal/out.test.toml create mode 100644 acceptance/bundle/templates/default-minimal/output.txt create mode 100644 acceptance/bundle/templates/default-minimal/output/my_default_minimal/.vscode/__builtins__.pyi create mode 100644 acceptance/bundle/templates/default-minimal/output/my_default_minimal/.vscode/extensions.json create mode 100644 acceptance/bundle/templates/default-minimal/output/my_default_minimal/.vscode/settings.json create mode 100644 acceptance/bundle/templates/default-minimal/output/my_default_minimal/README.md create mode 100644 acceptance/bundle/templates/default-minimal/output/my_default_minimal/databricks.yml create mode 100644 acceptance/bundle/templates/default-minimal/output/my_default_minimal/fixtures/.gitkeep create mode 100644 acceptance/bundle/templates/default-minimal/output/my_default_minimal/out.gitignore create mode 100644 acceptance/bundle/templates/default-minimal/output/my_default_minimal/pyproject.toml create mode 100644 
acceptance/bundle/templates/default-minimal/output/my_default_minimal/tests/conftest.py create mode 100644 acceptance/bundle/templates/default-minimal/script create mode 100644 libs/template/templates/default-minimal/databricks_template_schema.json delete mode 100644 libs/template/templates/default/template/{{.project_name}}/resources/.gitkeep delete mode 100644 libs/template/templates/default/template/{{.project_name}}/src/.gitkeep diff --git a/acceptance/bundle/templates/default-minimal/input.json b/acceptance/bundle/templates/default-minimal/input.json new file mode 100644 index 0000000000..2e6fb96908 --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/input.json @@ -0,0 +1,6 @@ +{ + "project_name": "my_default_minimal", + "include_job": "no", + "include_pipeline": "no", + "include_python": "no" +} diff --git a/acceptance/bundle/templates/default-minimal/out.test.toml b/acceptance/bundle/templates/default-minimal/out.test.toml new file mode 100644 index 0000000000..e092fd5ed6 --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/out.test.toml @@ -0,0 +1,5 @@ +Local = true +Cloud = false + +[EnvMatrix] + DATABRICKS_BUNDLE_ENGINE = ["terraform", "direct-exp"] diff --git a/acceptance/bundle/templates/default-minimal/output.txt b/acceptance/bundle/templates/default-minimal/output.txt new file mode 100644 index 0000000000..a5e80bca84 --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/output.txt @@ -0,0 +1,35 @@ + +>>> [CLI] bundle init default-minimal --config-file ./input.json --output-dir output +Welcome to the minimal Databricks Asset Bundle template! + +NOTE: this minimal template is intended for starting from scratch for advanced users. +Use the default-python and default-sql templates for getting started, they include (optional) sample code! + +Your workspace at [DATABRICKS_URL] is used for initialization +(see https://docs.databricks.com/dev-tools/cli/profiles.html for how to change your profile). +Workspace to use (auto-detected, edit in 'my_default_minimal/databricks.yml'): [DATABRICKS_URL] + +✨ Your new project has been created in the 'my_default_minimal' directory! + +Please refer to the README.md file for "getting started" instructions. +See also the documentation at https://docs.databricks.com/dev-tools/bundles/index.html. + +>>> [CLI] bundle validate -t dev +Name: my_default_minimal +Target: dev +Workspace: + Host: [DATABRICKS_URL] + User: [USERNAME] + Path: /Workspace/Users/[USERNAME]/.bundle/my_default_minimal/dev + +Validation OK! + +>>> [CLI] bundle validate -t prod +Name: my_default_minimal +Target: prod +Workspace: + Host: [DATABRICKS_URL] + User: [USERNAME] + Path: /Workspace/Users/[USERNAME]/.bundle/my_default_minimal/prod + +Validation OK! 
diff --git a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/.vscode/__builtins__.pyi b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/.vscode/__builtins__.pyi new file mode 100644 index 0000000000..0edd5181bc --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/.vscode/__builtins__.pyi @@ -0,0 +1,3 @@ +# Typings for Pylance in Visual Studio Code +# see https://github.com/microsoft/pyright/blob/main/docs/builtins.md +from databricks.sdk.runtime import * diff --git a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/.vscode/extensions.json b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/.vscode/extensions.json new file mode 100644 index 0000000000..75a111a6a9 --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/.vscode/extensions.json @@ -0,0 +1,7 @@ +{ + "recommendations": [ + "databricks.databricks", + "redhat.vscode-yaml", + "ms-python.black-formatter" + ] +} diff --git a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/.vscode/settings.json b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/.vscode/settings.json new file mode 100644 index 0000000000..c49593bc59 --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/.vscode/settings.json @@ -0,0 +1,39 @@ +{ + "jupyter.interactiveWindow.cellMarker.codeRegex": "^# COMMAND ----------|^# Databricks notebook source|^(#\\s*%%|#\\s*\\|#\\s*In\\[\\d*?\\]|#\\s*In\\[ \\])", + "jupyter.interactiveWindow.cellMarker.default": "# COMMAND ----------", + "python.testing.pytestArgs": [ + "." + ], + "files.exclude": { + "**/*.egg-info": true, + "**/__pycache__": true, + ".pytest_cache": true, + "dist": true, + }, + "files.associations": { + "**/.gitkeep": "markdown" + }, + + // Pylance settings (VS Code) + // Set typeCheckingMode to "basic" to enable type checking! + "python.analysis.typeCheckingMode": "off", + "python.analysis.extraPaths": ["src", "lib", "resources"], + "python.analysis.diagnosticMode": "workspace", + "python.analysis.stubPath": ".vscode", + + // Pyright settings (Cursor) + // Set typeCheckingMode to "basic" to enable type checking! + "cursorpyright.analysis.typeCheckingMode": "off", + "cursorpyright.analysis.extraPaths": ["src", "lib", "resources"], + "cursorpyright.analysis.diagnosticMode": "workspace", + "cursorpyright.analysis.stubPath": ".vscode", + + // General Python settings + "python.defaultInterpreterPath": "./.venv/bin/python", + "python.testing.unittestEnabled": false, + "python.testing.pytestEnabled": true, + "[python]": { + "editor.defaultFormatter": "ms-python.black-formatter", + "editor.formatOnSave": true, + }, +} diff --git a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/README.md b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/README.md new file mode 100644 index 0000000000..1a3623430a --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/README.md @@ -0,0 +1,63 @@ +# my_default_minimal + +The 'my_default_minimal' project was generated by using the default template. + +* `src/`: Python source code for this project. +* `resources/`: Resource configurations (jobs, pipelines, etc.) +* `tests/`: Unit tests for the shared Python code. +* `fixtures/`: Fixtures for data sets (primarily used for testing). 
+ + +## Getting started + +Choose how you want to work on this project: + +(a) Directly in your Databricks workspace, see + https://docs.databricks.com/dev-tools/bundles/workspace. + +(b) Locally with an IDE like Cursor or VS Code, see + https://docs.databricks.com/vscode-ext. + +(c) With command line tools, see https://docs.databricks.com/dev-tools/cli/databricks-cli.html + +If you're developing with an IDE, dependencies for this project should be installed using uv: + +* Make sure you have the UV package manager installed. + It's an alternative to tools like pip: https://docs.astral.sh/uv/getting-started/installation/. +* Run `uv sync --dev` to install the project's dependencies. + + +# Using this project using the CLI + +The Databricks workspace and IDE extensions provide a graphical interface for working +with this project. It's also possible to interact with it directly using the CLI: + +1. Authenticate to your Databricks workspace, if you have not done so already: + ``` + $ databricks configure + ``` + +2. To deploy a development copy of this project, type: + ``` + $ databricks bundle deploy --target dev + ``` + (Note that "dev" is the default target, so the `--target` parameter + is optional here.) + + This deploys everything that's defined for this project. + +3. Similarly, to deploy a production copy, type: + ``` + $ databricks bundle deploy --target prod + ``` + +4. To run a job or pipeline, use the "run" command: + ``` + $ databricks bundle run + ``` + +5. Finally, to run tests locally, use `pytest`: + ``` + $ uv run pytest + ``` + diff --git a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/databricks.yml b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/databricks.yml new file mode 100644 index 0000000000..897135948c --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/databricks.yml @@ -0,0 +1,42 @@ +# This is a Databricks asset bundle definition for my_default_minimal. +# See https://docs.databricks.com/dev-tools/bundles/index.html for documentation. +bundle: + name: my_default_minimal + uuid: [UUID] + +include: + - resources/*.yml + - resources/*/*.yml + +# Variable declarations. These variables are assigned in the dev/prod targets below. +variables: + catalog: + description: The catalog to use + schema: + description: The schema to use + +targets: + dev: + # The default target uses 'mode: development' to create a development copy. + # - Deployed resources get prefixed with '[dev my_user_name]' + # - Any job schedules and triggers are paused by default. + # See also https://docs.databricks.com/dev-tools/bundles/deployment-modes.html. + mode: development + default: true + workspace: + host: [DATABRICKS_URL] + variables: + catalog: hive_metastore + schema: ${workspace.current_user.short_name} + prod: + mode: production + workspace: + host: [DATABRICKS_URL] + # We explicitly deploy to /Workspace/Users/[USERNAME] to make sure we only have a single copy. 
+ root_path: /Workspace/Users/[USERNAME]/.bundle/${bundle.name}/${bundle.target} + variables: + catalog: hive_metastore + schema: prod + permissions: + - user_name: [USERNAME] + level: CAN_MANAGE diff --git a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/fixtures/.gitkeep b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/fixtures/.gitkeep new file mode 100644 index 0000000000..77a906614c --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/fixtures/.gitkeep @@ -0,0 +1,9 @@ +# Test fixtures directory + +Add JSON or CSV files here. In tests, use them with `load_fixture()`: + +``` +def test_using_fixture(load_fixture): + data = load_fixture("my_data.json") + assert len(data) >= 1 +``` diff --git a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/out.gitignore b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/out.gitignore new file mode 100644 index 0000000000..e566c51f74 --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/out.gitignore @@ -0,0 +1,10 @@ +.databricks/ +build/ +dist/ +__pycache__/ +*.egg-info +.venv/ +scratch/** +!scratch/README.md +**/explorations/** +**/!explorations/README.md diff --git a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/pyproject.toml b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/pyproject.toml new file mode 100644 index 0000000000..f9dcf5c2aa --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/pyproject.toml @@ -0,0 +1,29 @@ +[project] +name = "my_default_minimal" +version = "0.0.1" +authors = [{ name = "[USERNAME]" }] +requires-python = ">=3.10,<=3.13" +dependencies = [ + # Any dependencies for jobs and pipelines in this project can be added here + # See also https://docs.databricks.com/dev-tools/bundles/library-dependencies + # + # LIMITATION: for pipelines, dependencies are cached during development; + # add dependencies to the 'environment' section of pipeline.yml file instead +] + +[dependency-groups] +dev = [ + "pytest", + "databricks-dlt", + "databricks-connect>=15.4,<15.5", +] + +[project.scripts] +main = "my_default_minimal.main:main" + +[build-system] +requires = ["hatchling"] +build-backend = "hatchling.build" + +[tool.black] +line-length = 125 diff --git a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/tests/conftest.py b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/tests/conftest.py new file mode 100644 index 0000000000..4df274fd43 --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/tests/conftest.py @@ -0,0 +1,98 @@ +"""This file configures pytest. + +This file is in the root since it can be used for tests in any place in this +project, including tests under resources/. +""" + +import os, sys, pathlib +from contextlib import contextmanager + + +try: + from databricks.connect import DatabricksSession + from databricks.sdk import WorkspaceClient + from pyspark.sql import SparkSession + import pytest + import json + import csv + import os +except ImportError: + raise ImportError( + "Test dependencies not found.\n\nRun tests using 'uv run pytest'. See http://docs.astral.sh/uv to learn more about uv." + ) + + +@pytest.fixture() +def spark() -> SparkSession: + """Provide a SparkSession fixture for tests. 
+ + Minimal example: + def test_uses_spark(spark): + df = spark.createDataFrame([(1,)], ["x"]) + assert df.count() == 1 + """ + return DatabricksSession.builder.getOrCreate() + + +@pytest.fixture() +def load_fixture(spark: SparkSession): + """Provide a callable to load JSON or CSV from fixtures/ directory. + + Example usage: + + def test_using_fixture(load_fixture): + data = load_fixture("my_data.json") + assert data.count() >= 1 + """ + + def _loader(filename: str): + path = pathlib.Path(__file__).parent.parent / "fixtures" / filename + suffix = path.suffix.lower() + if suffix == ".json": + rows = json.loads(path.read_text()) + return spark.createDataFrame(rows) + if suffix == ".csv": + with path.open(newline="") as f: + rows = list(csv.DictReader(f)) + return spark.createDataFrame(rows) + raise ValueError(f"Unsupported fixture type for: {filename}") + + return _loader + + +def _enable_fallback_compute(): + """Enable serverless compute if no compute is specified.""" + conf = WorkspaceClient().config + if conf.serverless_compute_id or conf.cluster_id or os.environ.get("SPARK_REMOTE"): + return + + url = "https://docs.databricks.com/dev-tools/databricks-connect/cluster-config" + print("☁️ no compute specified, falling back to serverless compute", file=sys.stderr) + print(f" see {url} for manual configuration", file=sys.stdout) + + os.environ["DATABRICKS_SERVERLESS_COMPUTE_ID"] = "auto" + + +@contextmanager +def _allow_stderr_output(config: pytest.Config): + """Temporarily disable pytest output capture.""" + capman = config.pluginmanager.get_plugin("capturemanager") + if capman: + with capman.global_and_fixture_disabled(): + yield + else: + yield + + +def pytest_configure(config: pytest.Config): + """Configure pytest session.""" + with _allow_stderr_output(config): + _enable_fallback_compute() + + # Initialize Spark session eagerly, so it is available even when + # SparkSession.builder.getOrCreate() is used. For DB Connect 15+, + # we validate version compatibility with the remote cluster. + if hasattr(DatabricksSession.builder, "validateSession"): + DatabricksSession.builder.validateSession().getOrCreate() + else: + DatabricksSession.builder.getOrCreate() diff --git a/acceptance/bundle/templates/default-minimal/script b/acceptance/bundle/templates/default-minimal/script new file mode 100644 index 0000000000..4a4730d7c2 --- /dev/null +++ b/acceptance/bundle/templates/default-minimal/script @@ -0,0 +1,11 @@ +trace $CLI bundle init default-minimal --config-file ./input.json --output-dir output + +cd output/my_default_minimal +trace $CLI bundle validate -t dev +trace $CLI bundle validate -t prod + +# Do not affect this repository's git behaviour #2318 +mv .gitignore out.gitignore +rm -r .databricks + +cd ../../ diff --git a/libs/template/renderer.go b/libs/template/renderer.go index 3658f28e94..4416211b19 100644 --- a/libs/template/renderer.go +++ b/libs/template/renderer.go @@ -50,6 +50,16 @@ type renderer struct { // do not match any glob patterns from this list skipPatterns []string + // Directories visited during the template walk. These directories will be created + // during persistToDisk even if they end up empty (all their files were skipped). + // + // WHY: The template's directory structure is part of its design and should be + // preserved even when empty. For example, an empty 'src/' directory shows users + // where to put their source code, and an empty 'resources/' directory indicates + // where to define bundle resources. 
Empty directories are only omitted if there's + // an explicit {{skip}} directive for that directory in the template. + visitedDirs []string + // [fs.FS] that holds the template's file tree. srcFS fs.FS } @@ -94,6 +104,7 @@ func newRenderer( baseTemplate: tmpl, files: make([]file, 0), skipPatterns: make([]string, 0), + visitedDirs: make([]string, 0), srcFS: srcFS, }, nil } @@ -213,6 +224,10 @@ func (r *renderer) computeFile(relPathTemplate string) (file, error) { // // This is not possible using the std library WalkDir which processes the files in // lexical order which is why this function implements BFS. +// +// BFS order (breadth-first search) also ensures that parent directories are visited +// before their children, which is important for creating empty directories during +// persistToDisk - parent directories must exist before we can create subdirectories. func (r *renderer) walk() error { directories := []string{"."} var currentDirectory string @@ -234,6 +249,11 @@ func (r *renderer) walk() error { continue } + // Track visited directory so it can be created even if empty. + // We preserve the directory structure because it's part of the template's design, + // guiding users where to place their files (e.g., src/ for source code). + r.visitedDirs = append(r.visitedDirs, instanceDirectory) + // Add skip function, which accumulates skip patterns relative to current // directory r.baseTemplate.Funcs(template.FuncMap{ @@ -326,6 +346,36 @@ func (r *renderer) persistToDisk(ctx context.Context, out filer.Filer) error { return err } } + + // Ensure all visited directories exist, preserving the template's directory structure. + // Empty directories (where all files were skipped) are still created because: + // 1. The directory structure is part of the template's design + // 2. Empty directories guide users on where to put their own files (e.g., src/, resources/) + // 3. Only explicit {{skip}} directives should prevent directory creation + for _, dir := range r.visitedDirs { + // Skip the root directory (current working directory) + if dir == "." { + continue + } + + // Check if directory already exists (may have been created during file writes) + _, err := out.Stat(ctx, dir) + if err == nil { + // Directory already exists, nothing to do + continue + } + if !errors.Is(err, fs.ErrNotExist) { + return fmt.Errorf("error checking if directory %s exists: %w", dir, err) + } + + // Create the directory. Since visitedDirs is in BFS order (parents before children), + // parent directories are guaranteed to exist by the time we create children. 
+ err = out.Mkdir(ctx, dir) + if err != nil { + return fmt.Errorf("failed to create directory %s: %w", dir, err) + } + } + return nil } diff --git a/libs/template/renderer_test.go b/libs/template/renderer_test.go index f0dbf1335e..6c77aedfb6 100644 --- a/libs/template/renderer_test.go +++ b/libs/template/renderer_test.go @@ -452,7 +452,8 @@ func TestRendererSkip(t *testing.T) { // These files have been skipped assert.NoFileExists(t, filepath.Join(tmpDir, "file3")) assert.NoFileExists(t, filepath.Join(tmpDir, "dir1/file4")) - assert.NoDirExists(t, filepath.Join(tmpDir, "dir2")) + // dir2 exists (visited during walk) even though all its files were skipped + assert.DirExists(t, filepath.Join(tmpDir, "dir2")) assert.NoFileExists(t, filepath.Join(tmpDir, "dir2/file6")) } diff --git a/libs/template/template.go b/libs/template/template.go index e707090f7d..e1628b409a 100644 --- a/libs/template/template.go +++ b/libs/template/template.go @@ -25,6 +25,7 @@ type TemplateName string const ( DefaultPython TemplateName = "default-python" + DefaultMinimal TemplateName = "default-minimal" ExperimentalDefaultPython TemplateName = "experimental-default-python-vnext" DefaultSql TemplateName = "default-sql" LakeflowPipelines TemplateName = "lakeflow-pipelines" @@ -50,6 +51,12 @@ var databricksTemplates = []Template{ Reader: &builtinReader{name: string(DefaultSql)}, Writer: &writerWithFullTelemetry{defaultWriter: defaultWriter{name: DefaultSql}}, }, + { + name: DefaultMinimal, + description: "The minimal template, for advanced users", + Reader: &builtinReader{name: string(DefaultMinimal)}, + Writer: &writerWithFullTelemetry{defaultWriter: defaultWriter{name: DefaultMinimal}}, + }, { name: LakeflowPipelines, hidden: true, diff --git a/libs/template/template_test.go b/libs/template/template_test.go index 7613d708a4..9a5bd88e05 100644 --- a/libs/template/template_test.go +++ b/libs/template/template_test.go @@ -10,6 +10,7 @@ import ( func TestTemplateHelpDescriptions(t *testing.T) { expected := `- default-python: The default Python template for Notebooks and Lakeflow - default-sql: The default SQL template for .sql files that run with Databricks SQL +- default-minimal: The minimal template, for advanced users - dbt-sql: The dbt SQL template (databricks.com/blog/delivering-cost-effective-data-real-time-dbt-and-databricks) - mlops-stacks: The Databricks MLOps Stacks template (github.com/databricks/mlops-stacks) - pydabs: A variant of the 'default-python' template that defines resources in Python instead of YAML` @@ -20,6 +21,7 @@ func TestTemplateOptions(t *testing.T) { expected := []cmdio.Tuple{ {Name: "default-python", Id: "The default Python template for Notebooks and Lakeflow"}, {Name: "default-sql", Id: "The default SQL template for .sql files that run with Databricks SQL"}, + {Name: "default-minimal", Id: "The minimal template, for advanced users"}, {Name: "dbt-sql", Id: "The dbt SQL template (databricks.com/blog/delivering-cost-effective-data-real-time-dbt-and-databricks)"}, {Name: "mlops-stacks", Id: "The Databricks MLOps Stacks template (github.com/databricks/mlops-stacks)"}, {Name: "pydabs", Id: "A variant of the 'default-python' template that defines resources in Python instead of YAML"}, @@ -53,6 +55,7 @@ func TestTemplateTelemetryIsCapturedForAllDefaultTemplates(t *testing.T) { func TestTemplateGetDatabricksTemplate(t *testing.T) { names := []TemplateName{ DefaultPython, + DefaultMinimal, DefaultSql, DbtSql, MlopsStacks, diff --git 
a/libs/template/templates/default-minimal/databricks_template_schema.json b/libs/template/templates/default-minimal/databricks_template_schema.json new file mode 100644 index 0000000000..fdab731675 --- /dev/null +++ b/libs/template/templates/default-minimal/databricks_template_schema.json @@ -0,0 +1,98 @@ +{ + "template_dir": "../default", + "welcome_message": "Welcome to the minimal Databricks Asset Bundle template!\n\nNOTE: this minimal template is intended for starting from scratch for advanced users.\nUse the default-python and default-sql templates for getting started, they include (optional) sample code!\n\nYour workspace at {{workspace_host}} is used for initialization\n(see https://docs.databricks.com/dev-tools/cli/profiles.html for how to change your profile).", + "properties": { + "project_name": { + "type": "string", + "default": "my_project", + "description": "\nUnique name for this project", + "order": 1, + "pattern": "^[A-Za-z0-9_]+$", + "pattern_match_failure_message": "Name must consist of letters, numbers, and underscores." + }, + "project_name_short": { + "skip_prompt_if": {}, + "type": "string", + "default": "{{with (regexp \"^(my_)?(.*)(_project|_app|_service)?$\").FindStringSubmatch .project_name}}{{index . 2}}{{else}}{{.project_name}}{{end}}", + "description": "Short name for the project", + "order": 2 + }, + "include_job": { + "skip_prompt_if": {}, + "type": "string", + "default": "no", + "enum": ["yes", "no"], + "description": "Include a stub (sample) notebook job", + "order": 3 + }, + "include_pipeline": { + "skip_prompt_if": {}, + "type": "string", + "default": "no", + "enum": ["yes", "no"], + "description": "Include a stub (sample) Lakeflow Declarative Pipeline", + "order": 4 + }, + "include_python": { + "skip_prompt_if": {}, + "type": "string", + "default": "no", + "enum": ["yes", "no"], + "description": "Include a stub (sample) Python package", + "order": 5 + }, + "serverless": { + "skip_prompt_if": {}, + "type": "string", + "default": "yes", + "enum": ["yes", "no"], + "description": "Use serverless compute", + "order": 6 + }, + "enable_pydabs": { + "skip_prompt_if": {}, + "type": "string", + "default": "no", + "enum": ["yes", "no"], + "description": "Enable PyDABs", + "order": 7 + }, + "language": { + "skip_prompt_if": {}, + "type": "string", + "default": "python", + "enum": ["python", "sql"], + "description": "Language", + "order": 8 + }, + "lakeflow_only": { + "skip_prompt_if": {}, + "type": "string", + "default": "no", + "enum": ["yes", "no"], + "description": "Lakeflow only", + "order": 9 + }, + "default_catalog": { + "skip_prompt_if": {}, + "type": "string", + "default": "{{default_catalog}}", + "pattern": "^\\w*$", + "pattern_match_failure_message": "Invalid catalog name.", + "description": "Default catalog for any tables created by this project{{if eq (default_catalog) \"\"}} (leave blank when not using Unity Catalog){{end}}", + "order": 10 + }, + "personal_schemas": { + "skip_prompt_if": {}, + "type": "string", + "description": "Use a personal schema for each user working on this project\n(this is recommended, your personal schema will be '{{.default_catalog}}.{{short_name}}')", + "default": "yes", + "enum": [ + "yes", + "no (advanced: I will customize the schema configuration later in databricks.yml)" + ], + "order": 11 + } + }, + "success_message": "Workspace to use (auto-detected, edit in '{{.project_name}}/databricks.yml'): {{workspace_host}}\n\n✨ Your new project has been created in the '{{.project_name}}' directory!\n\nPlease refer to the 
README.md file for \"getting started\" instructions.\nSee also the documentation at https://docs.databricks.com/dev-tools/bundles/index.html." +} diff --git a/libs/template/templates/default/template/__preamble.tmpl b/libs/template/templates/default/template/__preamble.tmpl index 7360caf451..4e8d5cd203 100644 --- a/libs/template/templates/default/template/__preamble.tmpl +++ b/libs/template/templates/default/template/__preamble.tmpl @@ -20,12 +20,7 @@ This file only contains template directives; it is skipped for the actual output {{skip "{{.project_name}}/src/{{.project_name}}"}} {{end}} -{{if or $python_package $notebook_job $pipeline}} - # Remove .gitkeep files for a non-empty project - {{skip "{{.project_name}}/src/.gitkeep"}} - {{skip "{{.project_name}}/resources/.gitkeep"}} -{{else}} - # Fully empty project, even remove the sample job +{{if not (or $python_package $notebook_job $pipeline)}} {{skip "{{.project_name}}/resources/sample_job.job.yml"}} {{skip "{{.project_name}}/resources/sample_job.py"}} {{end}} @@ -41,10 +36,10 @@ This file only contains template directives; it is skipped for the actual output {{end}} {{if or $sql_language $lakeflow_only}} - # Do not include tests when using SQL or when created from the Lakeflow workspace UI - # (until we have better UI support for tests) {{skip "{{.project_name}}/tests"}} {{skip "{{.project_name}}/fixtures"}} +{{else if not (or $python_package $notebook_job $pipeline)}} + {{skip "{{.project_name}}/tests/sample_*.py"}} {{end}} {{if $pydabs}} diff --git a/libs/template/templates/default/template/{{.project_name}}/resources/.gitkeep b/libs/template/templates/default/template/{{.project_name}}/resources/.gitkeep deleted file mode 100644 index 3e09c14c18..0000000000 --- a/libs/template/templates/default/template/{{.project_name}}/resources/.gitkeep +++ /dev/null @@ -1 +0,0 @@ -This folder is reserved for Databricks Asset Bundles resource definitions. diff --git a/libs/template/templates/default/template/{{.project_name}}/src/.gitkeep b/libs/template/templates/default/template/{{.project_name}}/src/.gitkeep deleted file mode 100644 index 0e0ed1e00b..0000000000 --- a/libs/template/templates/default/template/{{.project_name}}/src/.gitkeep +++ /dev/null @@ -1 +0,0 @@ -This folder is reserved for Databricks Asset Bundles source files. From f7fc7386176f8989640bbed2411e9556b116ebe1 Mon Sep 17 00:00:00 2001 From: Lennart Kats Date: Wed, 5 Nov 2025 15:58:22 +0100 Subject: [PATCH 2/5] Improve default-minimal template schema consistency MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Updates the default-minimal template to be fully consistent with the default-python template structure: - Remove project_name_short property (not used in minimal template) - Fix property order numbering (1-10) to match default-python - Update property descriptions to match default-python exactly - Improve welcome message: "For getting started with Python or SQL code..." - Add // comments to hidden properties for clarity - Fix default_catalog to use hive_metastore fallback like default-python These changes ensure users have a consistent experience across all default templates while maintaining the minimal template's purpose of providing a clean starting point without sample code. 
🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude --- acceptance/bundle/help/bundle-init/output.txt | 1 + .../templates/default-minimal/out.test.toml | 2 +- .../templates/default-minimal/output.txt | 12 ++- .../output/my_default_minimal/README.md | 3 +- .../databricks_template_schema.json | 79 +++++++++---------- 5 files changed, 47 insertions(+), 50 deletions(-) diff --git a/acceptance/bundle/help/bundle-init/output.txt b/acceptance/bundle/help/bundle-init/output.txt index b4b8552492..0713467068 100644 --- a/acceptance/bundle/help/bundle-init/output.txt +++ b/acceptance/bundle/help/bundle-init/output.txt @@ -5,6 +5,7 @@ Initialize using a bundle template to get started quickly. TEMPLATE_PATH optionally specifies which template to use. It can be one of the following: - default-python: The default Python template for Notebooks and Lakeflow - default-sql: The default SQL template for .sql files that run with Databricks SQL +- default-minimal: The minimal template, for advanced users - dbt-sql: The dbt SQL template (databricks.com/blog/delivering-cost-effective-data-real-time-dbt-and-databricks) - mlops-stacks: The Databricks MLOps Stacks template (github.com/databricks/mlops-stacks) - pydabs: A variant of the 'default-python' template that defines resources in Python instead of YAML diff --git a/acceptance/bundle/templates/default-minimal/out.test.toml b/acceptance/bundle/templates/default-minimal/out.test.toml index e092fd5ed6..d560f1de04 100644 --- a/acceptance/bundle/templates/default-minimal/out.test.toml +++ b/acceptance/bundle/templates/default-minimal/out.test.toml @@ -2,4 +2,4 @@ Local = true Cloud = false [EnvMatrix] - DATABRICKS_BUNDLE_ENGINE = ["terraform", "direct-exp"] + DATABRICKS_BUNDLE_ENGINE = ["terraform", "direct"] diff --git a/acceptance/bundle/templates/default-minimal/output.txt b/acceptance/bundle/templates/default-minimal/output.txt index a5e80bca84..5d0538b038 100644 --- a/acceptance/bundle/templates/default-minimal/output.txt +++ b/acceptance/bundle/templates/default-minimal/output.txt @@ -2,17 +2,15 @@ >>> [CLI] bundle init default-minimal --config-file ./input.json --output-dir output Welcome to the minimal Databricks Asset Bundle template! -NOTE: this minimal template is intended for starting from scratch for advanced users. -Use the default-python and default-sql templates for getting started, they include (optional) sample code! +This template creates a minimal project structure without sample code, ideal for advanced users. +(For getting started with Python or SQL code, use the default-python or default-sql templates instead.) -Your workspace at [DATABRICKS_URL] is used for initialization -(see https://docs.databricks.com/dev-tools/cli/profiles.html for how to change your profile). -Workspace to use (auto-detected, edit in 'my_default_minimal/databricks.yml'): [DATABRICKS_URL] +Your workspace at [DATABRICKS_URL] is used for initialization. +(See https://docs.databricks.com/dev-tools/cli/profiles.html for how to change your profile.) ✨ Your new project has been created in the 'my_default_minimal' directory! -Please refer to the README.md file for "getting started" instructions. -See also the documentation at https://docs.databricks.com/dev-tools/bundles/index.html. +To get started, refer to the project README.md file and the documentation at https://docs.databricks.com/dev-tools/bundles/index.html. 
>>> [CLI] bundle validate -t dev Name: my_default_minimal diff --git a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/README.md b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/README.md index 1a3623430a..2d7f9c9f17 100644 --- a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/README.md +++ b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/README.md @@ -16,7 +16,7 @@ Choose how you want to work on this project: https://docs.databricks.com/dev-tools/bundles/workspace. (b) Locally with an IDE like Cursor or VS Code, see - https://docs.databricks.com/vscode-ext. + https://docs.databricks.com/dev-tools/vscode-ext.html. (c) With command line tools, see https://docs.databricks.com/dev-tools/cli/databricks-cli.html @@ -60,4 +60,3 @@ with this project. It's also possible to interact with it directly using the CLI ``` $ uv run pytest ``` - diff --git a/libs/template/templates/default-minimal/databricks_template_schema.json b/libs/template/templates/default-minimal/databricks_template_schema.json index fdab731675..6c88858d93 100644 --- a/libs/template/templates/default-minimal/databricks_template_schema.json +++ b/libs/template/templates/default-minimal/databricks_template_schema.json @@ -1,6 +1,6 @@ { "template_dir": "../default", - "welcome_message": "Welcome to the minimal Databricks Asset Bundle template!\n\nNOTE: this minimal template is intended for starting from scratch for advanced users.\nUse the default-python and default-sql templates for getting started, they include (optional) sample code!\n\nYour workspace at {{workspace_host}} is used for initialization\n(see https://docs.databricks.com/dev-tools/cli/profiles.html for how to change your profile).", + "welcome_message": "Welcome to the minimal Databricks Asset Bundle template!\n\nThis template creates a minimal project structure without sample code, ideal for advanced users.\n(For getting started with Python or SQL code, use the default-python or default-sql templates instead.)\n\nYour workspace at {{workspace_host}} is used for initialization.\n(See https://docs.databricks.com/dev-tools/cli/profiles.html for how to change your profile.)", "properties": { "project_name": { "type": "string", @@ -10,89 +10,88 @@ "pattern": "^[A-Za-z0-9_]+$", "pattern_match_failure_message": "Name must consist of letters, numbers, and underscores." }, - "project_name_short": { - "skip_prompt_if": {}, - "type": "string", - "default": "{{with (regexp \"^(my_)?(.*)(_project|_app|_service)?$\").FindStringSubmatch .project_name}}{{index . 
2}}{{else}}{{.project_name}}{{end}}", - "description": "Short name for the project", - "order": 2 - }, "include_job": { + "//": "This property is always set to 'no' for default-minimal", "skip_prompt_if": {}, "type": "string", "default": "no", "enum": ["yes", "no"], - "description": "Include a stub (sample) notebook job", - "order": 3 + "description": "Include a job that runs a notebook", + "order": 2 }, "include_pipeline": { + "//": "This property is always set to 'no' for default-minimal", "skip_prompt_if": {}, "type": "string", "default": "no", "enum": ["yes", "no"], - "description": "Include a stub (sample) Lakeflow Declarative Pipeline", - "order": 4 + "description": "Include an ETL pipeline", + "order": 3 }, "include_python": { + "//": "This property is always set to 'no' for default-minimal", "skip_prompt_if": {}, "type": "string", "default": "no", "enum": ["yes", "no"], - "description": "Include a stub (sample) Python package", - "order": 5 + "description": "Include a sample Python package that builds into a wheel file", + "order": 4 }, "serverless": { + "//": "This property is always set to 'yes' for default-minimal", "skip_prompt_if": {}, "type": "string", "default": "yes", "enum": ["yes", "no"], "description": "Use serverless compute", + "order": 5 + }, + "default_catalog": { + "type": "string", + "default": "{{if eq (default_catalog) \"\"}}hive_metastore{{else}}{{default_catalog}}{{end}}", + "pattern": "^\\w*$", + "pattern_match_failure_message": "Invalid catalog name.", + "description": "Default catalog for any tables created by this project{{if eq (default_catalog) \"\"}} (leave blank when not using Unity Catalog){{end}}", "order": 6 }, - "enable_pydabs": { - "skip_prompt_if": {}, + "personal_schemas": { "type": "string", - "default": "no", - "enum": ["yes", "no"], - "description": "Enable PyDABs", + "description": "Use a personal schema for each user working on this project.\n(This is recommended. 
Your personal schema will be '{{.default_catalog}}.{{short_name}}'.)", + "default": "yes", + "enum": [ + "yes", + "no, I will customize the schema configuration later in databricks.yml" + ], "order": 7 }, "language": { + "//": "This property is always set to 'python' for default-minimal", "skip_prompt_if": {}, "type": "string", "default": "python", - "enum": ["python", "sql"], - "description": "Language", + "description": "Initial language for this project", + "enum": [ + "python", + "sql" + ], "order": 8 }, "lakeflow_only": { + "//": "This property is always set to 'no' for default-minimal", "skip_prompt_if": {}, "type": "string", "default": "no", - "enum": ["yes", "no"], - "description": "Lakeflow only", + "description": "Internal flag for lakeflow-only templates", "order": 9 }, - "default_catalog": { + "enable_pydabs": { + "//": "This property is always set to 'no' for default-minimal", "skip_prompt_if": {}, "type": "string", - "default": "{{default_catalog}}", - "pattern": "^\\w*$", - "pattern_match_failure_message": "Invalid catalog name.", - "description": "Default catalog for any tables created by this project{{if eq (default_catalog) \"\"}} (leave blank when not using Unity Catalog){{end}}", + "default": "no", + "description": "Use Python instead of YAML for resource definitions", "order": 10 - }, - "personal_schemas": { - "skip_prompt_if": {}, - "type": "string", - "description": "Use a personal schema for each user working on this project\n(this is recommended, your personal schema will be '{{.default_catalog}}.{{short_name}}')", - "default": "yes", - "enum": [ - "yes", - "no (advanced: I will customize the schema configuration later in databricks.yml)" - ], - "order": 11 } }, - "success_message": "Workspace to use (auto-detected, edit in '{{.project_name}}/databricks.yml'): {{workspace_host}}\n\n✨ Your new project has been created in the '{{.project_name}}' directory!\n\nPlease refer to the README.md file for \"getting started\" instructions.\nSee also the documentation at https://docs.databricks.com/dev-tools/bundles/index.html." + "success_message": "\n✨ Your new project has been created in the '{{.project_name}}' directory!\n\nTo get started, refer to the project README.md file and the documentation at https://docs.databricks.com/dev-tools/bundles/index.html." } From 46a3938fed110159e4a6ec486df8a9538963c879 Mon Sep 17 00:00:00 2001 From: Lennart Kats Date: Wed, 5 Nov 2025 15:59:06 +0100 Subject: [PATCH 3/5] Fix pyproject.toml for templates without Python package directories MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit When templates have no Python package directory (include_python=no AND no jobs/pipelines, OR lakeflow_only=yes), hatch cannot automatically detect what to build. This causes 'uv run pytest' to fail with build errors: "Unable to determine which files to ship inside the wheel using the following heuristics: " Add conditional [tool.hatch.build.targets.wheel] configuration with packages = ["src"] when $has_python_package_dir is false. This matches the same logic used in __preamble.tmpl to determine directory structure. The fix applies to: - default-minimal template (no sample code) - lakeflow-pipelines templates (SQL-only, no Python packages) This ensures 'uv run pytest' works correctly in all template variants. 
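Concretely, affected projects now end their generated pyproject.toml with the following section (a verbatim excerpt from the output files below), which tells hatch to ship the `src` directory instead of guessing:

```
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.build.targets.wheel]
packages = ["src"]
```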
🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude --- .../output/my_default_minimal/pyproject.toml | 5 ++++- .../python/output/my_lakeflow_pipelines/pyproject.toml | 3 +++ .../sql/output/my_lakeflow_pipelines/pyproject.toml | 3 +++ .../template/{{.project_name}}/pyproject.toml.tmpl | 9 +++++++++ 4 files changed, 19 insertions(+), 1 deletion(-) diff --git a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/pyproject.toml b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/pyproject.toml index f9dcf5c2aa..63f5447cf7 100644 --- a/acceptance/bundle/templates/default-minimal/output/my_default_minimal/pyproject.toml +++ b/acceptance/bundle/templates/default-minimal/output/my_default_minimal/pyproject.toml @@ -8,7 +8,7 @@ dependencies = [ # See also https://docs.databricks.com/dev-tools/bundles/library-dependencies # # LIMITATION: for pipelines, dependencies are cached during development; - # add dependencies to the 'environment' section of pipeline.yml file instead + # add dependencies to the 'environment' section of your pipeline.yml file instead ] [dependency-groups] @@ -25,5 +25,8 @@ main = "my_default_minimal.main:main" requires = ["hatchling"] build-backend = "hatchling.build" +[tool.hatch.build.targets.wheel] +packages = ["src"] + [tool.black] line-length = 125 diff --git a/acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/pyproject.toml b/acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/pyproject.toml index c5cf343c0b..9df288afd2 100644 --- a/acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/pyproject.toml +++ b/acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/pyproject.toml @@ -25,5 +25,8 @@ main = "my_lakeflow_pipelines.main:main" requires = ["hatchling"] build-backend = "hatchling.build" +[tool.hatch.build.targets.wheel] +packages = ["src"] + [tool.black] line-length = 125 diff --git a/acceptance/bundle/templates/lakeflow-pipelines/sql/output/my_lakeflow_pipelines/pyproject.toml b/acceptance/bundle/templates/lakeflow-pipelines/sql/output/my_lakeflow_pipelines/pyproject.toml index c5cf343c0b..9df288afd2 100644 --- a/acceptance/bundle/templates/lakeflow-pipelines/sql/output/my_lakeflow_pipelines/pyproject.toml +++ b/acceptance/bundle/templates/lakeflow-pipelines/sql/output/my_lakeflow_pipelines/pyproject.toml @@ -25,5 +25,8 @@ main = "my_lakeflow_pipelines.main:main" requires = ["hatchling"] build-backend = "hatchling.build" +[tool.hatch.build.targets.wheel] +packages = ["src"] + [tool.black] line-length = 125 diff --git a/libs/template/templates/default/template/{{.project_name}}/pyproject.toml.tmpl b/libs/template/templates/default/template/{{.project_name}}/pyproject.toml.tmpl index bf8c74c61c..7ca8241ee4 100644 --- a/libs/template/templates/default/template/{{.project_name}}/pyproject.toml.tmpl +++ b/libs/template/templates/default/template/{{.project_name}}/pyproject.toml.tmpl @@ -27,6 +27,15 @@ main = "{{.project_name}}.main:main" [build-system] requires = ["hatchling"] build-backend = "hatchling.build" +{{- $has_python_package_dir := and (or (eq .include_python "yes") (eq .include_job "yes") (eq .include_pipeline "yes")) (not (eq .lakeflow_only "yes")) }} +{{- if not $has_python_package_dir }} +{{- /* Hatch can normally automatically detect the package directory, + * but requires an explicit setting when the package directory is empty. 
+ */}} + +[tool.hatch.build.targets.wheel] +packages = ["src"] +{{- end }} [tool.black] line-length = 125 From e8ab28a526a52d990a1076265190212627ace213 Mon Sep 17 00:00:00 2001 From: Lennart Kats Date: Wed, 5 Nov 2025 21:02:20 +0100 Subject: [PATCH 4/5] Add NEXT_CHANGELOG entry for default-minimal template --- NEXT_CHANGELOG.md | 1 + 1 file changed, 1 insertion(+) diff --git a/NEXT_CHANGELOG.md b/NEXT_CHANGELOG.md index 92780f5207..0d7e7d13e6 100644 --- a/NEXT_CHANGELOG.md +++ b/NEXT_CHANGELOG.md @@ -10,6 +10,7 @@ ### Dependency updates ### Bundles +* Add `default-minimal` template for users who want a clean slate without sample code ([#3885](https://github.com/databricks/cli/pull/3885)) * Updated the default-python template to follow the Lakeflow conventions: pipelines as source files, pyproject.toml ([#3712](https://github.com/databricks/cli/pull/3712)). * Fix a permissions bug adding second IS\_OWNER and causing "The job must have exactly one owner." error. Introduced in 0.274.0. ([#3850](https://github.com/databricks/cli/pull/3850)) From 5ff9f2c6c8e345ec986a7f4c4fdf1cc5bad82b4f Mon Sep 17 00:00:00 2001 From: Lennart Kats Date: Mon, 10 Nov 2025 15:06:25 +0100 Subject: [PATCH 5/5] Add acceptance test check for empty src/ and resources/ directories in default-minimal template MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Addresses PR feedback to verify that the template renderer correctly preserves empty directories without requiring .gitkeep files. 🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude --- acceptance/bundle/templates/default-minimal/script | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/acceptance/bundle/templates/default-minimal/script b/acceptance/bundle/templates/default-minimal/script index 4a4730d7c2..bce20c6ed9 100644 --- a/acceptance/bundle/templates/default-minimal/script +++ b/acceptance/bundle/templates/default-minimal/script @@ -1,6 +1,11 @@ trace $CLI bundle init default-minimal --config-file ./input.json --output-dir output cd output/my_default_minimal + +# Verify that empty directories are preserved +[ -d "src" ] || exit 1 +[ -d "resources" ] || exit 1 + trace $CLI bundle validate -t dev trace $CLI bundle validate -t prod
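For reference, a quick way to exercise the new template end to end, mirroring the acceptance script above (a sketch only: `my_project` is the schema's default project name, and `bundle init` still prompts for the remaining visible properties — project name, default catalog, and personal schemas):

```
databricks bundle init default-minimal
cd my_project
test -d src && test -d resources   # empty directories exist without .gitkeep files
databricks bundle validate -t dev
```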