
Commit 4229556

Reuse 'default' in pydabs template (#3826)
## Changes

Base `pydabs` template on `default` template instead of `default-python`.

## Why

We are switching from the `default-python` template to one based on `default`.

## Tests

Existing acceptance tests.
1 parent 6576d8e commit 4229556

64 files changed: +519, −413 lines


.codegen.json

Lines changed: 2 additions & 2 deletions
```diff
@@ -7,8 +7,8 @@
     "experimental/python/databricks/bundles/version.py": "__version__ = \"$VERSION\"",
     "experimental/python/pyproject.toml": "version = \"$VERSION\"",
     "experimental/python/uv.lock": "name = \"databricks-bundles\"\nversion = \"$VERSION\"",
-    "libs/template/templates/experimental-jobs-as-code/library/versions.tmpl": "{{define \"latest_databricks_bundles_version\" -}}$VERSION{{- end}}"
-    "libs/template/templates/default-python/library/versions.tmpl": "{{define \"latest_databricks_bundles_version\" -}}$VERSION{{- end}}"
+    "libs/template/templates/experimental-jobs-as-code/library/versions.tmpl": "{{define \"latest_databricks_bundles_version\" -}}$VERSION{{- end}}",
+    "libs/template/templates/default/library/versions.tmpl": "{{define \"latest_databricks_bundles_version\" -}}$VERSION{{- end}}"
   },
   "toolchain": {
     "required": [
```
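The `.codegen.json` mapping above tells the release tooling which files carry a `$VERSION` placeholder to pin, which is why the `versions.tmpl` path had to move from `default-python` to `default`. As a rough illustration (not the actual codegen implementation; the `pin_version` helper and the `0.260.0` version string are made up), the substitution amounts to:

```python
# Rough sketch of the $VERSION pinning that .codegen.json describes.
# NOTE: pin_version and "0.260.0" are hypothetical; the real substitution
# is performed by the Databricks release tooling, not this code.

def pin_version(template: str, version: str) -> str:
    """Replace every $VERSION placeholder with a concrete release version."""
    return template.replace("$VERSION", version)

versions_tmpl = '{{define "latest_databricks_bundles_version" -}}$VERSION{{- end}}'
pinned = pin_version(versions_tmpl, "0.260.0")
# pinned now embeds the concrete version and no longer contains "$VERSION"
```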

acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -36,7 +36,7 @@ with this project. It's also possible to interact with it directly using the CLI
 
    This deploys everything that's defined for this project.
    For example, the default template would deploy a pipeline called
-   `[dev yourname] lakeflow_pipelines_etl` to your workspace.
+   `[dev yourname] my_lakeflow_pipelines_etl` to your workspace.
    You can find that resource by opening your workpace and clicking on **Jobs & Pipelines**.
 
 3. Similarly, to deploy a production copy, type:
```

acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/resources/lakeflow_pipelines_etl.pipeline.yml renamed to acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/resources/my_lakeflow_pipelines_etl.pipeline.yml

Lines changed: 4 additions & 4 deletions
```diff
@@ -2,17 +2,17 @@
 
 resources:
   pipelines:
-    lakeflow_pipelines_etl:
-      name: lakeflow_pipelines_etl
+    my_lakeflow_pipelines_etl:
+      name: my_lakeflow_pipelines_etl
       ## Catalog is required for serverless compute
       catalog: ${var.catalog}
       schema: ${var.schema}
       serverless: true
-      root_path: "../src/lakeflow_pipelines_etl"
+      root_path: "../src/my_lakeflow_pipelines_etl"
 
       libraries:
         - glob:
-            include: ../src/lakeflow_pipelines_etl/transformations/**
+            include: ../src/my_lakeflow_pipelines_etl/transformations/**
 
       environment:
         dependencies:
```

acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/resources/sample_job.job.yml

Lines changed: 1 addition & 1 deletion
```diff
@@ -24,7 +24,7 @@ resources:
       tasks:
         - task_key: refresh_pipeline
           pipeline_task:
-            pipeline_id: ${resources.pipelines.lakeflow_pipelines_etl.id}
+            pipeline_id: ${resources.pipelines.my_lakeflow_pipelines_etl.id}
 
       environments:
         - environment_key: default
```
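The job definition has to change together with the pipeline rename because `${resources.pipelines.<key>.id}` references resolve by resource key. A minimal sketch of that lookup (illustrative only, not the CLI's actual resolver; the resource map and the `pipeline-123` id are invented):

```python
import re

# Minimal sketch of ${...} reference resolution by resource key.
# NOTE: the resources dict and "pipeline-123" id are hypothetical;
# the real resolution happens inside the Databricks CLI.

resources = {"pipelines": {"my_lakeflow_pipelines_etl": {"id": "pipeline-123"}}}

def resolve(value: str, resources: dict) -> str:
    """Substitute ${dotted.path} references by walking the resource tree."""
    def lookup(match: re.Match) -> str:
        node = {"resources": resources}
        for part in match.group(1).split("."):
            node = node[part]  # raises KeyError if the key was renamed away
        return node
    return re.sub(r"\$\{([^}]+)\}", lookup, value)

resolve("${resources.pipelines.my_lakeflow_pipelines_etl.id}", resources)
```

A reference still using the old `lakeflow_pipelines_etl` key would fail to resolve, which is why the rename touches the job file as well.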

acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/src/lakeflow_pipelines_etl/README.md renamed to acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/src/my_lakeflow_pipelines_etl/README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -15,6 +15,6 @@ To get started, go to the `transformations` folder -- most of the relevant sourc
 * Take a look at the sample called "sample_trips_my_lakeflow_pipelines.py" to get familiar with the syntax.
   Read more about the syntax at https://docs.databricks.com/dlt/python-ref.html.
 * If you're using the workspace UI, use `Run file` to run and preview a single transformation.
-* If you're using the CLI, use `databricks bundle run lakeflow_pipelines_etl --select sample_trips_my_lakeflow_pipelines` to run a single transformation.
+* If you're using the CLI, use `databricks bundle run my_lakeflow_pipelines_etl --select sample_trips_my_lakeflow_pipelines` to run a single transformation.
 
 For more tutorials and reference material, see https://docs.databricks.com/dlt.
```

acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/src/lakeflow_pipelines_etl/explorations/sample_exploration.ipynb renamed to acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/src/my_lakeflow_pipelines_etl/explorations/sample_exploration.ipynb

File renamed without changes.

acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/src/lakeflow_pipelines_etl/transformations/sample_trips_my_lakeflow_pipelines.py renamed to acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/src/my_lakeflow_pipelines_etl/transformations/sample_trips_my_lakeflow_pipelines.py

File renamed without changes.

acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/src/lakeflow_pipelines_etl/transformations/sample_zones_my_lakeflow_pipelines.py renamed to acceptance/bundle/templates/lakeflow-pipelines/python/output/my_lakeflow_pipelines/src/my_lakeflow_pipelines_etl/transformations/sample_zones_my_lakeflow_pipelines.py

File renamed without changes.

acceptance/bundle/templates/lakeflow-pipelines/sql/output/my_lakeflow_pipelines/README.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -36,7 +36,7 @@ with this project. It's also possible to interact with it directly using the CLI
 
    This deploys everything that's defined for this project.
    For example, the default template would deploy a pipeline called
-   `[dev yourname] lakeflow_pipelines_etl` to your workspace.
+   `[dev yourname] my_lakeflow_pipelines_etl` to your workspace.
    You can find that resource by opening your workpace and clicking on **Jobs & Pipelines**.
 
 3. Similarly, to deploy a production copy, type:
```

acceptance/bundle/templates/lakeflow-pipelines/sql/output/my_lakeflow_pipelines/resources/lakeflow_pipelines_etl.pipeline.yml renamed to acceptance/bundle/templates/lakeflow-pipelines/sql/output/my_lakeflow_pipelines/resources/my_lakeflow_pipelines_etl.pipeline.yml

Lines changed: 4 additions & 4 deletions
```diff
@@ -2,17 +2,17 @@
 
 resources:
   pipelines:
-    lakeflow_pipelines_etl:
-      name: lakeflow_pipelines_etl
+    my_lakeflow_pipelines_etl:
+      name: my_lakeflow_pipelines_etl
       ## Catalog is required for serverless compute
       catalog: ${var.catalog}
       schema: ${var.schema}
       serverless: true
-      root_path: "../src/lakeflow_pipelines_etl"
+      root_path: "../src/my_lakeflow_pipelines_etl"
 
       libraries:
         - glob:
-            include: ../src/lakeflow_pipelines_etl/transformations/**
+            include: ../src/my_lakeflow_pipelines_etl/transformations/**
 
       environment:
         dependencies:
```

0 commit comments
