The 'default_python' project was generated by using the default-python template.

For documentation on the Databricks Asset Bundles format used for this project,
and for CI/CD configuration, see https://docs.databricks.com/aws/en/dev-tools/bundles.

## Getting started

Choose how you want to work on this project:

(a) Directly in your Databricks workspace, see
    https://docs.databricks.com/dev-tools/bundles/workspace.

(b) Locally with an IDE like Cursor or VS Code, see
    https://docs.databricks.com/vscode-ext.

(c) With command line tools, see
    https://docs.databricks.com/dev-tools/cli/databricks-cli.html.

Dependencies for this project should be installed using uv:

* Make sure you have the uv package manager installed.
  It's an alternative to tools like pip: https://docs.astral.sh/uv/getting-started/installation/.
* Run `uv sync --dev` to install the project's dependencies.
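
If you don't have uv yet, a minimal setup from a POSIX shell might look like the
following sketch (the standalone installer shown here is just one of the documented
install options; see the link above for alternatives such as pip or Homebrew):

```
# Install uv (standalone installer; see the uv docs for other methods)
$ curl -LsSf https://astral.sh/uv/install.sh | sh

# Create or refresh the project's virtual environment with dev dependencies
$ uv sync --dev
```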

# Using this project using the CLI

The Databricks workspace and IDE extensions provide a graphical interface for working
with this project. It's also possible to interact with it directly using the CLI,
provided the Databricks CLI is installed
(https://docs.databricks.com/dev-tools/cli/databricks-cli.html):

1. Authenticate to your Databricks workspace, if you have not done so already:

    ```
    $ databricks configure
    ```
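
    If you prefer OAuth over a personal access token, `databricks auth login` is an
    alternative, and `databricks current-user me` is a quick way to confirm that
    authentication works. A sketch, with the workspace URL as a placeholder:

    ```
    # OAuth login (alternative to `databricks configure`); replace the host with your workspace URL
    $ databricks auth login --host https://my-workspace.cloud.databricks.com

    # Sanity check: prints the authenticated user
    $ databricks current-user me
    ```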

2. To deploy a development copy of this project, type:

    ```
    $ databricks bundle deploy --target dev
    ```

    This deploys everything that's defined for this project.
    For example, the default template would deploy a job called
    `[dev yourname] default_python_job` to your workspace.
    You can find that job by opening your workspace and clicking on **Jobs & Pipelines**.
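
    To see how the bundle configuration resolves for a target, before or after a
    deploy, the CLI's `validate` command can help:

    ```
    $ databricks bundle validate --target dev
    ```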

3. Similarly, to deploy a production copy, type:

    ```
    $ databricks bundle deploy --target prod
    ```

    Note that the default job's schedule
    is paused when deploying in development mode (see
    https://docs.databricks.com/dev-tools/bundles/deployment-modes.html).
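
    Recent versions of the CLI also include a `summary` command that lists the
    resources defined by the bundle and where they were deployed, which can be
    handy after a production deploy (assuming your CLI version ships it):

    ```
    $ databricks bundle summary --target prod
    ```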

4. To run a job or pipeline, use the "run" command:

    ```
    $ databricks bundle run
    ```
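
    Without arguments, the CLI asks which resource to run; you can also name a
    resource key directly. For example, assuming the job defined by this template
    uses the key `default_python_job`:

    ```
    $ databricks bundle run default_python_job --target dev
    ```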

5. Finally, to run tests locally, use `pytest`:

    ```
    $ uv run pytest
    ```
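
    The `uv run` prefix ensures pytest executes inside the project's virtual
    environment; additional pytest options pass through as usual, e.g. (assuming
    the template's `tests/` layout):

    ```
    $ uv run pytest tests -v
    ```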

Optionally, install the Databricks extension for Visual Studio Code for local
development from https://docs.databricks.com/dev-tools/vscode-ext.html. It can
configure your virtual environment and set up Databricks Connect for running unit
tests locally. When not using the extension, consult your development environment's
documentation and/or the Databricks Connect documentation for setting up your
environment manually
(https://docs.databricks.com/en/dev-tools/databricks-connect/python/index.html).
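
As a rough sketch of the manual route, with valid workspace credentials configured
and compute for Databricks Connect set up per its documentation, an end-to-end check
could look like this:

```
# Add Databricks Connect as a dev dependency (skip if the project already includes it)
$ uv add --dev databricks-connect

# Smoke test: build a remote Spark session and run a trivial query
$ uv run python -c "from databricks.connect import DatabricksSession; print(DatabricksSession.builder.getOrCreate().range(5).count())"
```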