## Changes
This merges `CLAUDE.md` into `AGENTS.md`. The former will now be a
symlink, just like `.cursorrules`.
## Review
* This PR was AI-generated. As a human reviewer, I confirmed it just
merges the files and then re-orders the sections in a sensible way. Only
minimal phrasing rewrites were included.
---------
Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
This file provides guidance to AI assistants when working with code in this repository.

# Project Overview

This is the Databricks CLI, a command-line interface for interacting with Databricks workspaces and managing Databricks Asset Bundles (DABs). The project is written in Go and follows a modular architecture.

# General Rules

When moving code from one place to another, please don't unnecessarily change the code or omit parts.

# Development Commands

### Building and Testing

- `make build` - Build the CLI binary
- `make test` - Run unit tests for all packages
- `go test ./acceptance -run TestAccept/bundle/<path>/<to>/<folder> -tail -test.v` - Run a single acceptance test
- `make integration` - Run integration tests (requires environment variables)
- `make cover` - Generate test coverage reports

### Code Quality

- `make lint` - Run linter on changed files only (uses lintdiff.py)
- `make lintfull` - Run full linter with fixes (golangci-lint)
- `make ws` - Run whitespace linter
- `make fmt` - Format code (Go, Python, YAML)
- `make checks` - Run quick checks (tidy, whitespace, links)
- `libs/sync/` - File synchronization between local and remote

### Key Concepts

**Bundles**: Configuration-driven deployments of Databricks resources (jobs, pipelines, etc.). The bundle system uses a mutator pattern where each transformation is a separate, testable component.

**Mutators**: Transform bundle configuration through a pipeline. Located in `bundle/config/mutator/` and `bundle/mutator/`. Each mutator implements the `Mutator` interface.
67
+
68
+
**Direct vs Terraform Deployment**: The CLI supports two deployment modes controlled by `DATABRICKS_CLI_DEPLOYMENT` environment variable:
69
+
-`terraform` (default) - Uses Terraform for resource management
70
+
-`direct` - Direct API calls without Terraform

# Code Style and Patterns

## Go

Please make sure code that you author is consistent with the codebase and concise.

The code should be self-documenting based on the code and function names.
Use modern idiomatic Golang features (version 1.24+). Specifically:

- Use builtin min() and max() where possible (works on any type and any number of values).
- Do not capture the for-range variable; since Go 1.22 a new copy of the variable is created for each loop iteration.

### Configuration Patterns

- Bundle config uses `dyn.Value` for dynamic typing
- Config loading supports includes, variable interpolation, and target overrides
- Schema generation is automated from Go struct tags

## Python

When writing Python scripts, we bias for conciseness. We think of Python in this code base as scripts.

- Use Python 3.11
- After you're done, format your code with "ruff format -n <path>"
- Use "#!/usr/bin/env python3" shebang.

# Testing

### Test Types

- **Unit tests**: Standard Go tests alongside source files
- **Integration tests**: `integration/` directory, requires live Databricks workspace
- **Acceptance tests**: `acceptance/` directory, uses mock HTTP server

Each file like process_target_mode.go should have a corresponding test file
like process_target_mode_test.go. If you add new functionality to a file,

When writing tests, please don't include an explanation in each
test case in your responses. I am just interested in the tests.

### Acceptance Tests

- Located in `acceptance/` with nested directory structure
- Each test directory contains `databricks.yml`, `script`, and `output.txt`
- Run with `go test ./acceptance -run TestAccept/bundle/<path>/<to>/<folder> -tail -test.v`
- Use `-update` flag to regenerate expected output files
- When a test fails because its expected output is stale, re-run it with the `-update` flag instead of editing `output.txt` directly

# Logging

Use the following for logging:

```
import "github.com/databricks/cli/libs/log"

log.Infof(ctx, "...")
log.Debugf(ctx, "...")
log.Warnf(ctx, "...")
log.Errorf(ctx, "...")
```

Note that the 'ctx' variable here is something that should be passed in as
an argument by the caller. We should not use context.Background() like we do in tests.

Use cmdio.LogString to print to stdout:

```
import "github.com/databricks/cli/libs/cmdio"

cmdio.LogString(ctx, "...")
```

# Specific File Guides

## databricks_template_schema.json

A databricks_template_schema.json file is used to configure bundle templates.

- Helpers such as {{default_catalog}} and {{short_name}} can be used within property descriptors.
- Properties can be referenced in messages and descriptions using {{.property_name}}. {{.project_name}} is an example.

# Development Tips

- Run `make checks fmt lint` before committing
- Use `make test-update` to regenerate acceptance test outputs after changes
- The CLI binary supports both `databricks` and `pipelines` command modes based on the executable name
- Resource definitions in `bundle/config/resources/` are auto-generated from OpenAPI specs