
Commit c90a6e2

Add CLAUDE.md file (#3489)

## Why

Following best practices listed [here](https://www.anthropic.com/engineering/claude-code-best-practices):

- ran /init
- made some small changes to the initial content

1 file changed: CLAUDE.md (+89, -0 lines)

# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

This is the Databricks CLI, a command-line interface for interacting with Databricks workspaces and managing Databricks Asset Bundles (DABs). The project is written in Go and follows a modular architecture.

## Development Commands

### Building and Testing

- `make build` - Build the CLI binary
- `make test` - Run unit tests for all packages
- `go test ./acceptance -run TestAccept/bundle/<path>/<to>/<folder> -tail -test.v` - Run a single acceptance test
- `make integration` - Run integration tests (requires environment variables)
- `make cover` - Generate test coverage reports

### Code Quality

- `make lint` - Run linter on changed files only (uses lintdiff.py)
- `make lintfull` - Run full linter with fixes (golangci-lint)
- `make ws` - Run whitespace linter
- `make fmt` - Format code (Go, Python, YAML)
- `make checks` - Run quick checks (tidy, whitespace, links)

### Specialized Commands

- `make schema` - Generate bundle JSON schema
- `make docs` - Generate bundle documentation
- `make generate` - Generate CLI code from OpenAPI spec (requires universe repo)

## Architecture

### Core Components

**cmd/** - CLI command structure using the Cobra framework (a minimal registration sketch follows this list)
- `cmd/cmd.go` - Main command setup and subcommand registration
- `cmd/bundle/` - Bundle-related commands (deploy, validate, etc.)
- `cmd/workspace/` - Workspace API commands (auto-generated)
- `cmd/account/` - Account-level API commands (auto-generated)

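Commands follow the usual Cobra pattern of assembling a tree of `cobra.Command` values and registering subcommands on a root. A minimal, self-contained sketch of that pattern (the command names mirror the real CLI, but the wiring is illustrative, not the repo's actual `cmd/cmd.go`):

```go
package main

import (
	"fmt"
	"os"

	"github.com/spf13/cobra"
)

func main() {
	// Root command, analogous to the top-level `databricks` command.
	root := &cobra.Command{
		Use:   "databricks",
		Short: "Databricks CLI",
	}

	// Command group, analogous to cmd/bundle/.
	bundleCmd := &cobra.Command{
		Use:   "bundle",
		Short: "Work with Databricks Asset Bundles",
	}

	// Leaf command, analogous to `databricks bundle validate`.
	validateCmd := &cobra.Command{
		Use:   "validate",
		Short: "Validate the bundle configuration",
		RunE: func(cmd *cobra.Command, args []string) error {
			fmt.Println("validating bundle...")
			return nil
		},
	}

	bundleCmd.AddCommand(validateCmd)
	root.AddCommand(bundleCmd)

	if err := root.Execute(); err != nil {
		os.Exit(1)
	}
}
```
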
**bundle/** - Core bundle functionality for Databricks Asset Bundles
- `bundle/bundle.go` - Main Bundle struct and lifecycle management
- `bundle/config/` - Configuration loading, validation, and schema
- `bundle/deploy/` - Deployment logic (Terraform and direct modes)
- `bundle/mutator/` - Configuration transformation pipeline
- `bundle/phases/` - High-level deployment phases

**libs/** - Shared libraries and utilities
- `libs/dyn/` - Dynamic configuration value manipulation
- `libs/filer/` - File system abstraction (local, DBFS, workspace); see the interface sketch after this list
- `libs/auth/` - Databricks authentication handling
- `libs/sync/` - File synchronization between local and remote

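The point of a file system abstraction is that callers program against one small interface while local-disk, DBFS, and workspace-file backends are swapped in behind it. A hypothetical sketch of such an interface (the real one in `libs/filer/` has its own, richer signature):

```go
package main

import (
	"context"
	"fmt"
	"io"
	"strings"
)

// Filer is a hypothetical file-system abstraction: one interface,
// many backends (local, DBFS, workspace, ...).
type Filer interface {
	Read(ctx context.Context, name string) (io.Reader, error)
	Write(ctx context.Context, name string, r io.Reader) error
}

// memFiler is a toy in-memory backend used only for illustration.
type memFiler struct {
	files map[string]string
}

func (m *memFiler) Read(ctx context.Context, name string) (io.Reader, error) {
	content, ok := m.files[name]
	if !ok {
		return nil, fmt.Errorf("no such file: %s", name)
	}
	return strings.NewReader(content), nil
}

func (m *memFiler) Write(ctx context.Context, name string, r io.Reader) error {
	var b strings.Builder
	if _, err := io.Copy(&b, r); err != nil {
		return err
	}
	m.files[name] = b.String()
	return nil
}

func main() {
	// Callers hold a Filer; the backend is interchangeable.
	var f Filer = &memFiler{files: map[string]string{}}
	_ = f.Write(context.Background(), "databricks.yml", strings.NewReader("bundle:\n  name: demo\n"))
	r, _ := f.Read(context.Background(), "databricks.yml")
	out, _ := io.ReadAll(r)
	fmt.Print(string(out))
}
```
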
### Key Concepts

**Bundles**: Configuration-driven deployments of Databricks resources (jobs, pipelines, etc.). The bundle system uses a mutator pattern where each transformation is a separate, testable component.

**Mutators**: Transform bundle configuration through a pipeline. Located in `bundle/config/mutator/` and `bundle/mutator/`. Each mutator implements the `Mutator` interface.

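A minimal sketch of this pattern, assuming a simplified interface and a stand-in config struct (the real `Mutator` interface and bundle types in the repo have different signatures):

```go
package main

import (
	"context"
	"fmt"
)

// Config is a stand-in for the bundle configuration tree; the real
// structure lives in bundle/config and is far richer.
type Config struct {
	Name   string
	Target string
}

// Mutator is a simplified sketch of the interface: a named,
// independently testable transformation over the configuration.
type Mutator interface {
	Name() string
	Apply(ctx context.Context, cfg *Config) error
}

// defaultTarget is an example mutator that fills in a default target.
type defaultTarget struct{}

func (defaultTarget) Name() string { return "DefaultTarget" }

func (defaultTarget) Apply(ctx context.Context, cfg *Config) error {
	if cfg.Target == "" {
		cfg.Target = "dev"
	}
	return nil
}

// apply runs mutators as a pipeline, stopping at the first error.
func apply(ctx context.Context, cfg *Config, ms ...Mutator) error {
	for _, m := range ms {
		if err := m.Apply(ctx, cfg); err != nil {
			return fmt.Errorf("%s: %w", m.Name(), err)
		}
	}
	return nil
}

func main() {
	cfg := &Config{Name: "my_bundle"}
	if err := apply(context.Background(), cfg, defaultTarget{}); err != nil {
		fmt.Println(err)
		return
	}
	fmt.Printf("%+v\n", *cfg) // {Name:my_bundle Target:dev}
}
```
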
**Direct vs Terraform Deployment**: The CLI supports two deployment modes, controlled by the `DATABRICKS_CLI_DEPLOYMENT` environment variable (a selection sketch follows this list):
- `terraform` (default) - Uses Terraform for resource management
- `direct` - Direct API calls without Terraform

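Illustratively, mode selection on that variable could look like the following (a sketch only, not the repo's actual selection code):

```go
package main

import (
	"fmt"
	"os"
)

func main() {
	// Read the deployment mode; default to "terraform" when unset,
	// matching the behavior described above.
	mode := os.Getenv("DATABRICKS_CLI_DEPLOYMENT")
	if mode == "" {
		mode = "terraform"
	}

	switch mode {
	case "terraform":
		fmt.Println("deploying via Terraform")
	case "direct":
		fmt.Println("deploying via direct API calls")
	default:
		fmt.Printf("unknown deployment mode: %q\n", mode)
	}
}
```
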
## Testing

### Test Types

- **Unit tests**: Standard Go tests alongside source files
- **Integration tests**: `integration/` directory, requires live Databricks workspace
- **Acceptance tests**: `acceptance/` directory, uses mock HTTP server

### Acceptance Tests

- Located in `acceptance/` with a nested directory structure
- Each test directory contains `databricks.yml`, `script`, and `output.txt`
- Run with `go test ./acceptance -run TestAccept/bundle/<path>/<to>/<folder> -tail -test.v`
- Use the `-update` flag to regenerate expected output files
- When a test fails because its recorded output is stale, rerun it with the `-update` flag instead of editing `output.txt` by hand

## Code Patterns

### Configuration

- Bundle config uses `dyn.Value` for dynamic typing (see the sketch after this list)
- Config loading supports includes, variable interpolation, and target overrides
- Schema generation is automated from Go struct tags

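To illustrate the idea of dynamic typing for config, here is a hypothetical, stripped-down `Value` type (the real `dyn.Value` in `libs/dyn/` is richer and has its own API):

```go
package main

import "fmt"

// Value is a hypothetical tagged union for a dynamically typed
// configuration value: it can hold scalars, maps, etc. without a
// fixed Go struct describing the config shape up front.
type Value struct {
	v any
}

// V wraps an arbitrary Go value.
func V(v any) Value { return Value{v: v} }

// Get walks one level into a map-valued Value by key.
func (val Value) Get(key string) (Value, bool) {
	m, ok := val.v.(map[string]Value)
	if !ok {
		return Value{}, false
	}
	child, ok := m[key]
	return child, ok
}

func main() {
	// A config fragment such as `bundle: {name: my_bundle}` is
	// representable without declaring a matching struct:
	cfg := V(map[string]Value{
		"bundle": V(map[string]Value{
			"name": V("my_bundle"),
		}),
	})

	if bundle, ok := cfg.Get("bundle"); ok {
		if name, ok := bundle.Get("name"); ok {
			fmt.Println(name.v) // my_bundle
		}
	}
}
```
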
## Development Tips

- Run `make checks fmt lint` before committing
- Use `make test-update` to regenerate acceptance test outputs after changes
- The CLI binary supports both `databricks` and `pipelines` command modes based on the executable name
- Resource definitions in `bundle/config/resources/` are auto-generated from OpenAPI specs