# AGENTS.md

This file provides guidance to LLM-based agents when working with code in this repository.

## Common Development Commands

### Building and Testing
- `make build` - Build the provider binary
- `make install` - Build and install the provider locally for Terraform development
- `make test` - Run unit tests with linting
- `make lint` - Run linting with staticcheck (always use this instead of running staticcheck directly)

### Code Quality
- `make fmt` - Format Go code with goimports and gofmt
- `make fmt-docs` - Format code samples in documentation (requires terrafmt)
- `make ws` - Validate whitespace in files

### Development Helpers
- `make vendor` - Populate vendor directory with dependencies
- `make schema` - Print provider schema
- `make diff-schema` - Compare current schema with previous version (useful for migration verification)

### Single Test Execution
For unit tests:
```bash
go test -v -run TestSpecificTest ./path/to/package
```

For integration tests:
- First, load the environment to use from `~/.databricks/debug-env.json`. The keys in this file are environment names, and the values are maps of environment variable names to values. Based on the `*Level` test function used, load the appropriate environment (sketched below):
  - WorkspaceLevel: "workspace"
  - AccountLevel: "account"
  - UnityWorkspaceLevel: "ucws"
  - UnityAccountLevel: "ucacct"
- Then run the test:
```bash
go test -v -run TestAccResourceName ./path/to/package
```
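
The layout of `debug-env.json` is easiest to see in code. Below is a hypothetical loader sketch (the helper name `loadDebugEnv` and the JSON values are invented; the repo's actual test harness may wire this differently):

```go
package qa // hypothetical placement; sketch only

import (
	"encoding/json"
	"os"
	"path/filepath"
)

// loadDebugEnv (hypothetical helper) reads ~/.databricks/debug-env.json, which
// maps environment names to env-var maps, e.g.
//   {"workspace": {"DATABRICKS_HOST": "https://...", "DATABRICKS_TOKEN": "..."}}
// and exports the variables for the requested environment
// ("workspace", "account", "ucws", or "ucacct").
func loadDebugEnv(name string) error {
	home, err := os.UserHomeDir()
	if err != nil {
		return err
	}
	raw, err := os.ReadFile(filepath.Join(home, ".databricks", "debug-env.json"))
	if err != nil {
		return err
	}
	var envs map[string]map[string]string
	if err := json.Unmarshal(raw, &envs); err != nil {
		return err
	}
	for k, v := range envs[name] {
		os.Setenv(k, v)
	}
	return nil
}
```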

## Code Architecture

### Dual Provider Implementation
This codebase implements two Terraform provider architectures that are muxed together:
- **SDKv2 Provider**: Legacy implementation in root directories (e.g., `catalog/`, `clusters/`, `jobs/`)
- **Plugin Framework Provider**: New implementation in `internal/providers/pluginfw/products/`
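
A minimal sketch of what such muxing typically looks like with HashiCorp's `terraform-plugin-mux` (the constructors `sdkv2Provider` and `pluginFWProvider` are hypothetical stand-ins; the actual wiring lives in `internal/providers/providers.go` and may differ):

```go
package providers // sketch only

import (
	"context"

	"github.com/hashicorp/terraform-plugin-framework/providerserver"
	"github.com/hashicorp/terraform-plugin-go/tfprotov6"
	"github.com/hashicorp/terraform-plugin-mux/tf5to6server"
	"github.com/hashicorp/terraform-plugin-mux/tf6muxserver"
)

// muxedServer combines both implementations into one protocol v6 server.
// sdkv2Provider() and pluginFWProvider() stand in for the repo's constructors.
func muxedServer(ctx context.Context) (tfprotov6.ProviderServer, error) {
	// The SDKv2 provider speaks protocol v5, so upgrade it to v6 first.
	upgraded, err := tf5to6server.UpgradeServer(ctx, sdkv2Provider().GRPCProvider)
	if err != nil {
		return nil, err
	}
	mux, err := tf6muxserver.NewMuxServer(ctx,
		func() tfprotov6.ProviderServer { return upgraded },
		providerserver.NewProtocol6(pluginFWProvider()),
	)
	if err != nil {
		return nil, err
	}
	return mux.ProviderServer(), nil
}
```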

### Key Architecture Components

#### Provider Structure (`internal/providers/`)
- `providers.go` - Main provider muxing logic combining SDKv2 and Plugin Framework
- `sdkv2/` - SDKv2-specific provider implementation
- `pluginfw/` - Plugin Framework provider implementation with auto-generated schemas
- `common/` - Shared utilities between both providers

#### Service Models (`internal/service/`)
Auto-generated Go structs from the Databricks SDK (illustrated below):
- `*_tf/model.go` - Current Plugin Framework compatible structs
- `*_tf/legacy_model.go` - SDKv2 compatible structs with the `_SdkV2` suffix
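
A hypothetical illustration of the two generated shapes (field names invented for the example):

```go
import "github.com/hashicorp/terraform-plugin-framework/types"

// internal/service/catalog_tf/model.go - Plugin Framework native struct.
type CredentialInfo struct {
	Name     types.String `tfsdk:"name"`
	ReadOnly types.Bool   `tfsdk:"read_only"`
}

// internal/service/catalog_tf/legacy_model.go - SDKv2-compatible twin,
// distinguished by the _SdkV2 suffix.
type CredentialInfo_SdkV2 struct {
	Name     types.String `tfsdk:"name"`
	ReadOnly types.Bool   `tfsdk:"read_only"`
}
```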

#### Resource Organization
- **Root directories** (e.g., `catalog/`, `jobs/`, `clusters/`): SDKv2 resources and data sources
- **`internal/providers/pluginfw/products/`**: Plugin Framework resources organized by service

### Migration Pattern
Resources are being migrated from SDKv2 to the Plugin Framework. When migrating, follow these steps (see the sketch after this list):
1. Use the `_SdkV2` suffixed structs from `internal/service/` for schema compatibility
2. Call `cs.ConfigureAsSdkV2Compatible()` in the schema definition
3. Verify there are no breaking schema changes with `make diff-schema`
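
A minimal sketch of steps 1 and 2, assuming the repo's `tfschema` helpers (struct and resource names are invented; exact helper signatures may differ):

```go
func (r *credentialResource) Schema(ctx context.Context, req resource.SchemaRequest, resp *resource.SchemaResponse) {
	resp.Schema = tfschema.ResourceStructToSchema(ctx, catalog_tf.CredentialInfo_SdkV2{},
		func(cs tfschema.CustomizableSchema) tfschema.CustomizableSchema {
			// Preserve SDKv2 semantics so existing state keeps working.
			cs.ConfigureAsSdkV2Compatible()
			return cs
		})
}
```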

### Resource Development Patterns

#### Adding SDKv2 Resources
1. Create the resource file in the appropriate root directory (e.g., `catalog/resource_new_thing.go`)
2. Use the `common.Resource{}` helper with an auto-generated schema from struct tags (see the sketch below)
3. Register it with the provider in `internal/providers/sdkv2/sdkv2.go`
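
A sketch of what such a resource might look like (the `NewThingInfo` struct and resource behavior are hypothetical):

```go
package catalog

import (
	"context"

	"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema"

	"github.com/databricks/terraform-provider-databricks/common"
)

// NewThingInfo drives the auto-generated schema via struct tags.
type NewThingInfo struct {
	Name    string `json:"name"`
	Comment string `json:"comment,omitempty"`
}

func ResourceNewThing() common.Resource {
	s := common.StructToSchema(NewThingInfo{},
		func(m map[string]*schema.Schema) map[string]*schema.Schema { return m })
	return common.Resource{
		Schema: s,
		Create: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
			// Call the Databricks API through c here, then set the ID.
			d.SetId(d.Get("name").(string))
			return nil
		},
		Read: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
			// Fetch remote state and write it back into d.
			return nil
		},
	}
}
```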

#### Adding Plugin Framework Resources
1. Create it in `internal/providers/pluginfw/products/{service}/`
2. Use `ResourceStructToSchema()` with structs from `internal/service/{service}_tf/` (see the skeleton below)
3. Implement the required interfaces (`ResourceWithConfigure`, etc.)
4. Register it in `internal/providers/pluginfw/pluginfw.go`
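
A skeleton sketch (resource, package, and type names invented; `tfschema` helpers assumed as above):

```go
type newThingResource struct {
	client *common.DatabricksClient
}

func (r *newThingResource) Metadata(ctx context.Context, req resource.MetadataRequest, resp *resource.MetadataResponse) {
	resp.TypeName = "databricks_new_thing"
}

func (r *newThingResource) Schema(ctx context.Context, req resource.SchemaRequest, resp *resource.SchemaResponse) {
	resp.Schema = tfschema.ResourceStructToSchema(ctx, newthing_tf.NewThingInfo{}, nil)
}

func (r *newThingResource) Configure(ctx context.Context, req resource.ConfigureRequest, resp *resource.ConfigureResponse) {
	// Receive the shared *common.DatabricksClient from the provider here.
}

// Create, Read, Update, and Delete complete the resource.Resource interface.
```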

### Client Architecture
- `common.DatabricksClient` - Core client wrapper
- Access the workspace client via `client.GetWorkspaceClient()` (see the example below)
- Access the account client via `client.GetAccountClient()`
- The client automatically handles authentication and retries
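
For example, inside a resource callback (the `cluster_name` attribute and function name are illustrative):

```go
func readCluster(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
	w, err := c.GetWorkspaceClient() // account-level code would use c.GetAccountClient()
	if err != nil {
		return err
	}
	cluster, err := w.Clusters.GetByClusterId(ctx, d.Id())
	if err != nil {
		return err
	}
	return d.Set("cluster_name", cluster.ClusterName)
}
```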

### Testing Structure
- Unit tests: `*_test.go` files using `qa.ResourceFixture` for HTTP mocking
- Integration tests: `*_acc_test.go` files that test against live APIs
- Test naming conventions determine the environment:
  - `TestAcc*` - Workspace-level tests across all clouds
  - `TestMwsAcc*` - Account-level tests across all clouds
  - `TestUcAcc*` - Unity Catalog tests across all clouds

### Dual-Provider Resource Patterns

#### Import Handling for Account/Workspace Resources
Some Unity Catalog resources (e.g., storage credentials, external locations) work with both account-level and workspace-level providers but require different ID formats for imports:

**ID Parsing Pattern**: Implement a `parse{ResourceName}Id()` function that (see the sketch below):
1. Splits composite IDs on the "|" delimiter (format: `metastore_id|resource_name`)
2. For account-level providers: extracts the metastore_id, sets it in state, and updates the resource ID to the simple name
3. For workspace-level providers: uses the ID as-is (the simple name)
4. Returns the parsed components and any validation errors
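
A sketch of such a function for storage credentials (function name, signature, and error text are illustrative):

```go
func parseStorageCredentialId(d *schema.ResourceData, isAccountLevel bool) (metastoreId, name string, err error) {
	parts := strings.Split(d.Id(), "|")
	switch {
	case isAccountLevel && len(parts) == 2:
		// Account-level import: "metastore_id|resource_name".
		if err := d.Set("metastore_id", parts[0]); err != nil {
			return "", "", err
		}
		d.SetId(parts[1]) // the resource ID becomes the simple name
		return parts[0], parts[1], nil
	case !isAccountLevel && len(parts) == 1:
		// Workspace-level import: the ID is already the simple name.
		return "", parts[0], nil
	default:
		return "", "", fmt.Errorf("invalid ID: %q", d.Id())
	}
}
```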

**CRUD Method Consistency**: All CRUD methods (Create, Read, Update, Delete) must use the same ID parsing logic to ensure consistent behavior across operations.

**Testing Import Functionality**: Use `qa.ResourceFixture` with (see the sketch below):
- `Read: true` to test import behavior
- Tests for both the composite ID format and the simple name format (e.g., `"metastore123|my-credential"` and `"my-credential"`)
- Error-condition tests with exact error message matching
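
A sketch of such a test (fixture field names follow common `qa.ResourceFixture` usage but may differ in detail; the API path and resource constructor are illustrative):

```go
func TestStorageCredentialReadImport(t *testing.T) {
	d, err := qa.ResourceFixture{
		Fixtures: []qa.HTTPFixture{
			{
				Method:   "GET",
				Resource: "/api/2.1/unity-catalog/storage-credentials/my-credential",
				Response: catalog.StorageCredentialInfo{Name: "my-credential"},
			},
		},
		Resource: ResourceStorageCredential(),
		Read:     true,
		New:      true,
		ID:       "metastore123|my-credential",
	}.Apply(t)
	assert.NoError(t, err)
	// After import, the resource ID should be reduced to the simple name.
	assert.Equal(t, "my-credential", d.Id())
}
```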

**Documentation Pattern**: When documenting import formats, specify "when using a workspace-level/account-level provider" to clarify that this is about provider configuration, not resource classification.

## Development Guidelines

### Code Organization
- Files should not exceed 600 lines
- Functions should fit on a 13" screen (max 40 lines, except tests)
- No unnecessary package exports (avoid public structs/types unless they are needed outside the package)
- Use `qa.EnvironmentTemplate()` instead of a complex `fmt.Sprintf` with more than 4 placeholders (see the sketch below)
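
A sketch of its use (the `{env.VAR}` substitution syntax is an assumption about how the helper works; check the `qa` package for the exact contract):

```go
// Builds HCL with values pulled from the environment, avoiding a
// hard-to-read fmt.Sprintf with many positional placeholders.
hcl := qa.EnvironmentTemplate(t, `
	resource "databricks_group" "this" {
		display_name = "sandbox-{env.TEST_PREFIX}"
	}`)
```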

### Import Conventions
Order imports as: Go standard library, vendor packages, current provider packages.
Within each section, maintain alphabetical order.
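
For example, a typical file header (the import paths are real packages used by this provider):

```go
import (
	// Go standard library
	"context"
	"fmt"

	// vendor packages
	"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema"

	// current provider packages
	"github.com/databricks/terraform-provider-databricks/common"
)
```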

### Documentation
- All resources and data sources require Terraform Registry compatible documentation in `docs/`
- Code samples must be formatted with `make fmt-docs`
- Cross-link integrity between markdown files is required
- Use the Terraform Registry Doc Preview Tool for validation

### Changelog Requirements
All user-facing changes must be documented in `NEXT_CHANGELOG.md` with the format:
```
* <Summary of change> ([#<PR number>](<PR link>)).

  <Optional additional information>
```

### Migration Verification
When migrating resources to the Plugin Framework, always run `make diff-schema` to ensure there are no breaking changes to the Terraform schema.

- Always run `make fmt` before making any commit.