1 change: 1 addition & 0 deletions NEXT_CHANGELOG.md
@@ -21,5 +21,6 @@
* Fix typo in the name of environment variable ([#5158](https://github.com/databricks/terraform-provider-databricks/pull/5158)).
* Export permission assignments on workspace level ([#5169](https://github.com/databricks/terraform-provider-databricks/pull/5169)).
* Added support for Databricks Apps resources ([#5208](https://github.com/databricks/terraform-provider-databricks/pull/5208)).
* Added support for the Database Instance resource (aka Lakebase) ([#5212](https://github.com/databricks/terraform-provider-databricks/pull/5212)).

### Internal Changes
2 changes: 2 additions & 0 deletions docs/guides/experimental-exporter.md
@@ -184,6 +184,7 @@ Services could be specified in combination with predefined aliases (`all` - for
* `groups` - **listing** [databricks_group](../data-sources/group.md) with [membership](../resources/group_member.md) and [data access](../resources/group_instance_profile.md). If Identity Federation is enabled on the workspace (when UC Metastore is attached), then account-level groups are exposed as data sources because they are defined on account level, and only workspace-level groups are exposed as resources. See the note above on how to perform migration between workspaces with Identity Federation enabled.
* `idfed` - **listing** [databricks_mws_permission_assignment](../resources/mws_permission_assignment.md) (account-level) and [databricks_permission_assignment](../resources/permission_assignment.md) (workspace-level). When listing is done on account level, you can filter assignment only to specific workspace IDs as specified by `-match`, `-matchRegex`, and `-excludeRegex` options. I.e., to export assignments only for two workspaces, use `-matchRegex '^1688808130562317|5493220389262917$'`.
* `jobs` - **listing** [databricks_job](../resources/job.md). Usually, there are more automated workflows than interactive clusters, so they get their own file in this tool's output. *Please note that workflows deployed and maintained via [Databricks Asset Bundles](https://docs.databricks.com/en/dev-tools/bundles/index.html) aren't exported!*
* `lakebase` - **listing** [databricks_database_instance](../resources/database_instance.md).
* `mlflow-webhooks` - **listing** [databricks_mlflow_webhook](../resources/mlflow_webhook.md).
* `model-serving` - **listing** [databricks_model_serving](../resources/model_serving.md).
* `mounts` - **listing** works only in combination with `-mounts` command-line option.
@@ -252,6 +253,7 @@ Exporter aims to generate HCL code for most of the resources within the Databric
| [databricks_connection](../resources/connection.md) | Yes | Yes | Yes | No |
| [databricks_credential](../resources/credential.md) | Yes | Yes | Yes | No |
| [databricks_dashboard](../resources/dashboard.md) | Yes | No | Yes | No |
| [databricks_database_instance](../resources/database_instance.md) | Yes | No | Yes | No |
| [databricks_data_quality_monitor](../resources/data_quality_monitor.md) | Yes | Yes | Yes | No |
| [databricks_dbfs_file](../resources/dbfs_file.md) | Yes | No | Yes | No |
| [databricks_external_location](../resources/external_location.md) | Yes | Yes | Yes | No |
40 changes: 40 additions & 0 deletions exporter/AGENTS.md
@@ -84,3 +84,43 @@ unifiedDataToHcl()
**Key Differences**:
- SDKv2 generates nested structures as **blocks**: `evaluation { ... }`
- Plugin Framework generates nested structures as **attributes**: `evaluation = { ... }`

## Helper Functions for Field Omission Logic

### `shouldOmitWithEffectiveFields`

A reusable helper function (defined in `exporter/util.go`) for resources that have input-only fields with corresponding `effective_*` fields. This pattern is common in resources where the API does not return the input fields themselves but instead returns `effective_*` versions of them with the actual values (e.g., `effective_node_count` for `node_count`).
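
To make the field shape concrete, a hypothetical pair of such fields might look like this (the names and tags are illustrative, not the actual generated struct):

```go
// Illustrative only: field names and struct tags are assumptions sketching
// the pattern, not the real generated DatabaseInstance type.
type databaseInstanceSketch struct {
	// Input-only: accepted on create/update but not echoed back by the API.
	NodeCount int `json:"node_count,omitempty"`
	// Returned by the API with the value actually in effect.
	EffectiveNodeCount int `json:"effective_node_count,omitempty"`
}
```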

**When to Use**:
- Your resource has input-only fields that are not returned by the API
- The API returns corresponding `effective_*` fields with the actual values
- You want to generate HCL with non-zero values from the `effective_*` fields

**Usage**:
```go
"databricks_database_instance": {
// ... other fields ...
ShouldOmitFieldUnified: shouldOmitWithEffectiveFields,
},
```

**How it Works**:
1. Checks if the field has a corresponding `effective_*` field in the schema
2. If found, applies smart filtering:
- Always includes required fields (even if zero value)
- Omits fields with zero values (`false`, `0`, `""`, etc.)
- Omits fields that match their default value
- Includes fields with non-zero values
3. Uses `reflect.ValueOf(v).IsZero()` for proper zero-value detection (important because `wrapper.GetOk()` returns `nonZero=true` even for `false` booleans)
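
A minimal sketch of this decision logic, assuming a deliberately simplified signature (the real helper in `exporter/util.go` works against the exporter's schema and data wrappers, not plain values):

```go
package exporter

import "reflect"

// shouldOmitSketch is a hypothetical, simplified stand-in for
// shouldOmitWithEffectiveFields, showing only the ordering of the checks.
func shouldOmitSketch(value, defaultValue any, required, hasEffectiveCounterpart bool) bool {
	if !hasEffectiveCounterpart {
		return false // no effective_* twin: fall back to the generic omission rules
	}
	if required {
		return false // required fields are always generated, even when zero
	}
	rv := reflect.ValueOf(value)
	if !rv.IsValid() || rv.IsZero() {
		return true // false, 0, "" etc. carry no information
	}
	if defaultValue != nil && reflect.DeepEqual(value, defaultValue) {
		return true // values equal to the schema default are redundant
	}
	return false // non-zero, non-default values are kept
}
```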

**Prerequisites**:
Your resource's `Import` function must call `copyEffectiveFieldsToInputFieldsWithConverters[TfType](ic, r, GoSdkType{})` to copy values from `effective_*` fields to their input counterparts. See `exporter/impl_lakebase.go` for an example.
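
A minimal sketch of such an import hook, mirroring the doc-comment example in `exporter/abstractions.go` (the actual implementation in `exporter/impl_lakebase.go` may differ):

```go
func importDatabaseInstance(ic *importContext, r *resource) error {
	copyEffectiveFieldsToInputFieldsWithConverters[database_instance_resource.DatabaseInstance](
		ic, r, database.DatabaseInstance{})
	return nil
}
```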

**Example**:
For a resource with `node_count` (input-only) and `effective_node_count` (API-returned):
- API returns: `{"effective_node_count": 2, "effective_enable_readable_secondaries": false}`
- Import function copies: `node_count = 2`, `enable_readable_secondaries = false`
- Generated HCL includes: `node_count = 2` (non-zero)
- Generated HCL omits: `enable_readable_secondaries = false` (zero value)

For more details, see `exporter/EFFECTIVE_FIELDS_PATTERN.md`.
123 changes: 123 additions & 0 deletions exporter/abstractions.go
@@ -3,10 +3,12 @@ package exporter
import (
"context"
"fmt"
"log"
"reflect"
"strconv"
"strings"

"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/converters"
"github.com/hashicorp/terraform-plugin-framework/attr"
"github.com/hashicorp/terraform-plugin-framework/path"
frameworkschema "github.com/hashicorp/terraform-plugin-framework/resource/schema"
@@ -717,3 +719,124 @@ func convertGoToPluginFrameworkType(value interface{}) attr.Value {
return types.StringValue(fmt.Sprintf("%v", value))
}
}

// copyEffectiveFieldsToInputFieldsWithConverters automatically copies values from effective_* fields
// to their corresponding input fields (e.g., effective_node_count -> node_count).
// This is useful for Plugin Framework resources where the API returns effective_* fields but doesn't
// return the input fields that were originally set.
//
// NOTE: This function only works with Plugin Framework resources. The effective_* field pattern
// is not used by SDKv2 resources.
//
// This function works by converting the TF state to a Go SDK struct, copying fields
// using reflection, and then converting back to TF state. This approach:
// - Handles complex types (lists, maps, nested objects) automatically via converters
// - Leverages existing converter infrastructure for type safety
// - Works for all field types including custom_tags (lists of objects)
//
// Type parameters:
// - TTF: The Terraform Plugin Framework struct type
// - TGo: The Go SDK struct type
//
// Example usage in an import function:
//
// func importDatabaseInstance(ic *importContext, r *resource) error {
// copyEffectiveFieldsToInputFieldsWithConverters[database_instance_resource.DatabaseInstance](
// ic, r, database.DatabaseInstance{})
// return nil
// }
func copyEffectiveFieldsToInputFieldsWithConverters[TTF any, TGo any](
ic *importContext,
r *resource,
_ TGo,
) {
if r.DataWrapper == nil {
return
}

wrapper := r.DataWrapper
ctx := ic.Context

// Effective fields pattern is only applicable to Plugin Framework resources
if !wrapper.IsPluginFramework() {
log.Printf("[DEBUG] copyEffectiveFieldsToInputFieldsWithConverters called on non-Plugin Framework resource %s, skipping", r.ID)
return
}

// Step 1: Convert TF state to Go SDK struct
var goSdkStruct TGo
var tfStruct TTF
if err := wrapper.GetTypedStruct(ctx, &tfStruct); err != nil {
log.Printf("[WARN] Failed to extract TF struct for %s: %v", r.ID, err)
return
}

diags := converters.TfSdkToGoSdkStruct(ctx, tfStruct, &goSdkStruct)
if diags.HasError() {
log.Printf("[WARN] Failed to convert TF to Go SDK struct for %s: %v", r.ID, diags)
return
}

// Step 2: Copy effective_* fields to their input counterparts using reflection
goSdkValue := reflect.ValueOf(&goSdkStruct).Elem()
goSdkType := goSdkValue.Type()

copiedFields := []string{}
for i := 0; i < goSdkValue.NumField(); i++ {
field := goSdkType.Field(i)
fieldName := field.Name

// Check if this is an effective_* field
if !strings.HasPrefix(fieldName, "Effective") {
continue
}

// Derive the input field name (e.g., "EffectiveNodeCount" -> "NodeCount")
inputFieldName := strings.TrimPrefix(fieldName, "Effective")

// Check if the corresponding input field exists
inputField := goSdkValue.FieldByName(inputFieldName)
if !inputField.IsValid() || !inputField.CanSet() {
continue
}

// Get the effective field value
effectiveField := goSdkValue.Field(i)
if !effectiveField.IsValid() {
continue
}

// Check if types match
if effectiveField.Type() != inputField.Type() {
log.Printf("[DEBUG] Type mismatch for %s: effective=%v, input=%v", inputFieldName, effectiveField.Type(), inputField.Type())
continue
}

// Copy the value
inputField.Set(effectiveField)
copiedFields = append(copiedFields, fmt.Sprintf("%s->%s", fieldName, inputFieldName))
}

if len(copiedFields) > 0 {
log.Printf("[TRACE] Copied effective fields for %s: %s", r.ID, strings.Join(copiedFields, ", "))
}

// Step 3: Convert back to TF state
var tfStruct2 TTF
diags = converters.GoSdkToTfSdkStruct(ctx, goSdkStruct, &tfStruct2)
if diags.HasError() {
log.Printf("[WARN] Failed to convert Go SDK to TF struct for %s: %v", r.ID, diags)
return
}

// Step 4: Write back to the state using Set method on Plugin Framework state
// Access the underlying state from PluginFrameworkResourceData
if pfWrapper, ok := wrapper.(*PluginFrameworkResourceData); ok {
diags := pfWrapper.state.Set(ctx, &tfStruct2)
if diags.HasError() {
log.Printf("[WARN] Failed to write TF struct back to state for %s: %v", r.ID, diags)
}
} else {
log.Printf("[WARN] Unable to write TF struct back to state: wrapper is not PluginFrameworkResourceData for %s", r.ID)
}
}
1 change: 1 addition & 0 deletions exporter/abstractions_test.go
@@ -0,0 +1 @@
package exporter
8 changes: 8 additions & 0 deletions exporter/codegen.go
@@ -399,6 +399,14 @@ func (ic *importContext) extractFieldsForGeneration(imp importable, path []strin
shouldSkip = false
}

// For Plugin Framework, also check for zero values in primitives
if !shouldSkip && wrapper.IsPluginFramework() && nonZero && fieldSchema.IsOptional() {
rv := reflect.ValueOf(raw)
if rv.IsValid() && rv.IsZero() {
shouldSkip = true
}
}

// Check if ShouldGenerateField forces generation
if shouldSkip {
forceGenerate := false