
Commit e72c61c

[Internal] Use SDKv2 Implementation for databricks_share resource as default (#4931)
## Changes

This PR:
- Reverts #4846.
- Adds a note under breaking changes to highlight the issue.
- Moves the next release to a patch release.

**Context**

The databricks_share resource exists in two implementations: SDKv2 and plugin framework. In #4846 the plugin framework implementation (`databricks_share_pluginframework`) was made the default (`databricks_share`), but this surfaced an underlying issue in the plugin framework implementation that leads to a panic: #4913

```
Stack trace from the terraform-provider-databricks_v1.86.0 plugin:

panic: runtime error: index out of range [0] with length 0

goroutine 68 [running]:
github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/products/sharing.(*ShareResource).syncEffectiveFields(_, {_, _}, {{{0x2, {0xc0007f6b40, 0x22}}, {0x2, 0x196f67034f0}, {0x2, {0xc0007f6b70, ...}}, ...}}, ...)
	/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/internal/providers/pluginfw/products/sharing/resource_share.go:449 +0x48e
github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/products/sharing.(*ShareResource).Read(0xc000527240, {0x1c50398?, 0xc000a31890?}, {{{{0x1c5b670, 0xc000a4c150}, {0x16fd920, 0xc000a41e30}}, {0x1c61390, 0xc0009b1ef0}}, 0x0, ...}, ...)
	/home/runner/work/terraform-provider-databricks/terraform-provider-databricks/internal/providers/pluginfw/products/sharing/resource_share.go:281 +0x765
```

This leaves the default share resource on an unstable implementation. A panic can also leave Terraform with bad state, since the apply is interrupted partway through.

This change will be followed by a patch release:
- No changes for users on version 1.85.0 and below who upgrade to the patch release that includes this fix.
- Limits the number of impacted users by moving back to SDKv2. Users hitting the panic appear to have been able to mitigate it by pinning an older provider version that uses SDKv2. However, I am seeing a diff that wants to recreate the share: SDKv2 does not seem to be able to consume state created by the plugin framework implementation. This needs further investigation. To limit the impact of the issue, we are rolling back to the previous stable version.

## Tests

Same tests as when SDKv2 was the default.
1 parent 5615eba commit e72c61c
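
For context on the panic itself: the stack trace points at `syncEffectiveFields` indexing into a slice during `Read`, and "index out of range [0] with length 0" is the classic symptom of reading element 0 of an empty slice. The snippet below is a minimal, hypothetical illustration of that failure mode and of the kind of bounds guard that avoids it; the types and function names are invented for the example and are not the provider's actual code or its eventual fix.

```go
package main

import "fmt"

// sharedObject is a stand-in for the plugin framework's share object model.
type sharedObject struct {
	Name    string
	Comment string
}

// syncEffective mirrors the shape of the failing pattern: copying values
// between a planned slice and a remote slice by index. If the two slices can
// differ in length (for example, an empty plan during Read), indexing
// plan[i] without a bounds check panics with
// "index out of range [0] with length 0".
func syncEffective(plan, remote []sharedObject) []sharedObject {
	out := make([]sharedObject, len(remote))
	for i, r := range remote {
		out[i] = r
		// Guard: only read plan[i] when that index actually exists.
		if i < len(plan) && plan[i].Comment != "" {
			out[i].Comment = plan[i].Comment
		}
	}
	return out
}

func main() {
	remote := []sharedObject{{Name: "main.default.mytable"}}
	var plan []sharedObject // empty, as in the panicking Read path
	fmt.Println(syncEffective(plan, remote))
}
```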

File tree

9 files changed · +23 −469 lines changed


NEXT_CHANGELOG.md

Lines changed: 3 additions & 1 deletion

@@ -1,8 +1,9 @@
 # NEXT CHANGELOG
 
-## Release v1.88.0
+## Release v1.87.1
 
 ### Breaking Changes
+Terraform Databricks provider version 1.86.0 introduced changes to the databricks_share resource that caused a panic for some users when applying Terraform. This release rolls back to the SDKv2 implementation. No change is expected for users upgrading to the latest version from releases before 1.86.0. Users on 1.86.0 who are not facing issues are advised not to upgrade to this patch release. We are investigating the issue and will release a fix soon.
 
 ### New Features and Improvements
 
@@ -13,3 +14,4 @@
 ### Exporter
 
 ### Internal Changes
+* Use SDKv2 Implementation for databricks_share resource as default ([#4931](https://github.com/databricks/terraform-provider-databricks/pull/4931))

exporter/importables.go

Lines changed: 2 additions & 35 deletions

@@ -1992,42 +1992,8 @@ var resourcesMap map[string]importable = map[string]importable{
 			return nil
 		},
 		Import: func(ic *importContext, r *resource) error {
-			resourceInfo := ic.Resources["databricks_share"]
-			if resourceInfo == nil {
-				// Fallback to direct data access if schema is not available
-				objectsList := r.Data.Get("object").([]any)
-				ic.emitUCGrantsWithOwner("share/"+r.ID, r)
-				for _, objRaw := range objectsList {
-					obj := objRaw.(map[string]any)
-					dataObjectType := obj["data_object_type"].(string)
-					name := obj["name"].(string)
-
-					switch dataObjectType {
-					case "TABLE":
-						ic.Emit(&resource{
-							Resource: "databricks_sql_table",
-							ID:       name,
-						})
-					case "VOLUME":
-						ic.Emit(&resource{
-							Resource: "databricks_volume",
-							ID:       name,
-						})
-					case "MODEL":
-						ic.Emit(&resource{
-							Resource: "databricks_registered_model",
-							ID:       name,
-						})
-					default:
-						log.Printf("[INFO] Object type '%s' (name: '%s') isn't supported in share '%s'",
-							dataObjectType, name, r.ID)
-					}
-				}
-				return nil
-			}
-
 			var share tf_sharing.ShareInfo
-			s := resourceInfo.Schema
+			s := ic.Resources["databricks_share"].Schema
 			common.DataToStructPointer(r.Data, s, &share)
 			// TODO: how to link recipients to share?
 			ic.emitUCGrantsWithOwner("share/"+r.ID, r)
@@ -2053,6 +2019,7 @@ var resourcesMap map[string]importable = map[string]importable{
 					obj.DataObjectType, obj.Name, r.ID)
 				}
 			}
+
 			return nil
 		},
 		ShouldOmitField: shouldOmitForUnityCatalog,

internal/providers/pluginfw/pluginfw_rollout_utils.go

Lines changed: 3 additions & 3 deletions

@@ -33,14 +33,11 @@ import (
 var migratedResources = []func() resource.Resource{
 	library.ResourceLibrary,
 	qualitymonitor.ResourceQualityMonitor,
-	sharing.ResourceShare,
 }
 
 // List of data sources that have been migrated from SDK V2 to plugin framework
 // Keep this list sorted.
 var migratedDataSources = []func() datasource.DataSource{
-	sharing.DataSourceShare,
-	sharing.DataSourceShares,
 	volume.DataSourceVolumes,
 }
 
@@ -49,6 +46,7 @@ var migratedDataSources = []func() datasource.DataSource{
 var pluginFwOnlyResources = append(
 	[]func() resource.Resource{
 		app.ResourceApp,
+		sharing.ResourceShare,
 	},
 	autoGeneratedResources...,
 )
@@ -67,6 +65,8 @@ var pluginFwOnlyDataSources = append(
 		serving.DataSourceServingEndpoints,
 		// TODO: Add DataSourceCluster into migratedDataSources after fixing unit tests.
 		cluster.DataSourceCluster, // Using the staging name (with pluginframework suffix)
+		sharing.DataSourceShare,   // Using the staging name (with pluginframework suffix)
+		sharing.DataSourceShares,  // Using the staging name (with pluginframework suffix)
 	},
 	autoGeneratedDataSources...,
 )
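
To make the mechanics of this file easier to follow: entries in `migratedResources` are served by the plugin framework in place of their SDKv2 counterparts under the production type name, while moving `sharing.ResourceShare` into `pluginFwOnlyResources` (together with the staging-name change in the files below) hands the production `databricks_share` name back to the SDKv2 implementation and keeps the plugin framework version reachable only under its staging alias. The sketch below models only that outcome; it is not the provider's registration code, and the function and map are invented for illustration.

```go
package main

import "fmt"

// registrations models which implementation answers for which resource type
// name, before and after this commit. This is an illustrative simplification,
// not how pluginfw_rollout_utils.go actually wires the provider servers.
func registrations(shareMigrated bool) map[string]string {
	reg := map[string]string{}
	if shareMigrated {
		// v1.86.0 behavior: the plugin framework owned the production name
		// (staging alias omitted here).
		reg["databricks_share"] = "plugin framework"
	} else {
		// After this commit: SDKv2 owns the production name again, and the
		// plugin framework implementation stays behind the staging alias.
		reg["databricks_share"] = "SDKv2"
		reg["databricks_share_pluginframework"] = "plugin framework"
	}
	return reg
}

func main() {
	fmt.Println("before this commit:", registrations(true))
	fmt.Println("after this commit: ", registrations(false))
}
```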

internal/providers/pluginfw/products/sharing/data_share.go

Lines changed: 1 addition & 1 deletion

@@ -28,7 +28,7 @@ type ShareDataSource struct {
 }
 
 func (d *ShareDataSource) Metadata(ctx context.Context, req datasource.MetadataRequest, resp *datasource.MetadataResponse) {
-	resp.TypeName = pluginfwcommon.GetDatabricksProductionName(dataSourceNameShare)
+	resp.TypeName = pluginfwcommon.GetDatabricksStagingName(dataSourceNameShare)
 }
 
 func (d *ShareDataSource) Schema(ctx context.Context, req datasource.SchemaRequest, resp *datasource.SchemaResponse) {
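
The only change in this file swaps `GetDatabricksProductionName` for `GetDatabricksStagingName` when registering the data source's type name. Going by the "staging name (with pluginframework suffix)" comments in the rollout file above, the two helpers plausibly differ only in that suffix. The stand-ins below are an assumption made for illustration; see `pluginfwcommon` for the real definitions.

```go
package main

import "fmt"

// Hypothetical stand-ins for pluginfwcommon.GetDatabricksProductionName and
// pluginfwcommon.GetDatabricksStagingName. Assumed behavior for illustration:
// the production name prefixes the bare name with "databricks_", and the
// staging name additionally appends "_pluginframework".
func productionName(name string) string { return "databricks_" + name }
func stagingName(name string) string    { return productionName(name) + "_pluginframework" }

func main() {
	const dataSourceNameShare = "share"              // assumed value of the real constant
	fmt.Println(productionName(dataSourceNameShare)) // databricks_share
	fmt.Println(stagingName(dataSourceNameShare))    // databricks_share_pluginframework
}
```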

internal/providers/pluginfw/products/sharing/data_shares.go

Lines changed: 1 addition & 1 deletion

@@ -52,7 +52,7 @@ type SharesDataSource struct {
 }
 
 func (d *SharesDataSource) Metadata(ctx context.Context, req datasource.MetadataRequest, resp *datasource.MetadataResponse) {
-	resp.TypeName = pluginfwcommon.GetDatabricksProductionName(dataSourceNameShares)
+	resp.TypeName = pluginfwcommon.GetDatabricksStagingName(dataSourceNameShares)
 }
 
 func (d *SharesDataSource) Schema(ctx context.Context, req datasource.SchemaRequest, resp *datasource.SchemaResponse) {

internal/providers/pluginfw/products/sharing/data_shares_acc_test.go

Lines changed: 5 additions & 5 deletions

@@ -12,8 +12,8 @@ import (
 
 func checkSharesDataSourcePopulated(t *testing.T) func(s *terraform.State) error {
 	return func(s *terraform.State) error {
-		ds, ok := s.Modules[0].Resources["data.databricks_shares.this"]
-		require.True(t, ok, "data.databricks_shares.this has to be there")
+		ds, ok := s.Modules[0].Resources["data.databricks_shares_pluginframework.this"]
+		require.True(t, ok, "data.databricks_shares_pluginframework.this has to be there")
 		num_shares, _ := strconv.Atoi(ds.Primary.Attributes["shares.#"])
 		assert.GreaterOrEqual(t, num_shares, 1)
 		return nil
@@ -71,7 +71,7 @@ func TestUcAccDataSourceShares(t *testing.T) {
 			}
 		}
 
-		resource "databricks_share" "myshare" {
+		resource "databricks_share_pluginframework" "myshare" {
 			name = "{var.RANDOM}-terraform-delta-share"
 			object {
 				name = databricks_table.mytable.id
@@ -86,8 +86,8 @@ func TestUcAccDataSourceShares(t *testing.T) {
 			}
 		}
 
-		data "databricks_shares" "this" {
-			depends_on = [databricks_share.myshare]
+		data "databricks_shares_pluginframework" "this" {
+			depends_on = [databricks_share_pluginframework.myshare]
 		}
 		`,
 		Check: checkSharesDataSourcePopulated(t),

internal/providers/pluginfw/products/sharing/resource_acc_test.go

Lines changed: 7 additions & 149 deletions

@@ -1,13 +1,10 @@
 package sharing_test
 
 import (
-	"context"
 	"fmt"
 	"testing"
 
 	"github.com/databricks/terraform-provider-databricks/internal/acceptance"
-	"github.com/databricks/terraform-provider-databricks/internal/providers"
-	"github.com/hashicorp/terraform-plugin-go/tfprotov6"
 )
 
 const preTestTemplate = `
@@ -94,7 +91,7 @@ const preTestTemplateUpdate = `
 func TestUcAccCreateShare(t *testing.T) {
 	acceptance.UnityWorkspaceLevel(t, acceptance.Step{
 		Template: preTestTemplate + `
-		resource "databricks_share" "myshare" {
+		resource "databricks_share_pluginframework" "myshare" {
 			name = "{var.STICKY_RANDOM}-terraform-delta-share"
 			owner = "account users"
 			object {
@@ -122,7 +119,7 @@ func TestUcAccCreateShare(t *testing.T) {
 		}
 
 		resource "databricks_grants" "some" {
-			share = databricks_share.myshare.name
+			share = databricks_share_pluginframework.myshare.name
 			grant {
 				principal = databricks_recipient.db2open.name
 				privileges = ["SELECT"]
@@ -134,7 +131,7 @@ func TestUcAccCreateShare(t *testing.T) {
 
 func shareTemplateWithOwner(comment string, owner string) string {
 	return fmt.Sprintf(`
-		resource "databricks_share" "myshare" {
+		resource "databricks_share_pluginframework" "myshare" {
 			name = "{var.STICKY_RANDOM}-terraform-delta-share"
 			owner = "%s"
 			object {
@@ -162,7 +159,7 @@ func TestUcAccUpdateShare(t *testing.T) {
 func TestUcAccUpdateShareAddObject(t *testing.T) {
 	acceptance.UnityWorkspaceLevel(t, acceptance.Step{
 		Template: preTestTemplate + preTestTemplateUpdate +
-			`resource "databricks_share" "myshare" {
+			`resource "databricks_share_pluginframework" "myshare" {
				name = "{var.STICKY_RANDOM}-terraform-delta-share"
				owner = "account users"
				object {
@@ -181,7 +178,7 @@ func TestUcAccUpdateShareAddObject(t *testing.T) {
			}`,
 	}, acceptance.Step{
 		Template: preTestTemplate + preTestTemplateUpdate +
-			`resource "databricks_share" "myshare" {
+			`resource "databricks_share_pluginframework" "myshare" {
				name = "{var.STICKY_RANDOM}-terraform-delta-share"
				owner = "account users"
				object {
@@ -209,7 +206,7 @@ func TestUcAccUpdateShareAddObject(t *testing.T) {
 func TestUcAccUpdateShareReorderObject(t *testing.T) {
 	acceptance.UnityWorkspaceLevel(t, acceptance.Step{
 		Template: preTestTemplate + preTestTemplateUpdate +
-			`resource "databricks_share" "myshare" {
+			`resource "databricks_share_pluginframework" "myshare" {
				name = "{var.STICKY_RANDOM}-terraform-delta-share"
				owner = "account users"
				object {
@@ -223,7 +220,7 @@ func TestUcAccUpdateShareReorderObject(t *testing.T) {
			}`,
 	}, acceptance.Step{
 		Template: preTestTemplate + preTestTemplateUpdate +
-			`resource "databricks_share" "myshare" {
+			`resource "databricks_share_pluginframework" "myshare" {
				name = "{var.STICKY_RANDOM}-terraform-delta-share"
				owner = "account users"
				object {
@@ -237,142 +234,3 @@ func TestUcAccUpdateShareReorderObject(t *testing.T) {
			}`,
 	})
 }
-
-// TestUcAccUpdateShareNoChanges tests that updating a share with no actual changes doesn't cause issues
-func TestUcAccUpdateShareNoChanges(t *testing.T) {
-	shareConfig := preTestTemplate + preTestTemplateUpdate +
-		`resource "databricks_share" "myshare" {
-			name = "{var.STICKY_RANDOM}-terraform-delta-share"
-			owner = "account users"
-			object {
-				name = databricks_table.mytable.id
-				comment = "stable comment"
-				data_object_type = "TABLE"
-			}
-		}`
-
-	acceptance.UnityWorkspaceLevel(t, acceptance.Step{
-		Template: shareConfig,
-	}, acceptance.Step{
-		Template: shareConfig, // Same config - should not trigger any updates
-	})
-}
-
-// TestUcAccUpdateShareComplexObjectChanges tests complex scenarios with multiple object updates
-func TestUcAccUpdateShareComplexObjectChanges(t *testing.T) {
-	acceptance.UnityWorkspaceLevel(t, acceptance.Step{
-		Template: preTestTemplate + preTestTemplateUpdate +
-			`resource "databricks_share" "myshare" {
-				name = "{var.STICKY_RANDOM}-terraform-delta-share"
-				owner = "account users"
-				object {
-					name = databricks_table.mytable.id
-					comment = "original comment"
-					data_object_type = "TABLE"
-				}
-				object {
-					name = databricks_table.mytable_2.id
-					comment = "second table"
-					data_object_type = "TABLE"
-				}
-			}`,
-	}, acceptance.Step{
-		// Remove one object, add another, and update comment on existing
-		Template: preTestTemplate + preTestTemplateUpdate +
-			`resource "databricks_share" "myshare" {
-				name = "{var.STICKY_RANDOM}-terraform-delta-share"
-				owner = "account users"
-				object {
-					name = databricks_table.mytable.id
-					comment = "updated comment"
-					data_object_type = "TABLE"
-				}
-				object {
-					name = databricks_table.mytable_3.id
-					comment = "third table"
-					data_object_type = "TABLE"
-				}
-			}`,
-	})
-}
-
-// TestUcAccUpdateShareRemoveAllObjects tests removing all objects from a share
-func TestUcAccUpdateShareRemoveAllObjects(t *testing.T) {
-	acceptance.UnityWorkspaceLevel(t, acceptance.Step{
-		Template: preTestTemplate + preTestTemplateUpdate +
-			`resource "databricks_share" "myshare" {
-				name = "{var.STICKY_RANDOM}-terraform-delta-share"
-				owner = "account users"
-				object {
-					name = databricks_table.mytable.id
-					comment = "to be removed"
-					data_object_type = "TABLE"
-				}
-				object {
-					name = databricks_table.mytable_2.id
-					comment = "also to be removed"
-					data_object_type = "TABLE"
-				}
-			}`,
-	}, acceptance.Step{
-		Template: preTestTemplate + preTestTemplateUpdate +
-			`resource "databricks_share" "myshare" {
-				name = "{var.STICKY_RANDOM}-terraform-delta-share"
-				owner = "account users"
-			}`,
-	})
-}
-
-// TestUcAccShareMigrationFromSDKv2 tests the transition from sdkv2 to plugin framework.
-// This test verifies that existing state created by SDK v2 implementation can be
-// successfully managed by the plugin framework implementation without any changes.
-func TestUcAccShareMigrationFromSDKv2(t *testing.T) {
-	acceptance.UnityWorkspaceLevel(t,
-		// Step 1: Create share using SDK v2 implementation
-		acceptance.Step{
-			ProtoV6ProviderFactories: map[string]func() (tfprotov6.ProviderServer, error){
-				"databricks": func() (tfprotov6.ProviderServer, error) {
-					sdkv2Provider, pluginfwProvider := acceptance.ProvidersWithResourceFallbacks([]string{"databricks_share"})
-					return providers.GetProviderServer(context.Background(), providers.WithSdkV2Provider(sdkv2Provider), providers.WithPluginFrameworkProvider(pluginfwProvider))
-				},
-			},
-			Template: preTestTemplate + preTestTemplateUpdate + `
-			resource "databricks_share" "myshare" {
-				name = "{var.STICKY_RANDOM}-terraform-migration-share"
-				owner = "account users"
-				object {
-					name = databricks_table.mytable.id
-					comment = "Shared table for migration test"
-					data_object_type = "TABLE"
-				}
-				object {
-					name = databricks_table.mytable_2.id
-					comment = "Second shared table"
-					data_object_type = "TABLE"
-					cdf_enabled = false
-				}
-			}`,
-		},
-		// Step 2: Update the share using plugin framework implementation (default)
-		// This verifies no changes are needed when switching implementations
-		acceptance.Step{
-			ExpectNonEmptyPlan: false,
-			Template: preTestTemplate + preTestTemplateUpdate + `
-			resource "databricks_share" "myshare" {
-				name = "{var.STICKY_RANDOM}-terraform-migration-share"
-				owner = "account users"
-				object {
-					name = databricks_table.mytable.id
-					comment = "Updated comment after migration"
-					data_object_type = "TABLE"
-				}
-				object {
-					name = databricks_table.mytable_2.id
-					comment = "Second shared table"
-					data_object_type = "TABLE"
-					cdf_enabled = false
-				}
-			}`,
-		},
-	)
-}

internal/providers/pluginfw/products/sharing/resource_share.go

Lines changed: 1 addition & 1 deletion

@@ -140,7 +140,7 @@ type ShareResource struct {
 }
 
 func (r *ShareResource) Metadata(ctx context.Context, req resource.MetadataRequest, resp *resource.MetadataResponse) {
-	resp.TypeName = pluginfwcommon.GetDatabricksProductionName(resourceName)
+	resp.TypeName = pluginfwcommon.GetDatabricksStagingName(resourceName)
 }
 
 func (r *ShareResource) Schema(ctx context.Context, req resource.SchemaRequest, resp *resource.SchemaResponse) {

0 commit comments
