Commit 54432e7

Added support for running databricks_cluster init scripts from workspace files (#2251)
1 parent eb63228 commit 54432e7

File tree

2 files changed: +22 −6 lines changed

clusters/clusters_api.go

Lines changed: 11 additions & 5 deletions
```diff
@@ -205,6 +205,11 @@ type LocalFileInfo struct {
 	Destination string `json:"destination,omitempty"`
 }
 
+// WorkspaceFileInfo represents a file in the Databricks workspace.
+type WorkspaceFileInfo struct {
+	Destination string `json:"destination,omitempty"`
+}
+
 // StorageInfo contains the struct for either DBFS or S3 storage depending on which one is relevant.
 type StorageInfo struct {
 	Dbfs *DbfsStorageInfo `json:"dbfs,omitempty" tf:"group:storage"`
@@ -213,11 +218,12 @@ type StorageInfo struct {
 
 // InitScriptStorageInfo captures the allowed sources of init scripts.
 type InitScriptStorageInfo struct {
-	Dbfs  *DbfsStorageInfo  `json:"dbfs,omitempty" tf:"group:storage"`
-	Gcs   *GcsStorageInfo   `json:"gcs,omitempty" tf:"group:storage"`
-	S3    *S3StorageInfo    `json:"s3,omitempty" tf:"group:storage"`
-	Abfss *AbfssStorageInfo `json:"abfss,omitempty" tf:"group:storage"`
-	File  *LocalFileInfo    `json:"file,omitempty"`
+	Dbfs      *DbfsStorageInfo   `json:"dbfs,omitempty" tf:"group:storage"`
+	Gcs       *GcsStorageInfo    `json:"gcs,omitempty" tf:"group:storage"`
+	S3        *S3StorageInfo     `json:"s3,omitempty" tf:"group:storage"`
+	Abfss     *AbfssStorageInfo  `json:"abfss,omitempty" tf:"group:storage"`
+	File      *LocalFileInfo     `json:"file,omitempty"`
+	Workspace *WorkspaceFileInfo `json:"workspace,omitempty"`
 }
 
 // SparkNodeAwsAttributes is the struct that determines if the node is a spot instance or not
```

docs/resources/cluster.md

Lines changed: 11 additions & 1 deletion
````diff
@@ -256,7 +256,17 @@ There are a few more advanced attributes for S3 log delivery:
 
 To run a particular init script on all clusters within the same workspace, both automated/job and interactive/all-purpose cluster types, please consider the [databricks_global_init_script](global_init_script.md) resource.
 
-It is possible to specify up to 10 different cluster-scoped init scripts per cluster. Like the `cluster_log_conf` configuration block, init scripts support DBFS and cloud storage locations.
+It is possible to specify up to 10 different cluster-scoped init scripts per cluster. Init scripts support DBFS, cloud storage locations, and workspace files.
+
+Example of using a Databricks workspace file as init script:
+
+```hcl
+init_scripts {
+  workspace {
+    destination = "/Users/user@domain/install-elk.sh"
+  }
+}
+```
 
 Example of taking init script from DBFS:
 
````
