Commit 1a7ddaa

Reformat code examples (#999)
Run of `terrafmt fmt -p '*.md' .`
1 parent 2ce4496 commit 1a7ddaa
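The change itself is mechanical. As an illustrative sketch (not part of the diff), terrafmt's default HCL formatting vertically aligns the `=` of consecutive attribute assignments inside fenced code blocks in Markdown files, which is what produces the whitespace-only changes in the hunks below:

```hcl
# Illustrative only: before formatting, attribute assignments are unaligned.
# resource "aws_s3_bucket" "this" {
#   bucket = "<unique_bucket_name>"
#   acl = "private"
#   force_destroy = true
# }
#
# After `terrafmt fmt -p '*.md' .`, the `=` signs line up on the longest name:
resource "aws_s3_bucket" "this" {
  bucket        = "<unique_bucket_name>"
  acl           = "private"
  force_destroy = true
}
```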

60 files changed: +651 −647 lines

.github/ISSUE_TEMPLATE/provider-issue.md (1 addition, 1 deletion)

@@ -39,4 +39,4 @@ TF_LOG=DEBUG terraform plan 2>&1 | grep databricks | sed -E 's/^.* plugin[^:]+:
 If Terraform produced a panic, please provide a link to a GitHub Gist containing the output of the `crash.log`.
 
 ### Important Factoids
-Are there anything atypical about your accounts that we should know?
+Are there anything atypical about your accounts that we should know?

Makefile (4 additions, 0 deletions)

@@ -6,6 +6,10 @@ fmt:
 	@echo "✓ Formatting source code with gofmt ..."
 	@gofmt -w $(shell find . -type f -name '*.go' -not -path "./vendor/*")
 
+fmt-docs:
+	@echo "✓ Formatting code samples in documentation"
+	@terrafmt fmt -p '*.md' .
+
 lint: vendor
 	@echo "✓ Linting source code with https://staticcheck.io/ ..."
 	@staticcheck ./...

README.md (2 additions, 2 deletions)

@@ -69,7 +69,7 @@ If you use Terraform 0.13 or newer, please refer to instructions specified at [r
 terraform {
   required_providers {
     databricks = {
-      source = "databrickslabs/databricks"
+      source  = "databrickslabs/databricks"
       version = "0.4.1"
     }
   }
@@ -80,7 +80,7 @@ Then create a small sample file, named `main.tf` with approximately following co
 
 ```terraform
 provider "databricks" {
-  host = "https://abc-defg-024.cloud.databricks.com/"
+  host  = "https://abc-defg-024.cloud.databricks.com/"
   token = "<your PAT token>"
 }

docs/data-sources/aws_assume_role_policy.md (1 addition, 1 deletion)

@@ -55,4 +55,4 @@ resource "databricks_mws_credentials" "this" {
 
 In addition to all arguments above, the following attributes are exported:
 
-* `json` - AWS IAM Policy JSON document
+* `json` - AWS IAM Policy JSON document

docs/data-sources/aws_bucket_policy.md (4 additions, 4 deletions)

@@ -9,8 +9,8 @@ This datasource configures a simple access policy for AWS S3 buckets, so that Da
 
 ```hcl
 resource "aws_s3_bucket" "this" {
-  bucket = "<unique_bucket_name>"
-  acl = "private"
+  bucket        = "<unique_bucket_name>"
+  acl           = "private"
   force_destroy = true
 }
 
@@ -19,8 +19,8 @@ data "databricks_aws_bucket_policy" "stuff" {
 }
 
 resource "aws_s3_bucket_policy" "this" {
-  bucket = aws_s3_bucket.this.id
-  policy = data.databricks_aws_bucket_policy.this.json
+  bucket = aws_s3_bucket.this.id
+  policy = data.databricks_aws_bucket_policy.this.json
 }
 ```
 

docs/data-sources/clusters.md (3 additions, 3 deletions)

@@ -13,16 +13,16 @@ Retrieve all clusters on this workspace on AWS or GCP:
 
 ```hcl
 data "databricks_clusters" "all" {
-  depends_on = [databricks_mws_workspaces.this]
+  depends_on = [databricks_mws_workspaces.this]
 }
 ```
 
 Retrieve all clusters with "Shared" in their cluster name on this Azure Databricks workspace:
 
 ```hcl
 data "databricks_clusters" "all_shared" {
-  depends_on = [azurerm_databricks_workspace.this]
-  cluster_name_contains = "shared"
+  depends_on            = [azurerm_databricks_workspace.this]
+  cluster_name_contains = "shared"
 }
 ```
 

docs/data-sources/dbfs_file.md (3 additions, 3 deletions)

@@ -11,8 +11,8 @@ This data source allows to get file content from DBFS
 
 ```hcl
 data "databricks_dbfs_file" "report" {
-  path = "dbfs:/reports/some.csv"
-  limit_file_size = 10240
+  path            = "dbfs:/reports/some.csv"
+  limit_file_size = 10240
 }
 ```
 ## Argument Reference
@@ -25,4 +25,4 @@ data "databricks_dbfs_file" "report" {
 This data source exports the following attributes:
 
 * `content` - base64-encoded file contents
-* `file_size` - size of the file in bytes
+* `file_size` - size of the file in bytes

docs/data-sources/dbfs_file_paths.md (3 additions, 3 deletions)

@@ -11,8 +11,8 @@ This data source allows to get list of file names from DBFS
 
 ```hcl
 data "databricks_dbfs_file_paths" "partitions" {
-  path = "dbfs:/user/hive/default.db/table"
-  recursive = false
+  path      = "dbfs:/user/hive/default.db/table"
+  recursive = false
 }
 ```
 ## Argument Reference
@@ -24,4 +24,4 @@ data "databricks_dbfs_file_paths" "partitions" {
 
 This data source exports the following attributes:
 
-* `path_list` - returns list of objects with `path` and `file_size` attributes in each
+* `path_list` - returns list of objects with `path` and `file_size` attributes in each

docs/data-sources/group.md (3 additions, 3 deletions)

@@ -13,15 +13,15 @@ Adding user to administrative group
 
 ```hcl
 data "databricks_group" "admins" {
-  display_name = "admins"
+  display_name = "admins"
 }
 
 resource "databricks_user" "me" {
-  user_name = "[email protected]"
+  user_name = "[email protected]"
 }
 
 resource "databricks_group_member" "my_member_a" {
-  group_id = data.databricks_group.admins.id
+  group_id  = data.databricks_group.admins.id
   member_id = databricks_user.me.id
 }
 ```

docs/data-sources/node_type.md (13 additions, 13 deletions)

@@ -13,26 +13,26 @@ Gets the smallest node type for [databricks_cluster](../resources/cluster.md) th
 
 ```hcl
 data "databricks_node_type" "with_gpu" {
-  local_disk = true
-  min_cores = 16
-  gb_per_core = 1
-  min_gpus = 1
+  local_disk  = true
+  min_cores   = 16
+  gb_per_core = 1
+  min_gpus    = 1
 }
 
 data "databricks_spark_version" "gpu_ml" {
   gpu = true
-  ml = true
+  ml  = true
 }
 
 resource "databricks_cluster" "research" {
-  cluster_name = "Research Cluster"
-  spark_version = data.databricks_spark_version.gpu_ml.id
-  node_type_id = data.databricks_node_type.with_gpu.id
-  autotermination_minutes = 20
-  autoscale {
-    min_workers = 1
-    max_workers = 50
-  }
+  cluster_name            = "Research Cluster"
+  spark_version           = data.databricks_spark_version.gpu_ml.id
+  node_type_id            = data.databricks_node_type.with_gpu.id
+  autotermination_minutes = 20
+  autoscale {
+    min_workers = 1
+    max_workers = 50
+  }
 }
 ```
 
3838
