Commit 50d9f4a

Release v1.0.0 (#1383)

1 parent f29e655 · commit 50d9f4a

20 files changed: +460 -344 lines

.github/ISSUE_TEMPLATE/provider-issue.md

Lines changed: 1 addition & 1 deletion
@@ -25,7 +25,7 @@ Please list the steps required to reproduce the issue, for example:
 
 ### Terraform and provider versions
 
-Please paste the output of `terraform version`. If version of `databricks` provider is not the latest (https://github.com/databrickslabs/terraform-provider-databricks/releases), please make sure to use the latest one.
+Please paste the output of `terraform version`. If version of `databricks` provider is not the latest (https://github.com/databricks/terraform-provider-databricks/releases), please make sure to use the latest one.
 
 ### Debug Output
 Please add turn on logging, e.g. `TF_LOG=DEBUG terraform apply` and run command again, paste it to gist & provide the link to gist. If you're still willing to paste in log output, make sure you provide only relevant log lines with requests.
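For reference, a minimal sketch of capturing the debug output the template asks for (`TF_LOG` and `TF_LOG_PATH` are standard Terraform environment variables; the log-file name is illustrative):

```bash
# Turn on debug logging and re-run the failing command.
export TF_LOG=DEBUG
# Optional: write the log to a file, which is easier to trim before
# pasting only the relevant lines into a gist (file name is illustrative).
export TF_LOG_PATH=terraform-debug.log
terraform apply
```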

.github/workflows/push.yml

Lines changed: 0 additions & 1 deletion
@@ -3,7 +3,6 @@ name: build
 on:
   pull_request:
     types: [opened, synchronize]
-    paths-ignore: ['**.md']
   push:
     branches: [master]
 

CHANGELOG.md

Lines changed: 254 additions & 237 deletions
Large diffs are not rendered by default.

CONTRIBUTING.md

Lines changed: 3 additions & 3 deletions
@@ -15,15 +15,15 @@ We happily welcome contributions to databricks-terraform. We use GitHub Issues t
 If you use Terraform 0.12, please execute the following curl command in your shell:
 
 ```bash
-curl https://raw.githubusercontent.com/databrickslabs/databricks-terraform/master/godownloader-databricks-provider.sh | bash -s -- -b $HOME/.terraform.d/plugins
+curl https://raw.githubusercontent.com/databricks/databricks-terraform/master/godownloader-databricks-provider.sh | bash -s -- -b $HOME/.terraform.d/plugins
 ```
 
 ## Installing from source
 
 On MacOS X, you can install GoLang through `brew install go`, on Debian-based Linux, you can install it by `sudo apt-get install golang -y`.
 
 ```bash
-git clone https://github.com/databrickslabs/terraform-provider-databricks.git
+git clone https://github.com/databricks/terraform-provider-databricks.git
 cd terraform-provider-databricks
 make install
 ```
@@ -46,7 +46,7 @@ In order to simplify development workflow, you should use [dev_overrides](https:
 $ cat ~/.terraformrc
 provider_installation {
   dev_overrides {
-    "databrickslabs/databricks" = "provider-binary"
+    "databricks/databricks" = "provider-binary"
   }
   direct {}
 }
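As a usage sketch of the `dev_overrides` workflow (assumptions: `provider-binary` stands for wherever your compiled binary lands, and `make install` is the build target shown above): once the override is in place, Terraform resolves `databricks/databricks` from the local path instead of the registry.

```bash
# Build the provider binary, then run plans straight against it.
# With dev_overrides active there is no need to run `terraform init`;
# Terraform prints a warning that the registry is being bypassed.
make install
terraform plan
```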

README.md

Lines changed: 21 additions & 8 deletions
@@ -2,10 +2,10 @@
 
 ![Resources](docs/resources.png)
 
-[AWS](docs/guides/aws-workspace.md) tutorial
+[Troubleshooting Guide](docs/guides/troubleshooting.md)
+| [AWS](docs/guides/aws-workspace.md) tutorial
 | [Azure](docs/guides/azure-workspace.md) tutorial
 | [End-to-end](docs/guides/workspace-management.md) tutorial
-| Migration from [0.3.x to 0.4.x](docs/guides/migration-0.4.x.md)
 | [Changelog](CHANGELOG.md)
 | [Authentication](docs/index.md)
 | [databricks_aws_assume_role_policy](docs/data-sources/aws_assume_role_policy.md) data
@@ -85,16 +85,16 @@
 | [databricks_zones](docs/data-sources/zones.md)
 | [Contributing and Development Guidelines](CONTRIBUTING.md)
 
-[![build](https://github.com/databrickslabs/terraform-provider-databricks/workflows/build/badge.svg?branch=master)](https://github.com/databrickslabs/terraform-provider-databricks/actions?query=workflow%3Abuild+branch%3Amaster) [![codecov](https://codecov.io/gh/databrickslabs/terraform-provider-databricks/branch/master/graph/badge.svg)](https://codecov.io/gh/databrickslabs/terraform-provider-databricks) ![lines](https://img.shields.io/tokei/lines/github/databrickslabs/terraform-provider-databricks) [![downloads](https://img.shields.io/github/downloads/databrickslabs/terraform-provider-databricks/total.svg)](https://hanadigital.github.io/grev/?user=databrickslabs&repo=terraform-provider-databricks)
+[![build](https://github.com/databricks/terraform-provider-databricks/workflows/build/badge.svg?branch=master)](https://github.com/databricks/terraform-provider-databricks/actions?query=workflow%3Abuild+branch%3Amaster) [![codecov](https://codecov.io/gh/databricks/terraform-provider-databricks/branch/master/graph/badge.svg)](https://codecov.io/gh/databricks/terraform-provider-databricks) ![lines](https://img.shields.io/tokei/lines/github/databricks/terraform-provider-databricks) [![downloads](https://img.shields.io/github/downloads/databricks/terraform-provider-databricks/total.svg)](https://hanadigital.github.io/grev/?user=databricks&repo=terraform-provider-databricks)
 
-If you use Terraform 0.13 or newer, please refer to instructions specified at [registry page](https://registry.terraform.io/providers/databrickslabs/databricks/latest). If you use older versions of Terraform or want to build it from sources, please refer to [contributing guidelines](CONTRIBUTING.md) page.
+If you use Terraform 0.13 or newer, please refer to instructions specified at the [registry page](https://registry.terraform.io/providers/databricks/databricks/latest). If you use older versions of Terraform or want to build it from source, please refer to the [contributing guidelines](CONTRIBUTING.md) page.
 
 ```hcl
 terraform {
   required_providers {
     databricks = {
-      source = "databrickslabs/databricks"
-      version = "0.6.2"
+      source = "databricks/databricks"
+      version = "1.0.0"
     }
   }
 }
@@ -149,6 +149,19 @@ output "job_url" {
 
 Then run `terraform init` then `terraform apply` to apply the hcl code to your Databricks workspace.
 
-## Project Support
+# Switching from `databrickslabs` to `databricks` namespace
 
-**Important:** Projects in the `databrickslabs` GitHub account, including the Databricks Terraform Provider, are not formally supported by Databricks. They are maintained by Databricks Field teams and provided as-is. There is no service level agreement (SLA). Databricks makes no guarantees of any kind. If you discover an issue with the provider, please file a GitHub Issue on the repo, and it will be reviewed by project maintainers as time permits.
+To make the Databricks Terraform Provider generally available, we've moved it from [https://github.com/databrickslabs](https://github.com/databrickslabs) to [https://github.com/databricks](https://github.com/databricks). We've worked closely with the Terraform Registry team at HashiCorp to ensure a smooth migration. Existing Terraform deployments continue to work as expected without any action from your side. We ask you to replace `databrickslabs/databricks` with `databricks/databricks` in all your `.tf` files.
+
+You should have a [`.terraform.lock.hcl`](https://github.com/databrickslabs/terraform-provider-databricks/blob/v0.6.2/scripts/versions-lock.hcl) file in your state directory that is checked into source control. `terraform init` will give you the following warning:
+
+```
+Warning: Additional provider information from registry
+
+The remote registry returned warnings for registry.terraform.io/databrickslabs/databricks:
+- For users on Terraform 0.13 or greater, this provider has moved to databricks/databricks. Please update your source in required_providers.
+```
+
+After you replace `databrickslabs/databricks` with `databricks/databricks` in the `required_providers` block, the warning will disappear. Do a global "search and replace" in `*.tf` files. Alternatively, you can run `python3 -c "$(curl -Ls https://dbricks.co/updtfns)"` from the command line, which does all the boring work for you.
+
+If you didn't check in [`.terraform.lock.hcl`](https://www.terraform.io/language/files/dependency-lock#lock-file-location) to source code version control, you may see a `Failed to install provider` error. Please follow the simple steps described in the [troubleshooting guide](docs/guides/troubleshooting.md).
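For those doing the rename by hand, a minimal sketch of that search-and-replace (assuming GNU sed; on macOS/BSD use `sed -i ''`):

```bash
# Rewrite the provider source address in every .tf file below the
# current directory (GNU sed shown; on macOS/BSD use `sed -i '' ...`).
grep -rl --include='*.tf' 'databrickslabs/databricks' . \
  | xargs sed -i 's|databrickslabs/databricks|databricks/databricks|g'

# Re-initialize so the dependency lock file records the new source address.
terraform init -upgrade
```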

common/version.go

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@ package common
 import "context"
 
 var (
-	version = "0.6.2"
+	version = "1.0.0"
 	// ResourceName is resource name without databricks_ prefix
 	ResourceName contextKey = 1
 	// Provider is the current instance of provider

docs/guides/aws-e2-firewall-hub-and-spoke.md

Lines changed: 6 additions & 6 deletions
@@ -6,7 +6,7 @@ page_title: "Provisioning AWS Databricks E2 with a Hub & Spoke firewall for data
 
 You can provision multiple Databricks workspaces with Terraform and where many Databricks workspaces are deployed, we recommend a hub and spoke topology reference architecture, powered by AWS Transit Gateway. The hub will consist of a central inspection and egress virtual private cloud (VPC), while the Spoke VPC houses federated Databricks workspaces for different business units or segregated teams. In this way, you create your own version of a centralized deployment model for your egress architecture, as is recommended for large enterprises. For more information please visit [Data Exfiltration Protection With Databricks on AWS](https://databricks.com/blog/2021/02/02/data-exfiltration-protection-with-databricks-on-aws.html).
 
-![Data Exfiltration](https://raw.githubusercontent.com/databrickslabs/terraform-provider-databricks/master/docs/images/aws-exfiltration-replace-1.png)
+![Data Exfiltration](https://raw.githubusercontent.com/databricks/terraform-provider-databricks/master/docs/images/aws-exfiltration-replace-1.png)
 
 ## Provider initialization for E2 workspaces
 
@@ -90,7 +90,7 @@ Before [managing workspace](workspace-management.md), you have to create:
 terraform {
   required_providers {
     databricks = {
-      source = "databrickslabs/databricks"
+      source = "databricks/databricks"
     }
     aws = {
       source = "hashicorp/aws"
@@ -120,7 +120,7 @@ The very first step is Hub & Spoke VPC creation. Please consult [main documentat
 
 First step is to create Spoke VPC which houses federated Databricks workspaces for different business units or segregated teams.
 
-![SpokeVPC](https://raw.githubusercontent.com/databrickslabs/terraform-provider-databricks/master/docs/images/aws-e2-firewall-spoke-vpc.png)
+![SpokeVPC](https://raw.githubusercontent.com/databricks/terraform-provider-databricks/master/docs/images/aws-e2-firewall-spoke-vpc.png)
 
 ```hcl
 data "aws_availability_zones" "available" {}
@@ -303,7 +303,7 @@ module "vpc_endpoints" {
 ### Hub VPC
 The hub will consist of a central inspection and egress virtual private cloud (VPC). We're going to create a central inspection/egress VPC, which once we’ve finished should look like this:
 
-![HubVPC](https://raw.githubusercontent.com/databrickslabs/terraform-provider-databricks/master/docs/images/aws-e2-firewall-hub-vpc.png)
+![HubVPC](https://raw.githubusercontent.com/databricks/terraform-provider-databricks/master/docs/images/aws-e2-firewall-hub-vpc.png)
 
 ```hcl
 /* Create VPC */
@@ -461,7 +461,7 @@ Now that our spoke and inspection/egress VPCs are ready to go, all you need to d
 First, we're going to create a Transit Gateway and link our Databricks data plane via TGW subnets.
 All of the logic that determines what routes are going via a Transit Gateway is encapsulated within Transit Gateway Route Tables. We’re going to create some TGW routes tables for our Hub & Spoke networks.
 
-![TransitGateway](https://raw.githubusercontent.com/databrickslabs/terraform-provider-databricks/master/docs/images/aws-e2-firewall-tgw.png)
+![TransitGateway](https://raw.githubusercontent.com/databricks/terraform-provider-databricks/master/docs/images/aws-e2-firewall-tgw.png)
 
 ```hcl
 //Create transit gateway
@@ -542,7 +542,7 @@ resource "aws_route" "hub_nat_to_tgw" {
 ## AWS Network Firewall
 Once [VPC](#vpc) is ready, we're going to create AWS Network Firewall for your VPC that restricts outbound http/s traffic to an approved set of Fully Qualified Domain Names (FQDNs).
 
-![AWS Network Firewall](https://raw.githubusercontent.com/databrickslabs/terraform-provider-databricks/master/docs/images/aws-e2-firewall-config.png)
+![AWS Network Firewall](https://raw.githubusercontent.com/databricks/terraform-provider-databricks/master/docs/images/aws-e2-firewall-config.png)
 
 ### AWS Firewall Rule Groups
 

docs/guides/aws-e2-firewall-workspace.md

Lines changed: 2 additions & 2 deletions
@@ -8,7 +8,7 @@ You can provision multiple Databricks workspaces with Terraform. This example sh
 
 For more information please visit [Data Exfiltration Protection With Databricks on AWS](https://databricks.com/blog/2021/02/02/data-exfiltration-protection-with-databricks-on-aws.html).
 
-![Data Exfiltration_Workspace](https://raw.githubusercontent.com/databrickslabs/terraform-provider-databricks/master/docs/images/aws-e2-firewall-workspace.png)
+![Data Exfiltration_Workspace](https://raw.githubusercontent.com/databricks/terraform-provider-databricks/master/docs/images/aws-e2-firewall-workspace.png)
 
 ## Provider initialization for E2 workspaces
 
@@ -88,7 +88,7 @@ Before [managing workspace](workspace-management.md), you have to create:
 terraform {
   required_providers {
     databricks = {
-      source = "databrickslabs/databricks"
+      source = "databricks/databricks"
     }
     aws = {
       source = "hashicorp/aws"

docs/guides/aws-private-link-workspace.md

Lines changed: 5 additions & 5 deletions
@@ -8,7 +8,7 @@ page_title: "Provisioning Databricks on AWS with PrivateLink"
 
 Databricks PrivateLink support enables private connectivity between users and their Databricks workspaces and between clusters on the data plane and core services on the control plane within the Databricks workspace infrastructure. You can use Terraform to deploy the underlying cloud resources and the private access settings resources automatically, using a programmatic approach. This guide assumes you are deploying into an existing VPC and you have set up credentials and storage configurations as per prior examples, notably here.
 
-![Private link backend](https://raw.githubusercontent.com/databrickslabs/terraform-provider-databricks/master/docs/images/aws-e2-private-link-backend.png)
+![Private link backend](https://raw.githubusercontent.com/databricks/terraform-provider-databricks/master/docs/images/aws-e2-private-link-backend.png)
 
 This guide uses the following variables in configurations:
 
@@ -23,8 +23,8 @@ This guide uses the following variables in configurations:
 - `relay_vpce_service` - Choose the region-specific service from this table.
 - `vpce_subnet_cidr` - CIDR range for the subnet chosen for the VPC endpoint.
 - `tags` - tags for the Private Link backend setup.
-- `root_bucket_name` - AWS bucket name required for [databricks_mws_storage_configurations](https://registry.terraform.io/providers/databrickslabs/databricks/latest/docs/resources/mws_storage_configurations).
-- `cross_account_arn` - AWS EC2 role ARN required for [databricks_mws_credentials](https://registry.terraform.io/providers/databrickslabs/databricks/latest/docs/resources/mws_credentials).
+- `root_bucket_name` - AWS bucket name required for [databricks_mws_storage_configurations](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/mws_storage_configurations).
+- `cross_account_arn` - AWS EC2 role ARN required for [databricks_mws_credentials](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/mws_credentials).
 
 This guide is provided as-is and you can use this guide as the basis for your custom Terraform module.
 
@@ -44,7 +44,7 @@ Initialize [provider with `mws` alias](https://www.terraform.io/language/provide
 terraform {
   required_providers {
     databricks = {
-      source = "databrickslabs/databricks"
+      source = "databricks/databricks"
     }
     aws = {
       source = "hashicorp/aws"
@@ -273,7 +273,7 @@ resource "databricks_mws_networks" "this" {
 
 For a workspace to support any of the PrivateLink connectivity scenarios, the workspace must be created with an attached [databricks_mws_private_access_settings](../resources/mws_private_access_settings.md) resource.
 
-The credentials ID which is referenced below is one of the attributes which is created as a result of configuring the cross-account IAM role, which Databricks uses to orchestrate EC2 resources. The credentials are created via [databricks_mws_credentials](https://registry.terraform.io/providers/databrickslabs/databricks/latest/docs/resources/mws_credentials). Similarly, the storage configuration ID is obtained from the [databricks_mws_storage_configurations](https://registry.terraform.io/providers/databrickslabs/databricks/latest/docs/resources/mws_storage_configurations) resource.
+The credentials ID which is referenced below is one of the attributes which is created as a result of configuring the cross-account IAM role, which Databricks uses to orchestrate EC2 resources. The credentials are created via [databricks_mws_credentials](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/mws_credentials). Similarly, the storage configuration ID is obtained from the [databricks_mws_storage_configurations](https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/mws_storage_configurations) resource.
 
 ```hcl
 resource "databricks_mws_private_access_settings" "pas" {

docs/guides/aws-workspace.md

Lines changed: 4 additions & 4 deletions
@@ -6,7 +6,7 @@ page_title: "Provisioning AWS Databricks E2"
 
 You can provision multiple Databricks workspaces with Terraform.
 
-![Simplest multiworkspace](https://github.com/databrickslabs/terraform-provider-databricks/raw/master/docs/simplest-multiworkspace.png)
+![Simplest multiworkspace](https://github.com/databricks/terraform-provider-databricks/raw/master/docs/simplest-multiworkspace.png)
 
 ## Provider initialization for E2 workspaces
 
@@ -53,7 +53,7 @@ Before [managing workspace](workspace-management.md), you have to create:
 terraform {
   required_providers {
     databricks = {
-      source = "databrickslabs/databricks"
+      source = "databricks/databricks"
     }
     aws = {
       source = "hashicorp/aws"
@@ -335,10 +335,10 @@ Error: MALFORMED_REQUEST: Failed credentials validation checks: Spot Cancellatio
 
 - Try creating workspace from UI:
 
-![create_workspace_error](https://github.com/databrickslabs/terraform-provider-databricks/raw/master/docs/images/create_workspace_error.png)
+![create_workspace_error](https://github.com/databricks/terraform-provider-databricks/raw/master/docs/images/create_workspace_error.png)
 
 
 - Verify if the role and policy exists (assume role should allow external id)
 
-![iam_role_trust_error](https://github.com/databrickslabs/terraform-provider-databricks/raw/master/docs/images/iam_role_trust_error.png)
+![iam_role_trust_error](https://github.com/databricks/terraform-provider-databricks/raw/master/docs/images/iam_role_trust_error.png)
 
