Commit e476d0b

generate basic terraform registry style documentation (#122)

* generate basic documentation to test terraform registry documentation rendering
* removed unwanted tab

1 parent f00e41b commit e476d0b

File tree

2 files changed: +313 -0 lines changed


docs/index.md

Lines changed: 276 additions & 0 deletions
@@ -0,0 +1,276 @@
---
layout: "databricks"
page_title: "Provider: Databricks"
sidebar_current: "docs-databricks-index"
description: |-
  Terraform provider for Databricks.
---

# Databricks Provider

The Databricks provider is used to interact with Databricks resources. It must be configured so that
Terraform can provision resources in your Databricks workspace on your behalf.

## Example Usage

### Token Based Auth

```hcl
provider "databricks" {
  host  = "http://databricks.domain.com"
  token = "dapitokenhere"
}

resource "databricks_scim_user" "my-user" {
  user_name    = "[email protected]"
  display_name = "Test User"
}
```

### Basic Auth

```hcl
provider "databricks" {
  host = "http://databricks.domain.com"
  basic_auth {
    username = var.user
    password = var.password
  }
}

resource "databricks_scim_user" "my-user" {
  user_name    = "[email protected]"
  display_name = "Test User"
}
```

### Profile Based Auth

```hcl
provider "databricks" {
  config_file = "~/.databrickscfg"
  profile     = "DEFAULT"
}

resource "databricks_scim_user" "my-user" {
  user_name    = "[email protected]"
  display_name = "Test User"
}
```

### Azure SP Auth

```hcl
provider "azurerm" {
  client_id       = var.client_id
  client_secret   = var.client_secret
  tenant_id       = var.tenant_id
  subscription_id = var.subscription_id
}

resource "azurerm_databricks_workspace" "demo_test_workspace" {
  location                    = "centralus"
  name                        = "my-workspace-name"
  resource_group_name         = var.resource_group
  managed_resource_group_name = var.managed_resource_group_name
  sku                         = "premium"
}

provider "databricks" {
  azure_auth = {
    managed_resource_group = azurerm_databricks_workspace.demo_test_workspace.managed_resource_group_name
    azure_region           = azurerm_databricks_workspace.demo_test_workspace.location
    workspace_name         = azurerm_databricks_workspace.demo_test_workspace.name
    resource_group         = azurerm_databricks_workspace.demo_test_workspace.resource_group_name
    client_id              = var.client_id
    client_secret          = var.client_secret
    tenant_id              = var.tenant_id
    subscription_id        = var.subscription_id
  }
}

resource "databricks_scim_user" "my-user" {
  user_name    = "[email protected]"
  display_name = "Test User"
}
```

!> **Warning** Hard-coding credentials is not recommended. It is best to store credentials in
environment variables or in a tfvars file.

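The warning above suggests keeping credentials out of the configuration itself. A minimal sketch of that approach, using a hypothetical `databricks_token` input variable (the `sensitive` attribute assumes Terraform 0.14 or later):

```hcl
# Hypothetical variable name for illustration: declare the token as an
# input variable instead of hard-coding it in the provider block.
variable "databricks_token" {
  type      = string
  sensitive = true # assumes Terraform >= 0.14
}

provider "databricks" {
  host  = "http://databricks.domain.com"
  token = var.databricks_token
}
```

The value can then live in a git-ignored `terraform.tfvars` file (`databricks_token = "dapitokenhere"`) or be supplied via the `TF_VAR_databricks_token` environment variable.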
## Authentication

There are currently two supported methods for authenticating to the Databricks platform to create resources:

* **API Token**
* **Azure Service Principal Authentication**

-> **Note** **Azure Service Principal Authentication** will only work on Azure Databricks, whereas API Token
authentication will work on both **Azure** and **AWS**.

### API Token

The Databricks hostname for the workspace and an API token can be provided here. This configuration is very
similar to the Databricks CLI.

```hcl
provider "databricks" {
  host  = "http://databricks.domain.com"
  token = "dapitokenhere"
}
```

!> **Warning** Hard-coding credentials is not recommended. It is best to store credentials in
environment variables or in a tfvars file.

### Azure Service Principal Auth

```hcl
provider "databricks" {
  azure_auth = {
    managed_resource_group = azurerm_databricks_workspace.sri_test_workspace.managed_resource_group_name
    azure_region           = azurerm_databricks_workspace.sri_test_workspace.location
    workspace_name         = azurerm_databricks_workspace.sri_test_workspace.name
    resource_group         = azurerm_databricks_workspace.sri_test_workspace.resource_group_name
    client_id              = var.client_id
    client_secret          = var.client_secret
    tenant_id              = var.tenant_id
    subscription_id        = var.subscription_id
  }
}
```

### Environment variables

The following provider arguments can instead be supplied via environment variables:

* `host` → `DATABRICKS_HOST`
* `token` → `DATABRICKS_TOKEN`
* `basic_auth.username` → `DATABRICKS_USERNAME`
* `basic_auth.password` → `DATABRICKS_PASSWORD`
* `config_file` → `DATABRICKS_CONFIG_FILE`
* `managed_resource_group` → `DATABRICKS_AZURE_MANAGED_RESOURCE_GROUP`
* `azure_region` → `AZURE_REGION`
* `workspace_name` → `DATABRICKS_AZURE_WORKSPACE_NAME`
* `resource_group` → `DATABRICKS_AZURE_RESOURCE_GROUP`
* `subscription_id` → `DATABRICKS_AZURE_SUBSCRIPTION_ID` or `ARM_SUBSCRIPTION_ID`
* `client_secret` → `DATABRICKS_AZURE_CLIENT_SECRET` or `ARM_CLIENT_SECRET`
* `client_id` → `DATABRICKS_AZURE_CLIENT_ID` or `ARM_CLIENT_ID`
* `tenant_id` → `DATABRICKS_AZURE_TENANT_ID` or `ARM_TENANT_ID`

For example, you can have the following minimal provider definition:

```hcl
provider "databricks" {}
```

Then export the corresponding environment variables before running Terraform, and they will be injected into the provider:

```bash
$ export DATABRICKS_HOST="http://databricks.domain.com"
$ export DATABRICKS_TOKEN="dapitokenhere"
$ terraform plan
```

## Argument Reference

The following arguments are supported by the `databricks` provider block:

* `host` - (optional) The host of the Databricks workspace. This will be the URL that you use to log in to your
  workspace. Alternatively, you can provide this value via the environment variable `DATABRICKS_HOST`.

* `token` - (optional) The API token used to authenticate to the workspace. Alternatively, you can provide this
  value via the environment variable `DATABRICKS_TOKEN`.

* `basic_auth` - (optional) A basic_auth block ([documented below](#basic_auth-configuration-block)) used to
  authenticate to Databricks via basic auth through a user that has access to the workspace. This is optional,
  as you can use API-token-based auth instead.

* `config_file` - (optional) Location of the Databricks CLI credentials file created by the
  `databricks configure --token` command. This field defaults to `~/.databrickscfg`; see
  https://docs.databricks.com/dev-tools/cli/index.html#set-up-authentication for details. Config file
  credentials are only used when none of host/token/basic_auth/azure_auth are provided. Alternatively, you can
  provide this value via the environment variable `DATABRICKS_CONFIG_FILE`.

* `profile` - (optional) Connection profile specified within `~/.databrickscfg`; see
  https://docs.databricks.com/dev-tools/cli/index.html#connection-profiles for documentation. This field
  defaults to `DEFAULT`.

* `azure_auth` - (optional) An azure_auth block ([documented below](#azure_auth-configuration-block)) required
  to authenticate to Databricks via an Azure service principal that has access to the workspace. This is
  optional, as you can use API-token-based auth instead.

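For reference, a `~/.databrickscfg` file as produced by `databricks configure --token` is a simple INI-style file along these lines (the values shown are placeholders reusing the examples above; see the Databricks CLI docs linked earlier for the authoritative format):

```ini
[DEFAULT]
host = http://databricks.domain.com
token = dapitokenhere
```
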
### basic_auth Configuration Block

Example:

```hcl
basic_auth = {
  username = "user"
  password = "mypass-123"
}
```

The basic_auth block contains the following arguments:

* `username` - (required) The username of a user that can log in to the workspace.
  Alternatively, you can provide this value via the environment variable `DATABRICKS_USERNAME`.

* `password` - (required) The password of a user that can log in to the workspace.
  Alternatively, you can provide this value via the environment variable `DATABRICKS_PASSWORD`.


### azure_auth Configuration Block

Example:

```hcl
azure_auth = {
  azure_region           = "centralus"
  managed_resource_group = "my-databricks-managed-rg"
  workspace_name         = "test-managed-workspace"
  resource_group         = "1-test-rg"
  client_id              = var.client_id
  client_secret          = var.client_secret
  tenant_id              = var.tenant_id
  subscription_id        = var.subscription_id
}
```

This block configures authentication to Databricks via an Azure service principal that has access to the
workspace. It is optional, as you can use API-token-based auth instead.
The azure_auth block contains the following arguments:

* `managed_resource_group` - (required) The managed resource group created when the Databricks workspace was
  provisioned. Alternatively, you can provide this value via the environment variable
  `DATABRICKS_AZURE_MANAGED_RESOURCE_GROUP`.

* `azure_region` - (required) The Azure region in which your workspace is deployed.
  Alternatively, you can provide this value via the environment variable `AZURE_REGION`.

* `workspace_name` - (required) The name of your Azure Databricks Workspace.
  Alternatively, you can provide this value via the environment variable `DATABRICKS_AZURE_WORKSPACE_NAME`.

* `resource_group` - (required) The resource group in which your Azure Databricks Workspace resides.
  Alternatively, you can provide this value via the environment variable `DATABRICKS_AZURE_RESOURCE_GROUP`.

* `subscription_id` - (required) The Azure subscription ID in which your Azure Databricks Workspace resides.
  Alternatively, you can provide this value via the environment variable `DATABRICKS_AZURE_SUBSCRIPTION_ID` or
  `ARM_SUBSCRIPTION_ID`.

* `client_secret` - (required) The Azure Enterprise Application (service principal) client secret. This service
  principal requires contributor access to your Azure Databricks deployment. Alternatively, you can provide
  this value via the environment variable `DATABRICKS_AZURE_CLIENT_SECRET` or `ARM_CLIENT_SECRET`.

* `client_id` - (required) The Azure Enterprise Application (service principal) client ID. This service
  principal requires contributor access to your Azure Databricks deployment. Alternatively, you can provide
  this value via the environment variable `DATABRICKS_AZURE_CLIENT_ID` or `ARM_CLIENT_ID`.

* `tenant_id` - (required) The Azure Active Directory tenant ID in which the Enterprise Application (service
  principal) resides. Alternatively, you can provide this value via the environment variable
  `DATABRICKS_AZURE_TENANT_ID` or `ARM_TENANT_ID`.

Where there are multiple environment variable options, the `DATABRICKS_AZURE_*` variables take precedence,
and the `ARM_*` variables provide a way to share authentication configuration when using the
`databricks-terraform` provider alongside the `azurerm` provider.
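
As an illustration of that precedence rule (a sketch, not the provider's actual implementation), the lookup behaves like a shell-style fallback in which the `DATABRICKS_AZURE_*` value wins whenever both are set:

```shell
#!/bin/sh
# Sketch of the documented precedence: prefer DATABRICKS_AZURE_*,
# fall back to ARM_* when the former is unset.
lookup_subscription_id() {
  echo "${DATABRICKS_AZURE_SUBSCRIPTION_ID:-$ARM_SUBSCRIPTION_ID}"
}

export ARM_SUBSCRIPTION_ID="arm-sub"
export DATABRICKS_AZURE_SUBSCRIPTION_ID="dbx-sub"
lookup_subscription_id   # prints "dbx-sub" (DATABRICKS_AZURE_* wins)
```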

docs/resources/secret_scope.md

Lines changed: 37 additions & 0 deletions
@@ -0,0 +1,37 @@
# databricks_secret_scope Resource

This resource creates a Databricks-backed secret scope, in which secrets are stored in Databricks-managed
storage and encrypted with a cloud-based specific encryption key.

The scope name:

* Must be unique within a workspace.
* Must consist of alphanumeric characters, dashes, underscores, and periods, and may not exceed 128 characters.

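A hypothetical client-side check of these naming rules can be sketched in shell (this helper is illustrative only and is not part of the provider):

```shell
#!/bin/sh
# Sketch (not part of the provider): validate the scope-name rules
# above — alphanumerics, dashes, underscores, and periods only, and
# at most 128 characters.
valid_scope_name() {
  [ -n "$1" ] || return 1            # must be non-empty
  [ "${#1}" -le 128 ] || return 1    # length limit
  case "$1" in
    *[!A-Za-z0-9._-]*) return 1 ;;   # reject any disallowed character
  esac
}

valid_scope_name "terraform-demo-scope" && echo "ok"   # prints "ok"
```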
## Example Usage

```hcl
resource "databricks_secret_scope" "my-scope" {
  name                     = "terraform-demo-scope"
  initial_manage_principal = "users"
}
```

## Argument Reference

The following arguments are supported:

* `name` - (Required) Scope name requested by the user. Scope names must be unique within a workspace.

* `initial_manage_principal` - (Optional) The principal that is initially granted
  MANAGE permission to the created scope.

## Attribute Reference

In addition to all arguments above, the following attributes are exported:

* `id` - The ID of the secret scope object.

## Import

-> **Note** Importing this resource is not currently supported.
