
Commit 2ce4496

Update developer documentation (#997)
Get new developers started easier
1 parent bb43dd3 commit 2ce4496

File tree

1 file changed (+22, -137 lines)


CONTRIBUTING.md

Lines changed: 22 additions & 137 deletions
@@ -1,19 +1,11 @@
 Contributing to Databricks Terraform Provider
 ---
 
-- [Contributing to Databricks Terraform Provider](#contributing-to-databricks-terraform-provider)
 - [Installing from source](#installing-from-source)
 - [Developing provider](#developing-provider)
-- [Developing with Visual Studio Code Devcontainers](#developing-with-visual-studio-code-devcontainers)
-- [Building and Installing with Docker](#building-and-installing-with-docker)
-- [Testing](#testing)
+- [Adding a new resource](#adding-a-new-resource)
 - [Code conventions](#code-conventions)
 - [Linting](#linting)
-- [Unit testing resources](#unit-testing-resources)
-- [Generating asserts for the first time in test](#generating-asserts-for-the-first-time-in-test)
-- [Random naming anywhere](#random-naming-anywhere)
-- [Integration Testing](#integration-testing)
-- [Pre-release procedure](#pre-release-procedure)
 
 We happily welcome contributions to databricks-terraform. We use GitHub Issues to track community reported issues and GitHub Pull Requests for accepting changes.
 
@@ -27,8 +19,6 @@ curl https://raw.githubusercontent.com/databrickslabs/databricks-terraform/maste
 
 ## Installing from source
 
-The following command (tested on Ubuntu 20.04) will install `make`, `golang`, `git` with all of the dependent packages as well as Databricks Terrafrom provider from sources. Required version of GoLang is at least 1.13. Required version of terraform is at least 0.12.
-
 On MacOS X, you can install GoLang through `brew install go`, on Debian-based Linux, you can install it by `sudo apt-get install golang -y`.
 
 ```bash
@@ -41,6 +31,18 @@ Most likely, `terraform init -upgrade -verify-plugins=false -lock=false` would b
 
 ## Developing provider
 
+In order to simplify the development workflow, you should use the [dev_overrides](https://www.terraform.io/cli/config/config-file#development-overrides-for-provider-developers) section in your `~/.terraformrc` file. Please run `make build` and replace "provider-binary" with the path to the `terraform-provider-databricks` executable in your current working directory:
+
+```
+$ cat ~/.terraformrc
+provider_installation {
+  dev_overrides {
+    "databrickslabs/databricks" = "provider-binary"
+  }
+  direct {}
+}
+```
+
 After installing necessary software for building provider from sources, you should install `staticcheck` and `gotestsum` in order to run `make test`.
 
 Make sure you have `$GOPATH/bin` in your `$PATH`:
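The `$GOPATH/bin` requirement in the context above can be checked mechanically. A minimal sketch in Go, assuming GoLang's default of `$HOME/go` when `GOPATH` is unset; nothing here is part of the provider itself:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

// isOnPath reports whether dir appears as an entry of a PATH-style value.
func isOnPath(dir, pathVar string) bool {
	for _, p := range strings.Split(pathVar, string(os.PathListSeparator)) {
		if p == dir {
			return true
		}
	}
	return false
}

func main() {
	gopath := os.Getenv("GOPATH")
	if gopath == "" {
		// `go env GOPATH` defaults to $HOME/go when GOPATH is unset
		home, _ := os.UserHomeDir()
		gopath = filepath.Join(home, "go")
	}
	fmt.Println(isOnPath(filepath.Join(gopath, "bin"), os.Getenv("PATH")))
}
```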
@@ -63,39 +65,11 @@ Installing `goimports`:
 go get golang.org/x/tools/cmd/goimports
 ```
 
-After this, you should be able to run `make test`.
-
-## Developing with Visual Studio Code Devcontainers
-
-This project has configuration for working with [Visual Studio Code Devcontainers](https://code.visualstudio.com/docs/remote/containers) - this allows you to containerise your development prerequisites (e.g. golang, terraform). To use this you will need [Visual Studio Code](https://code.visualstudio.com/) and [Docker](https://www.docker.com/products/docker-desktop).
-
-To get started, clone this repo and open the folder with Visual Studio Code. If you don't have the [Remote Development extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.vscode-remote-extensionpack) then you should be prompted to install it.
-
-Once the folder is loaded and the extension is installed you should be prompted to re-open the folder in a devcontainer. This will built and run the container image with the correct tools (and versions) ready to start working on and building the code. The in-built terminal will launch a shell inside the container for running `make` commands etc.
-
-See the docs for more details on working with [devcontainers](https://code.visualstudio.com/docs/remote/containers).
-
-## Building and Installing with Docker
+After this, you should be able to run `make coverage` to run the tests and see the coverage.
 
-To install and build the code if you dont want to install golang, terraform, etc. All you need is docker and git.
-
-First make sure you clone the repository and you are in the directory.
-
-Then build the docker image with this command:
-
-```bash
-$ docker build -t databricks-terraform .
-```
+## Debugging
 
-Then run the execute the terraform binary via the following command and volume mount. Make sure that you are in the directory
-with the terraform code. The following command you can execute the following commands and additional ones as part of
-the terraform binary.
-
-```bash
-$ docker run -it -v $(pwd):/workpace -w /workpace databricks-terraform init
-$ docker run -it -v $(pwd):/workpace -w /workpace databricks-terraform plan
-$ docker run -it -v $(pwd):/workpace -w /workpace databricks-terraform apply
-```
+**TF_LOG=DEBUG terraform apply** allows you to see the internal logs from `terraform apply`.
 
 ## Adding a new resource
 
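The `TF_LOG=DEBUG` prefix this commit adds to the new Debugging section is ordinary per-invocation environment inheritance. A hedged sketch of the same mechanism in Go, with a trivial `sh` command standing in for `terraform apply` so the snippet runs anywhere; none of this is terraform's own code:

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"strings"
)

// runWithEnv runs a command with one extra environment variable set for that
// invocation only, the way `TF_LOG=DEBUG terraform apply` does in a shell.
func runWithEnv(key, value, name string, args ...string) (string, error) {
	cmd := exec.Command(name, args...)
	cmd.Env = append(os.Environ(), key+"="+value)
	out, err := cmd.Output()
	return strings.TrimSpace(string(out)), err
}

func main() {
	// stand-in for terraform: echo the variable back
	out, err := runWithEnv("TF_LOG", "DEBUG", "sh", "-c", "echo $TF_LOG")
	if err != nil {
		panic(err)
	}
	fmt.Println(out) // DEBUG
	// the variable is scoped to the child process, not the parent environment
	fmt.Println(os.Getenv("TF_LOG") == "") // true unless TF_LOG was already exported
}
```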
@@ -125,7 +99,7 @@ Some interesting points to note here:
 * `default:X` to set a default value for a field
 * `max_items:N` to set the maximum number of items for a multi-valued parameter
 * `slice_set` to indicate that a the parameter should accept a set instead of a list
-* Do not use bare references to structs in the model; rather, use pointers to structs. Maps and slices are permitted, as well as the following primitive types: int, int32, int64, float64, bool, string.
+* Do not use bare references to structs in the model; rather, use pointers to structs. Maps and slices are permitted, as well as the following primitive types: int, int32, int64, float64, bool, string.
 See `typeToSchema` in `common/reflect_resource.go` for the up-to-date list of all supported field types and values for the `tf` tag.
 
 *Define the Terraform schema.* This is made easy for you by the `StructToSchema` method in the `common` package, which converts your struct automatically to a Terraform schema, accepting also a function allowing the user to post-process the automatically generated schema, if needed.
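The `tf` tag conventions listed in this hunk are plain Go struct tags. A minimal, self-contained sketch of how such tags can be read with `reflect`; the `Pipeline` type, its fields, and the `optional` value are invented for illustration, and the provider's real parser is `typeToSchema` in `common/reflect_resource.go`:

```go
package main

import (
	"fmt"
	"reflect"
)

// Pipeline is a hypothetical model struct following the conventions above:
// pointers to structs instead of bare struct references, plus `tf` tags
// carrying schema hints such as default values and set semantics.
type Pipeline struct {
	Name    string    `json:"name"`
	MinSize int       `json:"min_size,omitempty" tf:"default:3"`
	Tags    []string  `json:"tags,omitempty" tf:"slice_set,max_items:5"`
	Parent  *Pipeline `json:"parent,omitempty" tf:"optional"`
}

// tfTag returns the raw `tf` tag of a field, or "" if absent.
func tfTag(fieldName string) string {
	f, ok := reflect.TypeOf(Pipeline{}).FieldByName(fieldName)
	if !ok {
		return ""
	}
	return f.Tag.Get("tf")
}

func main() {
	t := reflect.TypeOf(Pipeline{})
	for i := 0; i < t.NumField(); i++ {
		f := t.Field(i)
		fmt.Printf("%s -> tf=%q\n", f.Name, f.Tag.Get("tf"))
	}
}
```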
@@ -271,10 +245,6 @@ func TestPreviewAccPipelineResource_CreatePipeline(t *testing.T) {
 }
 ```
 
-## Debugging
-
-**TF_LOG=DEBUG terraform apply** allows you to see the internal logs from `terraform apply`.
-
 ## Testing
 
 * [Integration tests](scripts/README.md) should be run at a client level against both azure and aws to maintain sdk parity against both apis.
@@ -297,97 +267,12 @@ func TestPreviewAccPipelineResource_CreatePipeline(t *testing.T) {
 Please use makefile for linting. If you run `staticcheck` by itself it will fail due to different tags containing same functions.
 So please run `make lint` instead.
 
-## Unit testing resources
-
-Eventually, all of resources would be automatically checked for a unit test presence. `TestGenerateTestCodeStubs` is going to fail, when resource has certain test cases missing. Until all existing resources have tests, you can generate stub code, which will be logged to stdout by changing these lines of `generate_test.go` with name of resource you're creating:
-
-```go
-for name, resource := range p.ResourcesMap {
-    if name != "databricks_user" {
-        continue
-    }
-    //...
-```
-
-In order to unit test a resource, which runs fast and could be included in code coverage, one should use `ResourceTester`, that launches embedded HTTP server with `HTTPFixture`'s containing all calls that should have been made in given scenario. Some may argue that this is not a pure unit test, because it creates a side effect in form of embedded server, though it's always on different random port, making it possible to execute these tests in parallel. Therefore comments about non-pure unit tests will be ignored, if they use `ResourceTester` helper.
-
-```go
-func TestPermissionsCreate(t *testing.T) {
-    _, err := internal.ResourceTester(t, []qa.HTTPFixture{
-        {
-            Method: http.MethodPatch,
-            // requires full URI
-            Resource: "/api/2.0/preview/permissions/clusters/abc",
-            // works with entities, not JSON. Diff is displayed in case of missmatch
-            ExpectedRequest: AccessControlChangeList{
-                AccessControlList: []*AccessControlChange{
-                    {
-                        UserName:        &TestingUser,
-                        PermissionLevel: "CAN_USE",
-                    },
-                },
-            },
-        },
-        {
-            Method:   http.MethodGet,
-            Resource: "/api/2.0/preview/permissions/clusters/abc",
-            Response: AccessControlChangeList{
-                AccessControlList: []*AccessControlChange{
-                    {
-                        UserName:        &TestingUser,
-                        PermissionLevel: "CAN_MANAGE",
-                    },
-                },
-            },
-        },
-        {
-            Method:   http.MethodGet,
-            Resource: "/api/2.0/preview/scim/v2/Me",
-            Response: User{
-                UserName: "chuck.norris",
-            },
-        },
-    },
-    // next argument is function, that creates resource (to make schema for ResourceData)
-    resourcePermissions,
-    // state represented as native structure (though a bit clunky)
-    map[string]interface{}{
-        "cluster_id": "abc",
-        "access_control": []interface{}{
-            map[string]interface{}{
-                "user_name":        TestingUser,
-                "permission_level": "CAN_USE",
-            },
-        },
-    },
-    // the last argument is a function, that performs a stage on resource (Create/update/delete/read)
-    resourcePermissionsCreate)
-    assert.NoError(t, err, err)
-}
-```
-
-Each resource should have both unit and integration tests.
-
-## Generating asserts for the first time in test
-
-```go
-for k, v := range d.State().Attributes {
-    fmt.Printf("assert.Equal(t, %#v, d.Get(%#v))\n", v, k)
-}
-```
-
-## Random naming anywhere
-
-Terraform SDK provides `randomName := acctest.RandStringFromCharSet(10, acctest.CharSetAlphaNum)` for convenient random names generation.
+## Developing with Visual Studio Code Devcontainers
 
-## Integration Testing
+This project has configuration for working with [Visual Studio Code Devcontainers](https://code.visualstudio.com/docs/remote/containers) - this allows you to containerise your development prerequisites (e.g. golang, terraform). To use this you will need [Visual Studio Code](https://code.visualstudio.com/) and [Docker](https://www.docker.com/products/docker-desktop).
 
-Currently Databricks supports two cloud providers `azure` and `aws` thus integration testing with the correct cloud service provider is
-crucial for making sure that the provider behaves as expected on all supported clouds. Please read [dedicated instructions](scripts/README.md) for details.
+To get started, clone this repo and open the folder with Visual Studio Code. If you don't have the [Remote Development extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.vscode-remote-extensionpack) then you should be prompted to install it.
 
-## Pre-release procedure
+Once the folder is loaded and the extension is installed you should be prompted to re-open the folder in a devcontainer. This will build and run the container image with the correct tools (and versions) ready to start working on and building the code. The in-built terminal will launch a shell inside the container for running `make` commands etc.
 
-1. `make test-azure`
-2. `make test-mws` if MWS related code changed given release.
-3. Create release notes.
-4. Perfrom backwards-compatibility checks and make proper notes.
+See the docs for more details on working with [devcontainers](https://code.visualstudio.com/docs/remote/containers).
