Merged
157 changes: 121 additions & 36 deletions CONTRIBUTING.md
@@ -1,60 +1,145 @@
# Typical development workflow
# Contributing

Fork the repo, work on an issue
This guide explains how to set up your environment, make changes, and submit a PR.

## Updating the generated Kibana client.
## Development Setup

If your work involves the Kibana API, the endpoints may or may not be included in the generated client.
Check [generated/kbapi](./generated/kbapi/) for more details.
* Fork and clone the repo.
* Set up your preferred IDE (IntelliJ, VSCode, etc.).

## Acceptance tests
Requirements:
* [Terraform](https://www.terraform.io/downloads.html) >= 1.0.0
* [Go](https://golang.org/doc/install) >= 1.25
* Docker (for acceptance tests)

```bash
make docker-testacc
```
## Development Workflow

Run a single test with terraform debug enabled:
```bash
env TF_LOG=DEBUG make docker-testacc TESTARGS='-run ^TestAccResourceDataStreamLifecycle$$'
```
* Create a new branch for your changes.
* Make your changes. See [Useful Commands](#useful-commands) and [Debugging](#debugging).
  * GitHub Copilot can be used to help with these changes (see [Using Copilot](#using-copilot)).
* Validate your changes
* Run unit and acceptance tests (See [Running Acceptance Tests](#running-acceptance-tests)).
* Run `make fmt`, `make lint`.
* All checks also run automatically on every PR.
* Add a changelog entry in `CHANGELOG.md` under the `Unreleased` section. This will be included in the release notes of the next release.
* Submit your PR for review.
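For illustration, an entry under the `Unreleased` section might look like the sketch below (the change description and PR number are made up; follow the format of the existing entries in `CHANGELOG.md`):

```
## [Unreleased]

- Add new `elasticstack_example_widget` resource ([#9999](https://github.com/elastic/terraform-provider-elasticstack/pull/9999))
```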
**Member:**

Currently changelog entries link back to the PR introducing the change which means we first need to open the PR and then add the changelog entry. That situation has never really made sense, but it does make sense to be consistent for now.

IMO rather than improving the ordering here, it would likely make more sense to look at automatically adding changelog entries based on the PR title.

**Member (author):**

I'll reorder it for now. Though it would be nice to have this automated so the release job just collects release notes from all PRs included in the release.


A way to forward debug logs to a file:
```bash
env TF_ACC_LOG_PATH=/tmp/tf.log TF_ACC_LOG=DEBUG TF_LOG=DEBUG make docker-testacc
```
When creating new resources:
* Use the [Plugin Framework](https://developer.hashicorp.com/terraform/plugin/framework/getting-started/code-walkthrough) for new resources.
* Use an existing resource (e.g. `internal/elasticsearch/security/system_user`) as a template.
  * Some resources still use the deprecated Terraform SDK, so only resources built on the new Plugin Framework should be used as a reference.
* Use the generated API clients to interact with the Kibana APIs (see [Working with Generated API Clients](#working-with-generated-api-clients)).
* Add documentation and examples for the resource. Update the generated docs with `make docs-generate`.
**Member:**

We also need to add the doc template (example) as part of this work.

The docs generator doesn't (or didn't, I haven't checked recently) populate the subcategory field in the frontmatter which we're using to improve the docs organisation.

**Member (author):**

Yes, good point.

* Write unit and acceptance tests.
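The template step above can be sketched as a couple of shell commands; the paths and the `example_widget` name are illustrative only, not a prescribed layout:

```shell
#!/bin/sh
# Sketch: scaffold a new Plugin Framework resource by copying an existing one.
# Paths and names here are illustrative; run from the repository root.
SRC=internal/elasticsearch/security/system_user
DST=internal/elasticsearch/security/example_widget

if [ -d "$SRC" ]; then
  cp -r "$SRC" "$DST"
  # Files that still mention the old resource name and need renaming:
  grep -rl "system_user" "$DST"
else
  echo "expected to run from the repo root (missing $SRC)"
fi
```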


## Update documentation
## Using Copilot

Update documentation templates in `./templates` directory and re-generate docs via:
```bash
make docs-generate
```
GitHub Copilot can speed up development, but you’re responsible for correctness:
* Create an issue describing the desired change and acceptance criteria.
* Assign the issue to Copilot and iterate with prompts.
* Review outputs carefully; add tests and adjust as needed.
* Example issue: https://github.com/elastic/terraform-provider-elasticstack/issues/1219
**Member:**

So this is kind of correct, but with some limitations.

- Contributions should already have an open issue; people shouldn't be creating an issue just to get the coding agent involved. We also want to create some guidance about the content of that issue, and should aim to have some example issues which have provided good prompts to the coding agent.
- There are restrictions on who can assign an issue to Copilot. We need to make sure folks have a clear path there.
- IMO this section is targeted purely at Elasticians, whilst the rest of this doc mostly makes sense for any community member. I think there's enough content here (i.e. 'Adding support for Stack features via Copilot') to have a dedicated internal doc.

**Member (author):**

I'll remove it from this doc for now.

Should we have a separate doc in the repo for internal docs? The release process is also something that only applies to Elasticians.


## Update `./CHANGELOG.md`
### Useful Commands

The list of previous commits is a good example of what should be included in the changelog.
* `make build`: Build the provider.
* `make install`: Install the provider.
**Member:**

I don't know why we maintain this make target tbh, I've never used it and would suggest that using it as part of a development workflow is broken. IMO we shouldn't mention this as part of a contributing guide.

Should we remove the make target? It depends, I guess it makes sense if any users are building the provider from source. Do we need to make that workflow seamless though...

**Member (author):**

Yeah I also never use it. The default workflow should probably be to use the debugging env-var. I'll remove it from the guide.

* `make fmt`: Format the code.
* `make lint`: Lint the code.
**Member:**

**Suggested change:**

```diff
-* `make fmt`: Format the code.
-* `make lint`: Lint the code.
+* `make lint`: Lint the code.
```

Lint also formats.

* `make test`: Run unit tests.
* `make docs-generate`: Generate documentation.

#### Running Acceptance Tests

## Pull request
Acceptance tests spin up Elasticsearch, Kibana, and Fleet with Docker and run tests in a Go container.

Format the code before pushing:
Quick start (default stack version from Makefile):
```bash
make fmt
make docker-testacc
**Member:**

This path can be quite slow: it both spins up the Elastic Stack docker containers (Yay!) and runs the actual provider tests inside a golang docker container.

Repeated testing (i.e. what we'd expect in a dev workflow) is much faster if we re-use a single running Stack environment and run the tests directly on the dev machine.

**Suggested change:**

```diff
-make docker-testacc
+make docker-fleet
+make testacc
```

**Member (author):**

Sounds good, I updated the docs to use this faster workflow.

```

Check the linting:
Run a single test with terraform debug enabled:
```bash
make lint
env TF_LOG=DEBUG make docker-testacc TESTARGS='-run ^TestAccResourceDataStreamLifecycle$$'
```

Create a PR and check that the acceptance test matrix is green.

## Run provider with local terraform

TBD
A way to forward debug logs to a file:
```bash
env TF_ACC_LOG_PATH=/tmp/tf.log TF_ACC_LOG=DEBUG TF_LOG=DEBUG make docker-testacc
```

## Releasing
### Working with Generated API Clients

If your work involves the Kibana API, the API client can be generated directly from the Kibana OpenAPI specs:
- For Kibana APIs, use the generated client in `generated/kbapi`.
- To add new endpoints, see [generated/kbapi/README.md](generated/kbapi/README.md).
- Regenerate clients with:
```sh
make transform generate
```

The codebase includes a number of deprecated clients which should not be used anymore:
- `libs/go-kibana-rest`: Fork of an external library, which is not maintained anymore.
- `generated/alerting`
- `generated/connectors`
- `generated/slo`
- `internal/clients/*`: Manually written clients. These should only be used/extended if it is not possible to use the generated clients.
**Member:**

**Suggested change:**

```diff
-- `internal/clients/*`: Manually written clients. These should only be used/extended if it is not possible to use the generated clients.
```

This isn't really true. Much of this package configures one of the Stack clients, providing endpoints/credentials/etc from the provider configuration to the client.

There's some code in here which is used as a bridge between the provider resources and the older Kibana/Elasticsearch clients. We don't need to continue with that as much, but there are still cases where it may make sense; for instance, Kibana connectors have a bunch of non-trivial logic managing API defaults which kind of makes sense in a client layer.

I don't think there's a need for a hard rule on not making additions here.

**Member (author):**

True, I didn't realize this also contains all the wrapper code for the generated client. I'll remove that entry.


### Updating Documentation

Docs are generated from templates in `templates/` and examples in `examples/`.
* Update or add templates and examples.
* Run `make docs-generate` to produce files under `docs/`.
* Commit the generated files. `make lint` will fail if docs are stale.
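As a sketch, a resource doc template under `templates/resources/` typically carries frontmatter like the following (all field values are illustrative; mirror an existing template, and note the `subcategory` field used to organise the docs):

```
---
subcategory: "Elasticsearch"
page_title: "elasticstack_example_widget Resource"
description: |-
  Manages an example widget (illustrative description).
---

# Resource: elasticstack_example_widget

{{ .SchemaMarkdown | trimspace }}
```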

### Debugging

Run the provider in debug mode and reattach the provider in Terraform:
* Launch `main.go` with the `-debug` flag from your IDE.
* After launching, the provider will print an env var. Copy the printed `TF_REATTACH_PROVIDERS='{…}'` value.
* Export it in your shell where you run Terraform: `export TF_REATTACH_PROVIDERS='{…}'`.
* Terraform will now talk to your debug instance, and you can set breakpoints.
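A minimal sketch of the reattach step (the JSON value below is a placeholder; copy the exact line your debug instance prints):

```shell
# The provider started with -debug prints a line like this; all values below
# are placeholders, use the real output from your provider process.
export TF_REATTACH_PROVIDERS='{"registry.terraform.io/elastic/elasticstack":{"Protocol":"grpc","Pid":12345,"Addr":{"Network":"unix","String":"/tmp/plugin12345"}}}'

# Any terraform command run in this shell now attaches to the debug instance.
echo "$TF_REATTACH_PROVIDERS"
```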

## Project Structure

A quick overview of what's in each folder:

* `docs/` - Documentation files
* `data-sources/` - Documentation for Terraform data sources
* `guides/` - User guides and tutorials
* `resources/` - Documentation for Terraform resources
* `examples/` - Example Terraform configurations
* `cloud/` - Examples using the cloud to launch testing stacks
* `data-sources/` - Data source usage examples
* `resources/` - Resource usage examples
* `provider/` - Provider configuration examples
* `generated/` - Auto-generated clients from the `generate-clients` make target
* `kbapi/` - Kibana API client
* `alerting/` - (Deprecated) Kibana alerting API client
* `connectors/` - (Deprecated) Kibana connectors API client
* `slo/` - (Deprecated) SLO (Service Level Objective) API client
* `internal/` - Internal Go packages
* `acctest/` - Acceptance test utilities
* `clients/` - API client implementations
* `elasticsearch/` - Elasticsearch-specific logic
* `fleet/` - Fleet management functionality
* `kibana/` - Kibana-specific logic
* `models/` - Data models and structures
* `schema/` - Connection schema definitions for plugin framework
* `utils/` - Utility functions
* `versionutils/` - Version handling utilities
* `libs/` - External libraries
* `go-kibana-rest/` - (Deprecated) Kibana REST API client library
* `provider/` - Core Terraform provider implementation
* `scripts/` - Utility scripts for development and CI
* `templates/` - Template files for documentation generation
* `data-sources/` - Data source documentation templates
* `resources/` - Resource documentation templates
* `guides/` - Guide documentation templates
* `xpprovider/` - Additional provider functionality needed for Crossplane

## Releasing (maintainers)

Releasing is implemented in the CI pipeline.

@@ -65,4 +150,4 @@ To release a new provider version:
- updates CHANGELOG.md with the list of changes being released.
[Example](https://github.com/elastic/terraform-provider-elasticstack/commit/be866ebc918184e843dc1dd2f6e2e1b963da386d).

* Once the PR is merged, the release CI pipeline can be started by pushing a new release tag to the `main` branch.
* Once the PR is merged, the release CI pipeline can be started by pushing a new release tag to the `main` branch. (`git tag v0.11.13 && git push origin v0.11.13`)
57 changes: 1 addition & 56 deletions README.md
@@ -76,64 +76,9 @@ provider "elasticstack" {
}
```


## Developing the Provider

If you wish to work on the provider, you'll first need [Go](http://www.golang.org) installed on your machine (see [Requirements](#requirements)).

To compile the provider, run `go install`. This will build the provider and put the provider binary in the `$GOPATH/bin` directory.

To install the provider locally into the `~/.terraform.d/plugins/...` directory, use the `make install` command. This allows referring to the provider directly in the Terraform configuration without downloading it from the registry.

To generate or update documentation, run `make gen`. All the generated docs will have to be committed to the repository as well.

In order to run the full suite of Acceptance tests, run `make testacc`.

If you have [Docker](https://docs.docker.com/get-docker/) installed, you can use following command to start the Elasticsearch container and run Acceptance tests against it:

```sh
$ make docker-testacc
```

To clean up the used containers and to free up the assigned container names, run `make docker-clean`.

Note: there have been some issues encountered when using `tfenv` for local development. It's recommended you move your version management for terraform to `asdf` instead.


### Requirements

- [Terraform](https://www.terraform.io/downloads.html) >= 1.0.0
- [Go](https://golang.org/doc/install) >= 1.19


### Building The Provider

1. Clone the repository
1. Enter the repository directory
1. Build the provider using the `make install` command:
```sh
$ make install
```


### Adding Dependencies

This provider uses [Go modules](https://github.com/golang/go/wiki/Modules).
Please see the Go documentation for the most up to date information about using Go modules.

To add a new dependency `github.com/author/dependency` to your Terraform provider:

```sh
go get github.com/author/dependency
go mod tidy
```

Then commit the changes to `go.mod` and `go.sum`.

### Generating Kibana clients

Kibana clients for some APIs are generated based on Kibana OpenAPI specs.
Please see [Makefile](./Makefile) tasks for more details.
See [CONTRIBUTING.md](CONTRIBUTING.md)

## Support
