Some information can be found in the docs for developing a new provider.
As a prerequisite, in aws-sdk-go-v2 ensure API calls exist to list/describe the desired resource, and make note of:
- to which AWS service the resource belongs
- the schema of the returned object(s)
If the service to which the resource belongs has not been used before in cq-provider-aws, there are a few steps that need to be done to configure it:

- Create the service interface in client/services.go.
  - Don't forget to add the new service interface name to the go:generate comment.
- Add the service to the `Services` struct in client/client.go.
- Init the service in the `initServices` function in client/client.go.
- Run `go generate client/services.go` to create a mock for your new service. This will update client/mocks/mock_<service>.go automatically.
If you get an error about not being able to find `mockgen`, run `make install-tools` to install it. If it still fails, run `export PATH=${PATH}:$(go env GOPATH)/bin` in your shell to set up your `PATH` environment properly.
You might need to update an existing AWS client by running `go get github.com/aws/aws-sdk-go-v2/service/<service-name>@latest` and then `go mod tidy`.
- In client/services.go, update the service interface and add the method(s) that you will be using to fetch the data from the AWS SDK.
- Run `go generate client/services.go` to create a mock for your new methods. This will update client/mocks/mock_<service>.go automatically.
- Create a file under resources/services/<service> that follows the pattern of <resource>.go.
- In that file, create a function that returns a `*schema.Table`.
- In resources/provider.go, add a mapping between the function you just created and the name of the resource that will be used in the config yml file.
- Add a test file at resources/services/<service>/<resource>_mock_test.go. Follow other examples to create a test for the resource.
- Run `go run docs/docs.go` to generate the documentation for the new resource.
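A resource function and its provider.go mapping might look like the sketch below. The `Table`/`Column` types here are minimal stand-ins for the real ones in the cq-provider-sdk schema package, and the `EcrRepositories` resource and the `"ecr.repositories"` config name are hypothetical examples.

```go
package main

import "fmt"

// Minimal stand-ins for the cq-provider-sdk schema types (the real ones
// live in the provider/schema package of the SDK).
type ValueType int

const (
	TypeString ValueType = iota
	TypeJSON
)

type Column struct {
	Name string
	Type ValueType
}

type Table struct {
	Name    string
	Columns []Column
}

// EcrRepositories is a hypothetical <resource>.go function under
// resources/services/ecr that returns the table definition.
func EcrRepositories() *Table {
	return &Table{
		Name: "aws_ecr_repositories",
		Columns: []Column{
			{Name: "repository_name", Type: TypeString},
			{Name: "tags", Type: TypeJSON},
		},
	}
}

// In resources/provider.go, the resource name used in the config yml
// file is mapped to the function you created.
var resourceMap = map[string]*Table{
	"ecr.repositories": EcrRepositories(),
}

func main() {
	fmt.Println(resourceMap["ecr.repositories"].Name) // aws_ecr_repositories
}
```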
Now that the skeleton has been set up, you can start to actually implement the resource. This consists of two parts:
- Defining the schema
- Implementing resolver functions
It is recommended that you look at a few existing resources as examples and also read through the comments on the source code for the Table and Column implementations for details.
For simple fields, the SDK can directly resolve them into Columns for you, so all you need to do is specify the Name and the Type.
For complex fields or fields that require further API calls, you can define your own Resolver for the Column.
A few important things to note when adding functions that call the AWS API:
- If possible, always use an API call that allows you to fetch many resources at once
- Take pagination into account. Ensure you fetch all of the resources
To prepare your environment for running integration tests:
```shell
# Start Postgres in a Docker container:
docker run -p 5432:5432 -e POSTGRES_PASSWORD=pass -d postgres:latest
# Log in with AWS:
aws configure sso
```

Note: You can also use AWS CLI profiles and environment variables. See all options at CloudQuery Hub.
To run an integration test for a specific table:
```shell
go test -run="TestIntegration/RESOURCE_NAME" -tags=integration ./...
# For example:
go test -run="TestIntegration/aws_lambda_functions" -tags=integration ./...
```

Note: You can override the Postgres database URL used for integration tests by specifying a DATABASE_URL environment variable, for example:

```shell
export DATABASE_URL="host=localhost user=postgres password=pass dbname=postgres port=5432"
```

To run all integration tests:

```shell
go test -run=TestIntegration -tags=integration ./...
```

Important: When adding a single resource, it's more common to run only the integration tests for that specific resource. You'll need to ensure your resource has the relevant Terraform service deployed.
Terraform files are organized under the terraform folder, and each service has its own folder.
Under each service folder, we organize files into 3 folders:
- local: When testing locally, run the Terraform CLI from here.
- modules/tests: Terraform resource and module definitions go here.
- prod: This folder is used for our CI testing. See relevant scripts here. Not to be used locally.
Each service has its own Terraform configuration, following best practices. This allows creating a test environment per service, and avoids the slowdowns and memory issues that a single Terraform configuration for all services would cause.
There are a few good rules of thumb to follow when creating new Terraform resources that will serve as testing infrastructure:

- If possible, make all resources private.
- Make sure to replace built-in plain-text passwords with the `random_password` generator.
- For every compute/db resource, try to use the smallest size to keep the cost low.
- If an autoscaling option is present, always turn it off.
If you want to apply the Terraform locally first before pushing it to CI and applying there, use:
```shell
cd terraform/YOUR_SERVICE_NAME/local
terraform init
# Replace AB with your own initials so multiple team members can work on the same account without conflicting resources
terraform apply -var="prefix=AB"
go test -run="TestIntegration/RESOURCE_NAME" -tags=integration ./...
```