By default, the CLI suppresses the verbose output of the underlying Terraform commands. However, this output can be quite useful when analyzing issues. To see it, add the `--verbose` flag to any CLI command; this prints the full output of every `cmd.exec()` execution and suppresses any "UI candy" like spinners.
We provide a configuration for debugging the btptf commands in VS Code. The configuration is available in the `.vscode` directory as `launch.json`.

Here is an example of how to debug the command `btptf create-json`:
- Set a breakpoint in the file `cmd/createJson.go` in the run section of the command:
- Adjust the `launch.json` configuration to consider your environment variable values. The default is an `.env` file in the root folder (= working directory) of the repository:
> [!WARNING]
> The environment values will be displayed as clear text in the debug console. If you are using your password as an environment parameter, it will become visible when you start debugging. We therefore highly recommend using the env-file option.
- Open the debug perspective in the VS Code sidebar:
- Select the configuration `Debug CLI command`:
- Run the selection by pressing the green triangle:
- VS Code will prompt you for the command via the command palette. It defaults to `resource all -s`. Enter the command and the parameters that you want to use for the command execution. In our case we add a subaccount ID and confirm by pressing `Enter`:
- The debugger will start and hit the breakpoint:
Happy debugging!
When updating command descriptions, you must generate the markdown documentation via the make file:

```shell
make docs
```

To enable new resources on the subaccount level, you must execute the following steps:
- Add the corresponding constants for the command parameter and the technical resource name in the `tfutils/tfutils.go` file.
- Add the mapping of the constants into the function `TranslateResourceParamToTechnicalName` in the `tfutils/tfutils.go` file.
- Add the command constant to the slice `AllowedResources` in the `tfutil/tfconfig.go` file.
- Create a new implementation for the import factory in the directory `tfimportprovider`. You can take the file `subaccountRoleCollectionImportProvider.go` as an example of how to structure the file.
- Add the new implementation to the import factory function `GetImportBlockProvider` in the file `tfimportprovider/tfImportProviderFactory.go`.
- Depending on the resource, you must define a transformation of the data from the data source to a string array. Place this logic in the function `transformDataToStringArray` in the `tfutils/tfutils.go` file.
- Depending on your resource, you might also need to add custom formatting logic for the resource address in the Terraform configuration. Place that in the file `output/format.go`. In most cases the function `FormatResourceNameGeneric` is sufficient.
The main domain logic that we must test is located in the factory implementations in the directory `tfimportprovider`. These tests should reflect the real-world setup, so we need to extract test data from subaccounts and store it in the tests. In the following sections, we describe how to best extract this data, namely the JSON string that you need as input for your test.
As a prerequisite, you should have an account with the resource that you want to cover in your test up and running. We will use Terraform to extract the base data.
First, create a Terraform setup that allows you to read the data via a data source. The basic setup could look like this:

`provider.tf`:
```hcl
terraform {
  required_providers {
    btp = {
      source  = "SAP/btp"
      version = "~>1.19.0"
    }
  }
}

provider "btp" {
  globalaccount = "<YOUR GLOBALACCOUNT SUBDOMAIN>"
}
```

Assuming we want to fetch subscriptions, the `main.tf` would look like this:
```hcl
data "btp_subaccount_subscriptions" "all" {
  subaccount_id = "<YOUR SUBACCOUNT ID>"
}

output "all" {
  value = data.btp_subaccount_subscriptions.all
}
```
Next, we execute a plan and store the plan file:

```shell
terraform plan -out plan.out
```

You now have two options:

- If you want to create a JSON string with all the resources contained in the `plan.out`, execute the script `guidelines/scripts/transform_all.sh`, which needs to be located at the same level as the `plan.out` file.
- If you want to adjust the result, you must execute the following steps:
  - Generate the JSON file via `terraform show -json plan.out | jq .planned_values.outputs.all.value > restrictedplan.json`.
  - Adjust the JSON file, e.g., remove some entries.
  - Execute the script `guidelines/scripts/transform_json.sh`, which needs to be located at the same level as the `restrictedplan.json` file.
With that, you get a file that contains the JSON string that you can use as input for your tests of the import block creation functions.

An example of how to create a test case is given by the unit test implemented in `tfimportprovider/subaccountSubscriptionImportProvider_test.go`.

GitHub Copilot can be quite useful to set up the basics for the test, but some rework is needed.
We use the custom templating option available in the Cobra framework to construct the output in the console. The override of the default templating flow is triggered in the commands via the functions `SetHelpTemplate` and `SetUsageTemplate`.

In general, we call the `generateCmdHelp` function to generate the output that will be displayed in the console. The `generateCmdHelp` function receives the command as well as a structure of the type `generateCmdHelpOptions`.

If it receives an empty structure, it will call several default functions to create the console help. However, you have the option to override a single section by providing a custom function that crafts the string used in the console help.

You can find an example of this setup in the command `exportByResourceCmd`. Be aware that the code leverages several helper functions that are available in the file `cmdDocsHelper.go`.
- Create a file `devcontainer.env` in the `.devcontainer` directory.
- Add the environment variables in the file. Here is an example:

  ```shell
  BTP_USERNAME='<MY SAP BTP USERNAME>'
  BTP_PASSWORD='<MY SAP BTP PASSWORD>'
  BTP_GLOBALACCOUNT='<MY SAP BTP GLOBAL ACCOUNT SUBDOMAIN>' #optional
  ```

- Start the devcontainer option `Terraform exporter for SAP BTP - Development (with env file)`. The environment variables defined in the `devcontainer.env` file will be automatically injected.
Alternative via
.envfile (available on MacOS and Linux only): -
Create a file
.envin the root of the project -
Add the environment variables in the file. Here is an example:
BTP_USERNAME='<MY SAP BTP USERNAME>' BTP_PASSWORD='<MY SAP BTP PASSWORD>' BTP_GLOBALACCOUNT='<MY SAP BTP GLOBAL ACCOUNT SUBDOMAIN>'
-
Execute the following command in a terminal:
export $(xargs <.env)
!!! info There is no predefined functionality in PowerShell to achieve the same. A custom script is needed.