
# Developer Guide

## Debugging Output

By default the CLI suppresses the verbose output of the various Terraform commands. This output can, however, be quite useful when analyzing issues. To see it, add the `--verbose` flag to any CLI command; this prints the full output of every `cmd.exec()` execution and suppresses any "UI candy" such as spinners.

## Debug the CLI

We provide a configuration for debugging the `btptf` commands in VS Code. The configuration is available in the `.vscode` directory as `launch.json`.

Here is an example of how to debug the command `btptf create-json`:

1. Set a breakpoint in the file `cmd/createJson.go` in the run section of the command.

2. Adjust the `launch.json` configuration to match your environment variable values. The default is an `.env` file in the root folder (the working directory) of the repository.

   > [!WARNING]
   > The environment variable values are displayed as clear text in the debug console. If you use your password as an environment parameter, it becomes visible when you start debugging. We therefore highly recommend using the env-file option.

3. Open the debug perspective in the VS Code side bar.

4. Select the configuration *Debug CLI command*.

5. Run the selection by pressing the green triangle.

6. VS Code prompts you for the command via the command palette; the prompt defaults to `resource all -s`. Enter the command and the parameters that you want to use for the execution. In our case we add a subaccount ID and confirm by pressing Enter.

7. The debugger starts and hits the breakpoint.

Happy debugging!

## Generate Markdown Documentation

When updating command descriptions, you must regenerate the markdown documentation via the Makefile:

```shell
make docs
```

## Adding Support for New Resources on Subaccount Level

To enable new resources on the subaccount level, you must execute the following steps:

1. Add the corresponding constants for the command parameter and the technical resource name in the file `tfutils/tfutils.go`.
2. Add the mapping of the constants to the function `TranslateResourceParamToTechnicalName` in the file `tfutils/tfutils.go`.
3. Add the command constant to the slice `AllowedResources` in the file `tfutil/tfconfig.go`.
4. Create a new implementation for the import factory in the directory `tfimportprovider`. You can take the file `subaccountRoleCollectionImportProvider.go` as an example for the structure of the file.
5. Add the new implementation to the import factory function `GetImportBlockProvider` in the file `tfimportprovider/tfImportProviderFactory.go`.
6. Depending on the resource, you must define a transformation of the data from the data source into a string array. Place this logic in the function `transformDataToStringArray` in the file `tfutils/tfutils.go`.
7. Depending on your resource, you might also need custom formatting logic for the resource address in the Terraform configuration. Place it in the file `output/format.go`. In most cases the function `FormatResourceNameGeneric` is sufficient.

## Adding Unit Tests

The main domain logic that we must test is located in the factory implementations in the directory `tfimportprovider`. These tests should reflect a real-world setup, so we extract the test data from real subaccounts and store it in the tests. The following sections describe how best to extract this data, namely the JSON string that you need as input for your test.

### Prerequisites

As a prerequisite you should have an account with the resource that you want to cover in your test up and running. We will use Terraform to extract the base data.

### Extracting the Data

First, create a Terraform setup that allows you to read the data via a data source. A basic setup could look like this:

- `provider.tf`:

  ```terraform
  terraform {
    required_providers {
      btp = {
        source  = "SAP/btp"
        version = "~>1.19.0"
      }
    }
  }

  provider "btp" {
    globalaccount = "<YOUR GLOBALACCOUNT SUBDOMAIN>"
  }
  ```

- Assuming we want to fetch subscriptions, the `main.tf` would look like this:

  ```terraform
  data "btp_subaccount_subscriptions" "all" {
    subaccount_id = "<YOUR SUBACCOUNT ID>"
  }

  output "all" {
    value = data.btp_subaccount_subscriptions.all
  }
  ```

Next, we execute a plan and store the plan file:

```shell
terraform plan -out plan.out
```

You now have two options:

- If you want to create a JSON string with all the resources contained in `plan.out`, execute the script `guidelines/scripts/transform_all.sh`, which must be located at the same level as the `plan.out` file.
- If you want to adjust the result, execute the following steps:
  1. Generate the JSON file via `terraform show -json plan.out | jq .planned_values.outputs.all.value > restrictedplan.json`.
  2. Adjust the JSON file, e.g., remove some entries.
  3. Execute the script `guidelines/scripts/transform_json.sh`, which must be located at the same level as the `restrictedplan.json` file.

With that you get a file containing the JSON string that you can use as input for your tests of the import block creation functions.

### Creating the Unit Test

An example of how to create a test case is given by the unit test implemented in `tfimportprovider/subaccountSubscriptionImportProvider_test.go`. GitHub Copilot can be quite useful for setting up the basics of the test, but some rework is needed.

## Creating Console Help

We use the custom templating option available in the Cobra framework to construct the output in the console. The override of the default templating flow is triggered in the commands via the functions `SetHelpTemplate` and `SetUsageTemplate`.

In general, we call the `generateCmdHelp` function to generate the output that is displayed in the console. The `generateCmdHelp` function receives the command as well as a structure of the type `generateCmdHelpOptions`.

If the function receives an empty structure, it calls several default functions to create the console help. However, you have the option to override a single section by providing a custom function that crafts the string used in the console help.

You find an example of this setup in the command `exportByResourceCmd`. Be aware that the code leverages several helper functions that are available in the file `cmdDocsHelper.go`.

## Setting Environment Variables in a Development Container

- Create a file `devcontainer.env` in the `.devcontainer` directory.

- Add the environment variables to the file. Here is an example:

  ```shell
  BTP_USERNAME='<MY SAP BTP USERNAME>'
  BTP_PASSWORD='<MY SAP BTP PASSWORD>'
  BTP_GLOBALACCOUNT='<MY SAP BTP GLOBAL ACCOUNT SUBDOMAIN>' # optional
  ```

- Start the dev container option *Terraform exporter for SAP BTP - Development (with env file)*. The environment variables defined in the `devcontainer.env` file are automatically injected.

Alternative via `.env` file (available on macOS and Linux only):

- Create a file `.env` in the root of the project.

- Add the environment variables to the file. Here is an example:

  ```shell
  BTP_USERNAME='<MY SAP BTP USERNAME>'
  BTP_PASSWORD='<MY SAP BTP PASSWORD>'
  BTP_GLOBALACCOUNT='<MY SAP BTP GLOBAL ACCOUNT SUBDOMAIN>'
  ```

- Execute the following command in a terminal:

  ```shell
  export $(xargs <.env)
  ```

> [!NOTE]
> There is no predefined functionality in PowerShell to achieve the same. A custom script is needed.