# Dry Run Mode

As your test suites grow bigger and more complex over time, it is essential that the tooling used to create tests provides an easy way to identify and list the tests
that will be processed when the framework is invoked with certain arguments. This listing needs
to be quick and clean in order to enable a fast turnaround time for test development. This requirement brings in the need for a `dry-run` behavior in the `e2e-framework`.

## Unit Of Test

Go treats each function starting with `Testxxx` as the unit of test. However, the same is not entirely true for the `e2e-framework`, which programmatically generates dynamic tests at runtime for each assessment of each feature.

From the perspective of the `e2e-framework`, the unit of test is an `Assessment`, which performs the actual assertion of an
expected behavior or state of the system. These assessments are run as sub-tests of the main test identified by the
`Testxxx` function. All framework-specific behaviors are built around this fundamental test unit of `Assessment`.

## Why not use `-test.list` from `go test`?

The `-test.list` flag is a great way to get dry-run-like behavior. However, it does not extend easily into the core of the `e2e-framework`, which has
framework-specific behaviors such as the `setup` and `teardown` workflows.

In addition, because of how `-test.list` works, it cannot extract information such as the `assessments` within a feature. Together, these limitations bring the need for a framework-specific `dry-run` mode that works well with `-test.list` while providing all
the framework-specific benefits of listing the tests that are going to be processed.

## `--dry-run` mode
`e2e-framework` adds a new CLI flag, `--dry-run`, that can be passed when invoking the tests. It works in conjunction with `-test.list` to provide the following behavior.

1. When invoked in `--dry-run` mode, no setup/teardown workflows are processed.
2. Assessments are displayed as individual tests, exactly as they would be processed without the `--dry-run` mode.
3. All pre/post actions around Before/After Features and Before/After Tests are skipped.

When tests are invoked with the `-test.list` argument, the `--dry-run` mode is enabled automatically so that the setup/teardown workflows as well as the pre/post actions are skipped.
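
Note that framework flags such as `--dry-run` and `--assess` are passed after the `-args` separator of `go test`, so they only take effect when the test environment is built from the parsed CLI flags. The following is a minimal `TestMain` sketch showing that wiring; the package and variable names are illustrative.

```go
package example

import (
	"log"
	"os"
	"testing"

	"sigs.k8s.io/e2e-framework/pkg/env"
	"sigs.k8s.io/e2e-framework/pkg/envconf"
)

var testenv env.Environment

func TestMain(m *testing.M) {
	// envconf.NewFromFlags parses the framework CLI flags (e.g. --dry-run,
	// --assess) that follow the `-args` separator of `go test`.
	cfg, err := envconf.NewFromFlags()
	if err != nil {
		log.Fatalf("failed to parse flags: %v", err)
	}

	testenv = env.NewWithConfig(cfg)

	// Setup/Finish steps registered on testenv would be skipped in --dry-run mode.
	os.Exit(testenv.Run(m))
}
```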

## Example Output with `--dry-run`
```bash
❯ go test . -test.v -args --dry-run
=== RUN   TestPodBringUp
=== RUN   TestPodBringUp/Feature_One
=== RUN   TestPodBringUp/Feature_One/Create_Nginx_Deployment_1
=== RUN   TestPodBringUp/Feature_One/Wait_for_Nginx_Deployment_1_to_be_scaled_up
=== RUN   TestPodBringUp/Feature_Two
=== RUN   TestPodBringUp/Feature_Two/Create_Nginx_Deployment_2
=== RUN   TestPodBringUp/Feature_Two/Wait_for_Nginx_Deployment_2_to_be_scaled_up
--- PASS: TestPodBringUp (0.00s)
    --- PASS: TestPodBringUp/Feature_One (0.00s)
        --- PASS: TestPodBringUp/Feature_One/Create_Nginx_Deployment_1 (0.00s)
        --- PASS: TestPodBringUp/Feature_One/Wait_for_Nginx_Deployment_1_to_be_scaled_up (0.00s)
    --- PASS: TestPodBringUp/Feature_Two (0.00s)
        --- PASS: TestPodBringUp/Feature_Two/Create_Nginx_Deployment_2 (0.00s)
        --- PASS: TestPodBringUp/Feature_Two/Wait_for_Nginx_Deployment_2_to_be_scaled_up (0.00s)
PASS
ok      sigs.k8s.io/e2e-framework/examples/parallel_features 0.353s
```

```bash
❯ go test . -test.v -args --dry-run --assess "Deployment 1"
=== RUN   TestPodBringUp
=== RUN   TestPodBringUp/Feature_One
=== RUN   TestPodBringUp/Feature_One/Create_Nginx_Deployment_1
=== RUN   TestPodBringUp/Feature_One/Wait_for_Nginx_Deployment_1_to_be_scaled_up
=== RUN   TestPodBringUp/Feature_Two
=== RUN   TestPodBringUp/Feature_Two/Create_Nginx_Deployment_2
    env.go:425: Skipping assessment "Create Nginx Deployment 2": name not matched
=== RUN   TestPodBringUp/Feature_Two/Wait_for_Nginx_Deployment_2_to_be_scaled_up
    env.go:425: Skipping assessment "Wait for Nginx Deployment 2 to be scaled up": name not matched
--- PASS: TestPodBringUp (0.00s)
    --- PASS: TestPodBringUp/Feature_One (0.00s)
        --- PASS: TestPodBringUp/Feature_One/Create_Nginx_Deployment_1 (0.00s)
        --- PASS: TestPodBringUp/Feature_One/Wait_for_Nginx_Deployment_1_to_be_scaled_up (0.00s)
    --- PASS: TestPodBringUp/Feature_Two (0.00s)
        --- SKIP: TestPodBringUp/Feature_Two/Create_Nginx_Deployment_2 (0.00s)
        --- SKIP: TestPodBringUp/Feature_Two/Wait_for_Nginx_Deployment_2_to_be_scaled_up (0.00s)
PASS
ok      sigs.k8s.io/e2e-framework/examples/parallel_features 0.945s
```

## Example with `-test.list`
```bash
❯ go test . -test.v -test.list ".*" -args
TestPodBringUp
ok      sigs.k8s.io/e2e-framework/examples/parallel_features 0.645s
```

As you can see from the two examples above, the outputs of the two commands are not the same. Using `--dry-run` gives you a more framework-specific view of how the tests are going to be processed than `-test.list` does.