docs/side_quests/nf-test.md: 35 additions & 10 deletions
@@ -406,7 +406,9 @@ Success! The pipeline runs successfully and the test passes. Now we have begun t
 
 ## 1.4. Test the output
 
-Let's add an assertion to our test to check the output file was created. We'll also update the test name again to reflect that we're now checking both process execution and output files.
+Let's add an assertion to our test to check the output file was created.
+
+We can also add it as a separate test, with an informative name.
 
 **Before:**
 
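A rough sketch of the pair of pipeline-level tests this step is building toward, matching the test names that appear in the output below (the pipeline name, script path, and published output path are assumptions for illustration, not the tutorial's exact code):

```groovy
// tests/main.nf.test (illustrative location)
nextflow_pipeline {

    name "Test greetings pipeline"   // assumed name
    script "main.nf"

    test("Should run without failures") {
        then {
            // the pipeline itself completed without errors
            assert workflow.success
        }
    }

    test("Should produce correct output files") {
        then {
            assert workflow.success
            // the expected result file was written; the path is an assumed
            // publish location, not necessarily the tutorial's exact one
            assert new File("${launchDir}/results/output.txt").exists()
        }
    }
}
```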
@@ -441,6 +443,19 @@ Let's add an assertion to our test to check the output file was created. We'll a
-Test [1d4aaf12] 'Should run successfully with correct processes and output files' PASSED (1.591s)
+Test [f0e08a68] 'Should run without failures' PASSED (8.144s)
+Test [d7e32a32] 'Should produce correct output files' PASSED (6.994s)
 
 
-SUCCESS: Executed 1 tests in 1.612s
+SUCCESS: Executed 2 tests in 15.165s
 ```
 
-Success! The test passes because the pipeline completed successfully, the correct number of processes ran and the output files were created.
+Success! The tests pass because the pipeline completed successfully, the correct number of processes ran, and the output files were created. This should also show you how useful it is to provide those informative names for your tests.
 
 This is just the surface; we can keep writing assertions to check the details of the pipeline, but for now let's move on to testing the internals of the pipeline.
 
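The "correct number of processes ran" check mentioned above is typically expressed against the workflow trace. A minimal sketch of what such an assertion might look like inside one of the pipeline tests (the expected count of 3 is an assumption about this particular pipeline, not a figure from the tutorial):

```groovy
test("Should run without failures") {
    then {
        assert workflow.success
        // every expected task ran and succeeded; the count is illustrative
        assert workflow.trace.tasks().size() == 3
        assert workflow.trace.succeeded().size() == 3
    }
}
```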
@@ -1014,11 +1030,12 @@ Test Process convertToUpper
 
 Test Workflow main.nf
 
-Test [1d4aaf12] 'Should run successfully with correct processes and output files' PASSED (1.652s)
+Test [f0e08a68] 'Should run without failures' PASSED (8.144s)
+Test [d7e32a32] 'Should produce correct output files' PASSED (6.994s)
 
 Test Process sayHello
 
-Test [f91a1bcd] 'Should run without failures and produce correct output' PASSED (1.664s)
+Test [f91a1bcd] 'Should run without failures and contain expected greeting' PASSED (1.664s)
 
 
 SUCCESS: Executed 3 tests in 5.007s
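The renamed sayHello test points at checking output content directly rather than only checking that the process ran. Here is a hedged sketch of a process-level test combining the two approaches covered in this side quest, a snapshot and a direct content assertion (the process location, input value, and expected string are assumptions based on the test names above, not the tutorial's exact code):

```groovy
nextflow_process {

    name "Test Process sayHello"
    script "main.nf"          // assumed location of the process
    process "sayHello"

    test("Should run without failures and contain expected greeting") {

        when {
            process {
                """
                input[0] = 'Hello World!'   // illustrative input value
                """
            }
        }

        then {
            assert process.success
            // approach 1: snapshot the entire output channel
            assert snapshot(process.out).match()
            // approach 2: assert directly on the content of the first output file
            assert path(process.out[0][0]).text.contains('Hello World!')
        }
    }
}
```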
@@ -1031,17 +1048,24 @@ Check that out! We ran 3 tests, 1 for each process and 1 for the whole pipeline
 In this side quest, we've learned:
 
 1. How to initialize nf-test in a Nextflow project
-2. How to write and run pipeline-level tests
+2. How to write and run pipeline-level tests:
+   - Basic success testing
+   - Process count verification
+   - Output file existence checks
 3. How to write and run process-level tests
-4. How to use snapshots to verify process outputs
-5. How to run all tests in a repository with a single command
+4. Two approaches to output validation:
+   - Using snapshots for complete output verification
+   - Using direct content assertions for specific content checks
+5. Best practices for test naming and organization
+6. How to run all tests in a repository with a single command
 
 Testing is a critical part of pipeline development that helps ensure:
 
 - Your code works as expected
 - Changes don't break existing functionality
 - Other developers can contribute with confidence
 - Problems can be identified and fixed quickly
+- Output content matches expectations
 
 ### What's next?
 
@@ -1051,5 +1075,6 @@ Check out the [nf-test documentation](https://www.nf-test.com/) for more advance
 - Write tests for edge cases and error conditions
 - Set up continuous integration to run tests automatically
 - Learn about other types of tests like workflow and module tests
+- Explore more advanced content validation techniques
 
-Remember: Tests are living documentation of how your code should behave. The more tests you write, the more confident you can be in your pipeline's reliability.
+Remember: Tests are living documentation of how your code should behave. The more tests you write, and the more specific your assertions are, the more confident you can be in your pipeline's reliability.