Description
Our current approach to registering tests in python presents lots of opportunities for failure. I'm opening this issue to discuss improvements to this system.
Issue
Writing a Python test right now requires the author to correctly follow all of these steps:
- Derive the class containing the test from `KratosMultiphysics.KratosUnittest.TestCase` or `unittest.TestCase`.
- Name test functions in a manner that `TestCase` expects (`test_*` or `*_test`).
- Import the module containing the tests in the application's test runner script.
- Register all tests from the imported module in the correct suite (`small` or `nightly`) in the test runner class.
- Repeat the last two steps for tests meant to execute in MPI as well.
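The manual steps above can be sketched with plain `unittest` (which `KratosUnittest` wraps); the class, test, and suite names below are hypothetical stand-ins for an application's actual runner script:

```python
import unittest

# Step 1: derive the class from unittest.TestCase
# (KratosMultiphysics.KratosUnittest.TestCase in Kratos).
class TestMyFeature(unittest.TestCase):
    # Step 2: name the method test_* so the loader's default prefix matches it.
    def test_feature_works(self):
        self.assertEqual(1 + 1, 2)

# Steps 3-4: in the application's test runner script, the module must be
# imported and its tests explicitly added to the right suite
# ('small' or 'nightly' in Kratos).
small_suite = unittest.TestSuite()
small_suite.addTests(
    unittest.TestLoader().loadTestsFromTestCase(TestMyFeature))

result = unittest.TextTestRunner(verbosity=0).run(small_suite)
```

Skipping step 3 or 4 produces no error: the suite simply runs without the new test, which is exactly the failure mode described above.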
Missing any of the steps above means that, even if the author of a PR writes new tests, they will never be executed. Reviewers may be satisfied seeing that new features appear to be tested, while new bugs and nonsense could in fact be inserted into the repository.
I recommend simplifying the system such that
- writing and registering tests requires fewer steps (ideally just writing/modifying a single script),
- no test can go undetected,
- reviewers can easily tell whether new features are tested,
- unused files in test directories are detected,
- unused functions/objects in test scripts are detected.
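One possible direction (a sketch, not an agreed design) is to let the runner discover test modules automatically instead of relying on manual imports and suite registration, so that a forgotten registration can no longer silently drop a test. Here a throwaway test file stands in for an application's tests directory:

```python
import os
import tempfile
import textwrap
import unittest

# Create a stand-in tests directory containing one test script.
tests_dir = tempfile.mkdtemp()
with open(os.path.join(tests_dir, "test_example.py"), "w") as f:
    f.write(textwrap.dedent("""
        import unittest

        class TestExample(unittest.TestCase):
            def test_discovered(self):
                self.assertTrue(True)
    """))

# discover() imports every module matching the pattern and collects its tests;
# no per-module import or suite registration is needed in the runner.
suite = unittest.TestLoader().discover(start_dir=tests_dir, pattern="test_*.py")
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

A discovery-based runner would still need a convention (e.g. a decorator or naming scheme) to sort tests into `small` and `nightly` suites, but the import/registration steps disappear.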
Example
While adapting existing tests with modified configuration, I ran into a couple of test cases that have probably never been run before. For example, take the "sprism_test" from structural mechanics. It has the following bits in its configuration file.
Kratos/applications/StructuralMechanicsApplication/tests/sprism_test/pan_test_parameters.json, lines 46 to 52 in 8bc8bcb:

```
46    "amp": 1.618,
47    "etmxa": 5,
48    "etmna": 0.1
49  },
50  },
51  "processes" : {
52    "constraints_process_list" : [
```
Note the trailing comma in line 49, which makes this a malformed JSON file that nlohmann::json (our JSON parser behind Parameters) rejects. Since CI is not currently failing, this file is never parsed, and thus the test is never run.
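The rejection is easy to reproduce: strict JSON parsers (Python's `json` module behaves like nlohmann::json here) refuse a trailing comma outright. The fragment below is a condensed, hypothetical stand-in for the snippet above:

```python
import json

# A comma before the closing brace, as in line 49 of the test's
# configuration file, is not valid JSON.
malformed = '{ "amp": 1.618, "etmxa": 5, "etmna": 0.1, }'

try:
    json.loads(malformed)
    parsed = True
except json.JSONDecodeError as error:
    parsed = False
    print(f"rejected: {error.msg}")
```

Had the file ever been fed to `Parameters`, the test would have failed on this parse error long before reaching any assertion.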