User Tests: Basic API Level Tests
==============================================
This directory contains basic API-level test applications for the embARC MLI Library. They check
that all functions available at the API level work in the way defined by the documentation,
using basic error-detecting techniques such as quality-metric thresholds and a cyclic redundancy check (CRC32).

# Directory Structure

 - `/user_tests/make` - contains application-specific GNU make rules and settings.
 - `/user_tests/test_components` - contains sources of various modules which are shared across tests.
 - `/user_tests/tests` - contains subdirectories with sources and test vectors for each test.


# Building and Running

First, configure and build the library project for the desired platform;
see the corresponding section on [building the package](/README.md#building-the-package).
Also take a look at the [user tests specific options](#user-tests-specific-extra-options) that you may want to extend the configuration with.
There are no extra requirements specific to this application, and all the platforms specified there are supported by the test application.

Build artifacts of the application are stored in the `/obj/<project>/user_tests` directory, where `<project>` is defined according to your target platform.

After you've configured and built the whole library project, you can proceed with the following steps.
Replace the `<options>` placeholder in the commands below with the same options list you used for the library configuration and build.
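
For illustration only: assuming the library was built for an EM9D platform using a MetaWare TCF file, the placeholder might expand as shown below. The exact options depend entirely on your earlier library build, so treat this as an example shape, not the canonical option set:

    gmake TCF_FILE=../../hw/em9d.tcf build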

1. Open a command line in the root of the embARC MLI repo and change the working directory to `./user_tests/make/`:

        cd ./user_tests/make/

2. Clean previous build artifacts (optional):

        gmake <options> clean

3. Build the tests. This step is optional, as the next step automatically invokes the build process:

        gmake <options> build

4. Run all tests:

        gmake <options> test_all

If you know the make target of a specific test, you can run it exclusively, skipping all the rest.
To get the list of all available test targets, use the following command:

    gmake get_tests_list

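For example, assuming the list contains a target named `test_mli_krn_conv2d` (a hypothetical name used here purely for illustration; take real names from the `get_tests_list` output), you can run only that test group with:

    gmake <options> test_mli_krn_conv2d
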

## User Tests Specific Extra Options

There is only a `TEST_DEBUG` pre-processor define, which can be passed with external C flags.
To use it, you need to extend your initial [library build command](/README.md#general-build-process) with `EXT_CFLAGS="-DTEST_DEBUG"`:

    gmake <target> <options> EXT_CFLAGS="-DTEST_DEBUG"

This define unblocks application-specific assertions which may help in advanced debugging.
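
As a minimal sketch of the pattern such guarded assertions typically follow (the actual macro names and behavior in the test sources may differ; this is an assumption for illustration, not the tests' real code):

    #include <cstdio>
    #include <cstdlib>

    // Hypothetical debug-only assertion: active only when the tests are
    // built with EXT_CFLAGS="-DTEST_DEBUG", compiled out otherwise.
    #if defined(TEST_DEBUG)
    #define TEST_ASSERT(cond)                                              \
        do {                                                               \
            if (!(cond)) {                                                 \
                std::fprintf(stderr, "Assertion failed: %s (%s:%d)\n",     \
                             #cond, __FILE__, __LINE__);                   \
                std::abort();                                              \
            }                                                              \
        } while (0)
    #else
    #define TEST_ASSERT(cond) ((void)0)
    #endif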


## Expected Output

The test procedure runs one or multiple test groups (depending on the make target used) and generates a report table for each.
Each report table corresponds to a specific test group, which is typically run by a separate executable file.
For more information on the table content, see the interface description of the reporter modules in the [`test_report.h` header file](/user_tests/test_components/test_report.h).
You can also find more info on the quality metrics in the [`test_quality_metrics.h` header file](/user_tests/test_components/test_quality_metrics.h).

Content of the report table may differ depending on the platform and build options.
In general, the following evidence indicates that tests passed successfully:

 - `Summary Status: PASSED` in the last line of the test report.
 - `PASSED` status in each line of the `Result` column of the test report table.
 - A `SKIPPED` status together with a descriptive message for a test is also an acceptable output.
 - `0x<CRC32 sum> (OK)` in each line of the `CRC32 (Status)` column of the test report table. Some tests have only a `Result` status, and the whole column might be omitted.

If you see `FAILED` in the `Result` column, `0x<CRC32 sum> (DIFF)` in the `CRC32 (Status)` column,
or `Summary Status: FAILED` in the last line of the table, then something is wrong and the tests have failed.

If you use the `test_all` make target, the first failed test group will halt the whole test procedure.
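
To make these criteria concrete, the tail of a passing report has roughly the following shape (purely illustrative; real tables contain more columns, such as the quality metrics, and their exact layout differs per platform and test group):

    <test case name> | ... | PASSED | 0x<CRC32 sum> (OK)
    ...
    Summary Status: PASSED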

# Test Steps of a User Test Application

For most of the tests, input data and expected output data are stored as float values (`*.inc` files inside a specific test directory).
The test application (a `*.cc` file inside the same test directory) typically does the following (see the sketch after this list):

1. Transforms the source data to the target data format of the specific test (quantizes the data).
2. Applies the test target function to the transformed input operands.
3. Transforms the kernel output data back to float values (dequantizes) and compares it against the reference float data using various quality metrics.
4. Calculates a CRC32 over all input and output operands and compares it with a hardcoded value to check that results are bit-exact with the expected ones.
5. Forms a report table.
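
The following self-contained C++ sketch mirrors that flow with invented helper code: a trivial ReLU stands in for an MLI kernel, a single max-absolute-error threshold stands in for the richer metrics of `test_quality_metrics.h`, and the CRC32 is printed rather than compared against a stored reference. None of this is the tests' actual code.

    #include <cmath>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Running CRC-32 (reflected, poly 0xEDB88320); caller applies the final XOR.
    static uint32_t crc32_update(uint32_t crc, const uint8_t* data, size_t len) {
        for (size_t i = 0; i < len; ++i) {
            crc ^= data[i];
            for (int b = 0; b < 8; ++b)
                crc = (crc >> 1) ^ (0xEDB88320u & (0u - (crc & 1u)));
        }
        return crc;
    }

    int main() {
        // Float input and expected output, as they might come from a *.inc file.
        const std::vector<float> input_fl = {-1.5f, -0.25f, 0.0f, 0.75f, 1.25f};
        const std::vector<float> ref_fl   = { 0.0f,  0.0f,  0.0f, 0.75f, 1.25f};
        const float scale = 1.25f / 127;  // invented symmetric int8 scale

        // 1. Quantize the source data to the target format (int8 here).
        std::vector<int8_t> in_q(input_fl.size());
        for (size_t i = 0; i < input_fl.size(); ++i) {
            float q = std::round(input_fl[i] / scale);
            in_q[i] = (int8_t)std::fmin(std::fmax(q, -128.0f), 127.0f);
        }

        // 2. Apply the function under test (trivial ReLU as a stand-in kernel).
        std::vector<int8_t> out_q(in_q.size());
        for (size_t i = 0; i < in_q.size(); ++i)
            out_q[i] = in_q[i] > 0 ? in_q[i] : 0;

        // 3. Dequantize the output and compare with the float reference.
        float max_err = 0.0f;
        for (size_t i = 0; i < out_q.size(); ++i)
            max_err = std::fmax(max_err, std::fabs(out_q[i] * scale - ref_fl[i]));
        const bool metrics_ok = max_err <= scale;  // allow one quantization step

        // 4. CRC32 over all input and output operands (real tests compare it
        //    with a hardcoded reference to confirm bit-exact results).
        uint32_t crc = crc32_update(0xFFFFFFFFu, (const uint8_t*)in_q.data(), in_q.size());
        crc = crc32_update(crc, (const uint8_t*)out_q.data(), out_q.size()) ^ 0xFFFFFFFFu;

        // 5. Form a (one-line) report.
        std::printf("relu_int8 | max_err=%.5f | %s | 0x%08X\n",
                    max_err, metrics_ok ? "PASSED" : "FAILED", (unsigned)crc);
        return metrics_ok ? 0 : 1;
    }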