
Commit 4fbbe6a

Added guide for writing and running tests for OpenSCAP
1 parent 4a156f3 commit 4fbbe6a

File tree: 2 files changed, +331 −0 lines

docs/contribute/contribute.adoc

Lines changed: 12 additions & 0 deletions
@@ -101,6 +101,18 @@ if your changes will be applicable and won't be in a conflict with some work tha
other contributors could have published while you were working on the fix.

== Optional: Write an automated test for your code
There is a good chance that the code you've fixed or added has no test
coverage. We encourage you to write a new test or extend an existing one to
cover the changes or additions you have made to OpenSCAP. This is not
mandatory, so a reviewer will not require you to write a test, but keep in
mind that providing one might uncover unexpected issues with the code. We also
run these tests on every pull request, so adding a test for your fix or new
feature might prevent someone from breaking it in the future. If you decide to
add a test, please see our link:testing.adoc[guide for writing and running
tests].

== Rebase before pull request
If some other contributor pushed some code to the `maint-1.2` branch while you
were working on the fix and if there would be a conflict between your changes

docs/contribute/testing.adoc

Lines changed: 319 additions & 0 deletions
@@ -0,0 +1,319 @@
= Writing and running tests for OpenSCAP

This document should help you with writing and running tests for your pull
requests. It is recommended to add a new test whenever new functionality is
added or code that is not covered by any test is updated. Another
recommendation is to add your test in a separate commit.

All the tests reside in the link:../../tests[tests] directory. It has multiple
subdirectories which represent various parts of the OpenSCAP library and its
utilities. When you contribute to some part of the OpenSCAP project you should
put a test for your contribution into the corresponding subdirectory in the
link:../../tests[tests] directory. Use your best judgement when deciding where
to put the test for your pull request and, if you are not sure, don't be
afraid to ask in the pull request; someone will definitely help you with that.

NOTE: The OpenSCAP project uses **GNU automake**, which has built-in
link:https://www.gnu.org/software/automake/manual/html_node/Tests.html[support
for test suites].

== Preparing test environment and running tests
To run a specific test or all tests you first need to compile the OpenSCAP
library and then install the additional packages required for testing. See the
*Compilation* section in the link:../../README.md[README.md] for more details.

== Writing a new test
In this guide we will use an example to describe the process of writing a test
for the OpenSCAP project. Let's suppose you want to write a new test for
the Script Check Engine (SCE) to test its basic functionality. SCE allows you
to define your own scripts (usually written in Bash or Python) to extend XCCDF
rule checking capabilities. Custom check scripts can be referenced from
an XCCDF rule using the `<check-content-ref>` element, for example:
[[app-listing]]
[subs=+quotes]
----
<check system="http://open-scap.org/page/SCE">
    <check-content-ref href="YOUR_BASH_SCRIPT.sh"/>
</check>
----

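To make the example concrete, here is a minimal sketch of what such a
referenced check script could look like. It relies on the `XCCDF_RESULT_PASS`
and `XCCDF_RESULT_FAIL` exit-code variables that SCE exports into the script's
environment; the check itself is made up for illustration:
[[app-listing]]
[source,bash]
----
#!/usr/bin/env bash
# Hypothetical SCE check script (referenced as YOUR_BASH_SCRIPT.sh above).
# SCE exports the XCCDF_RESULT_* variables into the script's environment.

# Illustrative check: pass when /etc/passwd exists, fail otherwise.
if [ -f /etc/passwd ]; then
    exit "$XCCDF_RESULT_PASS"
else
    exit "$XCCDF_RESULT_FAIL"
fi
----
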
=== Deciding where to put a new test
In our example, we are testing the SCE module, therefore we will look for
its subdirectory in the link:../../tests[tests] directory and we will find it
at the following link: link:../../tests/sce[tests/sce]. We will add our new
test into this subdirectory.

==== Scenario A: There is a suitable directory for my new test
This will happen most of the time. As stated above, in our example we will
place our test into the link:../../tests/sce[tests/sce] directory.

==== Scenario B: There is no suitable directory for my new test
This might happen if your test covers a part of OpenSCAP which has no tests
at all. In this case you need to add a new directory with a suitable name
into the link:../../tests[tests] directory structure and create/update
Makefiles.

To have an example also for this scenario, let's suppose we want to add the
`foo` subdirectory into the link:../../tests[tests] directory. We need to:

. Create the `foo` subdirectory in the link:../../tests[tests] directory.
. Add a new entry for the `foo` subdirectory into the `SUBDIRS` variable in
the link:../../tests/Makefile.am[tests/Makefile.am].
. Create `Makefile.am` inside the `tests/foo` directory and list all
your test scripts inside that directory in the `TESTS` variable (see the
sketch below).

Please see the GNU automake
link:https://www.gnu.org/software/automake/manual/html_node/Tests.html[
documentation about test suites] for more details.

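As an illustration only (the `foo` directory and the `test_foo.sh` script name
are made up for this example), the new `tests/foo/Makefile.am` could start out
as small as this; check a neighbouring `Makefile.am` for any local conventions
your subdirectory should follow:
[[app-listing]]
----
# tests/foo/Makefile.am (hypothetical)
TESTS = test_foo.sh

EXTRA_DIST = test_foo.sh
----
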
=== Common test library
When writing tests for OpenSCAP you should use the common test library which is
located at link:../../tests/test_common.sh.in[tests/test_common.sh.in].

NOTE: The `configure` script will generate the `tests/test_common.sh` file from
the link:../../tests/test_common.sh.in[tests/test_common.sh.in], adding
configuration-specific settings.

You will need to source `tests/test_common.sh` in your test scripts to use
the functions it provides. When sourcing the file, you need to specify
a relative path to it from the directory with your test script. For example,
when your test script is in the `tests/foo` directory:
[[app-listing]]
[source,bash]
----
#!/usr/bin/env bash

set -o pipefail

. ../test_common.sh

test1
test2
...
----

==== Global variables exported by the test library
Always use the `$OSCAP` variable instead of plain `oscap` in your tests when
calling the `oscap` command line tool. This is because tests might be run with
the `CUSTOM_OSCAP` variable set.

You can use `$XMLDIFF` in your tests, which will call the
link:../../tests/xmldiff.pl[tests/xmldiff.pl] script.

It is also possible to do XPath queries using the `$XPATH` variable; the usage
is:
[[app-listing]]
[source,bash]
----
$XPATH FILE 'QUERY'
----

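For example (the file names here are hypothetical), a test might evaluate some
content with `$OSCAP` and then count elements in the result document with
`$XPATH`:
[[app-listing]]
[source,bash]
----
# Hypothetical usage; content.xml and result.xml are made-up names.
$OSCAP xccdf eval --results result.xml content.xml

# Count the <rule-result> elements in the result document.
$XPATH result.xml 'count(//rule-result)'
----
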
==== Best practices when writing bash tests
It is always good to set the `pipefail` option in all your test scripts so you
don't accidentally lose non-zero exit statuses of commands in a pipeline:
[[app-listing]]
[source,bash]
----
set -o pipefail
----

The next option you can consider is `errexit`, which exits your script
immediately if a command exits with a non-zero status. You might want to use
this option if tests are somehow dependent and it doesn't make sense to
continue testing after one test fails.
[[app-listing]]
[source,bash]
----
set -e
----

==== test_init function
The function expects the name of a test log file. All tests which are run
after `test_init` using the `test_run` function will report results into this
log file:
[[app-listing]]
[source,bash]
----
test_init test_foo.log
----

NOTE: The specified log file will contain the `stdout` and `stderr` output
of every test executed by the `test_run` function.

==== test_run function
The function is responsible for executing a test script file or a function and
logging its result into the log file created by the `test_init` function.
[[app-listing]]
[source,bash]
----
test_run "DESCRIPTION" TEST_FUNCTION|$srcdir/TEST_SCRIPT_FILE ARG [ARG...]
----
The `$srcdir` variable contains the path to the directory with the test script.
The `test_run` function reports the following results into the log file:

* *PASS* when the script/function returns *0*,
* *FAIL* when the script/function returns *1*,
* *SKIP* when the script/function returns *255*,
* *WARN* when the script/function returns none of the above exit statuses.

The result of every test executed by the `test_run` function is reported
in the log file in the following way:
[[app-listing]]
[source,bash]
----
TEST: DESCRIPTION
<test stdout + stderr output>
RESULT: PASS/FAIL/SKIP/WARN
----

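Putting this together, a test script can define a small function and hand it
to `test_run`; the function name and the check inside it are purely
illustrative:
[[app-listing]]
[source,bash]
----
# Hypothetical test function; returns 0 (PASS) or 1 (FAIL).
function test_oscap_prints_version {
    # $OSCAP is provided by the common test library.
    $OSCAP --version | grep -q "OpenSCAP" || return 1
    return 0
}

test_run "oscap prints its version" test_oscap_prints_version
----
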
==== test_exit function
NOTE: This function should be called after the `test_init` function.

The function is responsible for cleaning up the testing environment. You can
call it without arguments or with one argument -- a script/function which will
do additional clean-up tasks.
[[app-listing]]
[source,bash]
----
test_exit [CLEAN_SCRIPT|CLEAN_FUNCTION]
----

==== require function
Checks if a required program is in the `$PATH`. Use it as follows:
[[app-listing]]
[source,bash]
----
require 'program' || return 255
----

==== probecheck function
Checks if an OVAL probe exists. Use it as follows:
[[app-listing]]
[source,bash]
----
probecheck 'probe' || return 255
----

==== verify_results function
Verifies that there are `COUNT` results of the selected OVAL `TYPE` in
a `RESULTS_FILE`:
[[app-listing]]
[source,bash]
----
verify_results TYPE CONTENT_FILE RESULTS_FILE COUNT
----

NOTE: This function expects that the OVAL `TYPE` is numbered from `1` to `COUNT`
in the `RESULTS_FILE`.

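For example, a call like the following would check that definitions `1`
through `3` all have results (assuming `def` is the type selector for OVAL
definitions, as existing tests appear to use it; the file names are made up):
[[app-listing]]
[source,bash]
----
# Hypothetical usage of verify_results.
verify_results "def" content.xml results.xml 3 || return 1
----
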
==== assert_exists function
Runs an XPath query against the file specified in the `$result` variable and
checks whether the number of matching nodes equals the expected number
specified as an argument:
[[app-listing]]
[source,bash]
----
result="relative_path_to_file"
assert_exists EXPECTED_NUMBER_OF_RESULTS XPATH_QUERY_STRING
----

For example, let's say you want to check that in the `results.xml` file the
result of the rule `xccdf_com.example.www_rule_test` is `fail`:
[[app-listing]]
[source,bash]
----
result="./results.xml"
my_rule="xccdf_com.example.www_rule_test"
assert_exists 1 "//rule-result[@idref=\"$my_rule\"]/result[text()=\"fail\"]"
----

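A complete skeleton tying these helpers together might look like the
following; the log file name, function name, and checks are placeholders:
[[app-listing]]
[source,bash]
----
#!/usr/bin/env bash
set -o pipefail

. ../test_common.sh

function test_foo_basic {
    require "grep" || return 255                      # SKIP if grep is missing
    $OSCAP --version | grep -q "OpenSCAP" || return 1 # FAIL on unexpected output
    return 0                                          # PASS
}

test_init test_foo.log

test_run "basic foo functionality" test_foo_basic

test_exit
----
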
=== Adding test files
Now that we know where a new test should go and what functions and capabilities
are provided by the common test library, we can add the test files which will
contain the test scripts and the content required for testing.

To sum up, we are adding a test to check the basic functionality of the Script
Check Engine (SCE) and we have decided that the test will go into the
link:../../tests/sce[tests/sce] directory.

We will add the link:../../tests/sce/test_sce.sh[tests/sce/test_sce.sh]
script which will contain our test, and
link:../../tests/sce/sce_xccdf.xml[tests/sce/sce_xccdf.xml], an XML file with
XCCDF rules which reference various check scripts (grep for the
`check-content-ref` element to see the referenced files). All the referenced
check script files are set to always pass. The
link:../../tests/sce/test_sce.sh[tests/sce/test_sce.sh] script will evaluate
the link:../../tests/sce/sce_xccdf.xml[tests/sce/sce_xccdf.xml] XCCDF document
and check that all rule results are `pass`.

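The core of such a test could look roughly like this sketch (the exact
contents of the real link:../../tests/sce/test_sce.sh[test_sce.sh] may
differ):
[[app-listing]]
[source,bash]
----
function test_sce {
    # assert_exists reads the file name from the $result variable.
    result="results.xml"

    # Evaluate the XCCDF document; the SCE checks run during evaluation.
    $OSCAP xccdf eval --results "$result" "$srcdir/sce_xccdf.xml" || return 1

    # No rule result may be anything other than "pass".
    assert_exists 0 '//rule-result/result[text()!="pass"]'
}
----
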
=== Plugging your new test into the test library
You need to plug your test into the test library so it will be run
automatically every time `make check` is run. To do this, you need to add all
the test files into the `Makefile.am`. The `Makefile.am` which you need to
modify is located in the same directory as your test files.

We will demonstrate this on our example with the SCE test. We have prepared our
test script, the XML document file with custom rules, and various check scripts
for testing. We placed all our test files into the
link:../../tests/sce[tests/sce] directory. Now we will modify the
link:../../tests/sce/Makefile.am[tests/sce/Makefile.am]: we will add all our
test files into the `EXTRA_DIST` variable and our test script into the `TESTS`
variable, which will make sure that our test is executed by `make check`:
[[app-listing]]
[subs=+quotes]
----
...

TESTS = *test_sce.sh* \
        ...

...

EXTRA_DIST = *test_sce.sh \
             sce_xccdf.xml \
             bash_passer.sh \
             python_passer.py \
             python_is16.py \
             lua_passer.lua \*
...
----

=== Running your new test
To run your new test you first need to compile the OpenSCAP library. See the
*Compilation* section in the link:../../README.md[README.md] for more details.
You don't need to run all the tests, only the tests in the directory where you
added your new test. To do so, first change into the directory with your test
and then run `make check` from there, for example:
[[app-listing]]
[source,bash]
----
$ cd tests/sce
$ make check
----

The log file with your test results can be found under the name which is set
by the `test_init` function in your test script. In our example,
in the link:../../tests/sce/test_sce.sh[tests/sce/test_sce.sh] test script,
the log file is set to `test_sce.log`.

NOTE: Our Jenkins runs `make distcheck` instead of `make check`, so please make
sure that your test works in both of these modes. More details about the GNU
Build System build tree and source tree can be found in the
link:https://www.gnu.org/software/automake/manual/html_node/Checking-the-Distribution.html[GNU automake documentation].
