docs/deferrable_functions_reference.rst (+1, -1)

@@ -58,7 +58,7 @@ Currently ReFrame provides three broad categories of deferrable functions:
 They include, but are not limited to, functions to iterate over regex matches in a file, extracting and converting values from regex matches, computing statistical information on series of data etc.
docs/tutorial_basics.rst (+34, -30)

@@ -429,12 +429,13 @@ For running the benchmark, we need to set the OpenMP number of threads and pin t
 You can set environment variables in a ReFrame test through the :attr:`~reframe.core.pipeline.RegressionTest.env_vars` dictionary.

 What makes a ReFrame test a performance test is the definition of at least one :ref:`performance function<deferrable-performance-functions>`.
-Similarly to a test's :func:`@sanity_function<reframe.core.pipeline.RegressionMixin.sanity_function>`, a performance function is a member function decorated with the :attr:`@performance_function<reframe.core.pipeline.RegressionMixin.performance_function>` decorator, which binds the decorated function to a given unit.
-These functions can be used by the regression test to extract, measure or compute a given quantity of interest; where in this context, the values returned by a performance function are referred to as performance variables.
-Alternatively, performance functions can also be thought as `tools` available to the regression test for extracting performance variables.
-By default, ReFrame will attempt to execute all the available performance functions during the test's ``performance`` stage, producing a single performance variable out of each of the available performance functions.
-These default-generated performance variables are defined in the regression test's attribute :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` during class instantiation, and their default name matches the name of their associated performance function.
-However, one could customize the default-generated performance variable's name by passing the ``perf-key`` argument to the :attr:`@performance_function<reframe.core.pipeline.RegressionMixin.performance_function>` decorator of the associated performance function.
+Similarly to a test's :func:`@sanity_function<reframe.core.builtins.sanity_function>`, a performance function is a member function decorated with the :func:`@performance_function<reframe.core.builtins.performance_function>` decorator that merely extracts or computes a performance metric from the test's output and associates it with a unit.
+By default, every performance function defined in the test is assigned to a *performance variable* with the function's name.
+A performance variable is a named quantity representing a performance metric that ReFrame will report on, log, and optionally check against a reference value.
+The performance variables of a test are stored in the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary.
+The keys are the names of the metrics, whereas the values are :ref:`performance functions <deferrable-performance-functions>`.
+Apart from turning an ordinary method into a "performance function", the :func:`@performance_function<reframe.core.builtins.performance_function>` decorator also creates an entry in the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary.
+The optional ``perf_key`` argument can be used to assign a different name to the newly created performance variable.

 In this example, we extract four performance variables, namely the memory bandwidth values for each of the "Copy", "Scale", "Add" and "Triad" sub-benchmarks of STREAM, where each of the performance functions uses the :func:`~reframe.utility.sanity.extractsingle` utility function.
 For each of the sub-benchmarks we extract the "Best Rate MB/s" column of the output (see below) and convert it to a float.
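For reference, the extraction each of these performance functions performs boils down to a regex capture over the benchmark's output. The following plain-Python sketch mimics what :func:`~reframe.utility.sanity.extractsingle` does on ``self.stdout`` for the "Copy" sub-benchmark; the sample output line is made up for illustration:

```python
import re

# A made-up line in the format STREAM prints for the "Copy" sub-benchmark;
# the first column after the name is the "Best Rate MB/s" value.
stdout = "Copy:           24038.3     0.000671     0.000666     0.000677"

# Rough equivalent of sn.extractsingle(r'Copy:\s+(\S+)', self.stdout, 1, float):
# take the first match and convert capture group 1 to a float.
copy_bw = float(re.search(r'Copy:\s+(\S+)', stdout).group(1))
print(copy_bw)  # 24038.3
```

In the real test the same pattern is evaluated lazily as a deferred expression, but the regex and the conversion are identical.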
@@ -468,9 +469,11 @@ This is especially useful if you run long suites of performance exploration test
 Setting explicitly the test's performance variables
-In the above STREAM example, all four performance functions were almost identical except for a small part of the regex pattern, which led to some code repetition.
-Even though the performance functions were rather simple and the code repetition was not much in that case, this is still not a good practice and it is certainly an approach that would not scale when using more complex performance functions.
-Hence, in this example, we show how to collapse all these four performance functions into a single function and how to reuse this single performance function to create multiple performance variables.
+Users are allowed to manipulate the test's :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary directly.
+This is useful to avoid code repetition or in cases where relying on decorated methods to populate :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` is impractical, e.g., when creating multiple performance variables in a loop.
+
+You might have noticed that in our STREAM example above, all four performance functions are almost identical except for a small part of the regex pattern.
+In the following example, we define a single performance function, :func:`extract_bw`, that can extract any of the requested bandwidth metrics, and we populate :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` ourselves in a pre-performance hook:

 .. code-block:: console
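The loop-based population of ``perf_variables`` described above can be sketched in plain Python without ReFrame. The parametrized extractor and the dictionary comprehension below mirror the tutorial's :func:`extract_bw` function and its pre-performance hook; the sample STREAM output is invented for illustration:

```python
import re
from functools import partial

# Made-up STREAM output; the second column of each row is "Best Rate MB/s".
stdout = """Copy:   24038.3  0.000671  0.000666  0.000677
Scale:  16297.1  0.000990  0.000982  0.001002
Add:    17088.5  0.001413  0.001404  0.001424
Triad:  17045.2  0.001420  0.001408  0.001433"""

def extract_bw(output, kind='Copy'):
    """Plain-Python counterpart of the tutorial's extract_bw function."""
    return float(re.search(rf'{kind}:\s+(\S+)', output).group(1))

# Counterpart of the pre-performance hook: one entry per sub-benchmark,
# created in a loop instead of one decorated method per metric.
perf_variables = {
    bench: partial(extract_bw, kind=bench)
    for bench in ('Copy', 'Scale', 'Add', 'Triad')
}

print(perf_variables['Triad'](stdout))  # 17045.2
```

In an actual test the dictionary values must be deferred performance functions (created with the decorator or :func:`~reframe.utility.sanity.make_performance_function`), not plain callables; only the looping pattern carries over.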
@@ -480,28 +483,29 @@ Hence, in this example, we show how to collapse all these four performance funct
    :start-at: import reframe
    :emphasize-lines: 28-

-As shown in the highlighted lines, this example collapses the four performance functions from the previous example into the :func:`extract_bw` function, which is also decorated with the :attr:`@performance_function<reframe.core.pipeline.RegressionMixin.performance_function>` decorator with the units set to ``'MB/s'``.
-However, the :func:`extract_bw` function now takes the optional argument ``kind`` which selects the STREAM benchmark to extract.
-By default, this argument is set to ``'Copy'`` because functions decorated with :attr:`@performance_function<reframe.core.pipeline.RegressionMixin.performance_function>` are only allowed to have ``self`` as a non-default argument.
-Thus, from this performance function definition, ReFrame will default-generate a single performance variable during the test instantiation under the name ``extract_bw``, where this variable will report the performance results from the ``Copy`` benchmark.
-With no further action from our side, ReFrame would just report the performance of the test based on this default-generated performance variable, but that is not what we are after here.
-Therefore, we must modify these default performance variables so that this version of the STREAM test produces the same results as in the previous example.
-As mentioned before, the performance variables (also the default-generated ones) are stored in the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary, so all we need to do is to redefine this mapping with our desired performance variables as done in the pre-performance pipeline hook :func:`set_perf_variables`.
-
-.. tip::
-   Performance functions may also be generated inline using the :func:`~reframe.utility.sanity.make_performance_function` utility as shown below.
-
-   .. code-block:: python
-
-      @run_before('performance')
-      def set_perf_vars(self):
-          self.perf_variables = {
-              'Copy': sn.make_performance_function(
-                  sn.extractsingle(r'Copy:\s+(\S+)\s+.*',
-                                   self.stdout, 1, float),
-                  'MB/s'
-              )
-          }
+As mentioned in the previous section, the :func:`@performance_function <reframe.core.builtins.performance_function>` decorator performs two tasks:
+
+1. It converts a test method into a *performance function*, i.e., a function that is suitable for extracting a performance metric.
+2. It updates the :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` dictionary with the newly created performance function.
+
+In this example, we are only interested in the first behaviour, which is why we completely redefine the test's :attr:`~reframe.core.pipeline.RegressionTest.perf_variables` using the :func:`extract_bw` performance function.
+If you inherit from a base test and do not want to override its performance variables completely, you could instead call :py:func:`update` on :attr:`~reframe.core.pipeline.RegressionTest.perf_variables`.
+
+Finally, you can convert any arbitrary function or :doc:`deferred expression <deferrable_functions_reference>` into a performance function by calling the :func:`~reframe.utility.sanity.make_performance_function` utility as shown below:
+
+.. code-block:: python
+
+   @run_before('performance')
+   def set_perf_vars(self):
+       self.perf_variables = {
+           'Copy': sn.make_performance_function(
+               sn.extractsingle(r'Copy:\s+(\S+)\s+.*',
+                                self.stdout, 1, float),
+               'MB/s'
+           )
+       }
+
+Note that in this case, the newly created performance function is not assigned to a test's performance variable, so you will have to do this yourself.
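The distinction between redefining ``perf_variables`` and calling :py:func:`update` on it is ordinary dictionary semantics, as this plain-Python sketch shows (the metric names and placeholder values are made up):

```python
# Entries a hypothetical base test might already have created via decorators.
perf_variables = {'copy_bw': '<extractor>', 'scale_bw': '<extractor>'}

# Rebinding the attribute to a new dict drops everything that was inherited:
redefined = {'triad_bw': '<extractor>'}
assert 'copy_bw' not in redefined

# update() keeps the inherited entries and adds (or overrides) selectively:
perf_variables.update({'triad_bw': '<extractor>'})
print(sorted(perf_variables))  # ['copy_bw', 'scale_bw', 'triad_bw']
```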