Commit a91a778
Author: Vasileios Karakasis

Minor style enhancements and fixes

Also moved Spark library test under a new folder.

1 parent: 443226f

File tree

3 files changed: +11, -9 lines


cscs-checks/apps/spark/spark_check.py

Lines changed: 3 additions & 2 deletions
@@ -5,10 +5,11 @@
 
 import reframe as rfm
 
-from hpctestlib.apps.spark.compute_pi import compute_pi
+from hpctestlib.data_analytics.spark.compute_pi import compute_pi_check
+
 
 @rfm.simple_test
-class cscs_compute_pi_check(compute_pi):
+class cscs_compute_pi_check(compute_pi_check):
     valid_systems = ['daint:gpu', 'daint:mc', 'dom:gpu', 'dom:mc']
     valid_prog_environs = ['builtin']
     modules = ['Spark']
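The pattern in the file above is that a site-level check simply derives from the library class and fills in the site details (systems, programming environments, modules). A minimal sketch of that pattern under the renamed module path follows; the system and partition names are hypothetical placeholders, not part of this commit.

# Illustrative sketch: deriving a site check from the library test, using
# the import path introduced by this commit. The system names below are
# hypothetical placeholders.
import reframe as rfm

from hpctestlib.data_analytics.spark.compute_pi import compute_pi_check


@rfm.simple_test
class my_site_compute_pi_check(compute_pi_check):
    valid_systems = ['mycluster:compute']    # hypothetical system:partition
    valid_prog_environs = ['builtin']
    modules = ['Spark']                      # module providing Spark on the site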

hpctestlib/apps/spark/compute_pi/__init__.py renamed to hpctestlib/data_analytics/spark/compute_pi/__init__.py

Lines changed: 8 additions & 7 deletions
@@ -13,7 +13,7 @@
 
 
 @rfm.simple_test
-class compute_pi(rfm.RunOnlyRegressionTest, pin_prefix=True):
+class compute_pi_check(rfm.RunOnlyRegressionTest, pin_prefix=True):
     '''Test Apache Spark by computing PI.
 
     Apache Spark is a unified analytics engine for large-scale data
@@ -24,12 +24,13 @@ class compute_pi(rfm.RunOnlyRegressionTest, pin_prefix=True):
     learning, GraphX for graph processing, and Structured Streaming for
     incremental computation and stream processing (see spark.apache.org).
 
-    The present class check that Spark is functioning correctly.
-    To do this, it is necessary to define the tolerance of acceptable
-    deviation. The tolerance is used to check that the computations is
-    executed correctly, by comparing the value of PI calculated to the one
-    obtained from the math library. The default assumption is that Spark is
-    already installed on the system under test.
+    This test checks that Spark is functioning correctly. To do this, it is
+    necessary to define the tolerance of acceptable deviation. The tolerance
+    is used to check that the computations are executed correctly, by
+    comparing the value of pi calculated to the one obtained from the math
+    library. The default assumption is that Spark is already installed on the
+    system under test.
+
     '''
 
     #: Parameter encoding the variant of the test.
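The docstring above describes the verification idea: compare the pi value reported by Spark against math.pi and accept it only if the deviation stays within a tolerance. A minimal plain-Python sketch of that idea follows; it is illustrative only, not the actual sanity check of compute_pi_check, and the output pattern and default tolerance are assumptions.

# Illustrative sketch of the tolerance-based comparison described in the
# docstring. The regular expression and the default tolerance are assumed
# for the example; the real check is implemented in compute_pi_check.
import math
import re


def pi_within_tolerance(spark_output: str, tolerance: float = 0.01) -> bool:
    '''Return True if the pi value reported by Spark deviates from
    math.pi by less than ``tolerance``.'''
    match = re.search(r'Pi is roughly (\S+)', spark_output)
    if match is None:
        return False

    computed_pi = float(match.group(1))
    return abs(computed_pi - math.pi) < tolerance


# The standard Spark Pi example prints a line similar to this one.
assert pi_within_tolerance('Pi is roughly 3.141592', tolerance=0.01)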
