This repository was archived by the owner on Jan 9, 2020. It is now read-only.

Commit d3deeb3

wangyum authored and srowen committed
[MINOR][DOCS] Improve Running R Tests docs
## What changes were proposed in this pull request?

Update the Running R Tests dependency packages to:

```bash
R -e "install.packages(c('knitr', 'rmarkdown', 'testthat', 'e1071', 'survival'), repos='http://cran.us.r-project.org')"
```

## How was this patch tested?

Manual tests.

Author: Yuming Wang <[email protected]>

Closes apache#18271 from wangyum/building-spark.

(cherry picked from commit 45824fb)

Signed-off-by: Sean Owen <[email protected]>
1 parent 653e6f1 commit d3deeb3

File tree

3 files changed: +7 −10 lines

R/README.md

Lines changed: 1 addition & 5 deletions
Lines changed: 1 addition & 5 deletions

````diff
@@ -66,11 +66,7 @@ To run one of them, use `./bin/spark-submit <filename> <args>`. For example:
 ```bash
 ./bin/spark-submit examples/src/main/r/dataframe.R
 ```
-You can also run the unit tests for SparkR by running. You need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first:
-```bash
-R -e 'install.packages("testthat", repos="http://cran.us.r-project.org")'
-./R/run-tests.sh
-```
+You can run R unit tests by following the instructions under [Running R Tests](http://spark.apache.org/docs/latest/building-spark.html#running-r-tests).
 
 ### Running on YARN
````

R/WINDOWS.md

Lines changed: 1 addition & 2 deletions
````diff
@@ -34,10 +34,9 @@ To run the SparkR unit tests on Windows, the following steps are required —ass
 
 4. Set the environment variable `HADOOP_HOME` to the full path to the newly created `hadoop` directory.
 
-5. Run unit tests for SparkR by running the command below. You need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first:
+5. Run unit tests for SparkR by running the command below. You need to install the needed packages following the instructions under [Running R Tests](http://spark.apache.org/docs/latest/building-spark.html#running-r-tests) first:
 
 ```
-R -e "install.packages('testthat', repos='http://cran.us.r-project.org')"
 .\bin\spark-submit2.cmd --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
 ```
````

docs/building-spark.md

Lines changed: 5 additions & 3 deletions
````diff
@@ -218,9 +218,11 @@ The run-tests script also can be limited to a specific Python version or a speci
 
 ## Running R Tests
 
-To run the SparkR tests you will need to install the R package `testthat`
-(run `install.packages(testthat)` from R shell). You can run just the SparkR tests using
-the command:
+To run the SparkR tests you will need to install the [knitr](https://cran.r-project.org/package=knitr), [rmarkdown](https://cran.r-project.org/package=rmarkdown), [testthat](https://cran.r-project.org/package=testthat), [e1071](https://cran.r-project.org/package=e1071) and [survival](https://cran.r-project.org/package=survival) packages first:
+
+    R -e "install.packages(c('knitr', 'rmarkdown', 'testthat', 'e1071', 'survival'), repos='http://cran.us.r-project.org')"
+
+You can run just the SparkR tests using the command:
 
     ./R/run-tests.sh
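The two documented steps — install the five R packages, then run `./R/run-tests.sh` — can be combined into a small pre-flight check. The sketch below is not part of the commit: the function name `check_sparkr_test_deps` is invented here, and it only reports missing packages rather than installing them, degrading to a notice when R itself is not on the PATH.

```shell
#!/bin/sh
# Pre-flight check for the SparkR test dependencies listed in this commit.
# Reports what is missing; does not install or modify anything.
check_sparkr_test_deps() {
  pkgs="knitr rmarkdown testthat e1071 survival"
  # If R is not installed at all, there is nothing to check yet.
  if ! command -v Rscript >/dev/null 2>&1; then
    echo "R not found; install R and the packages above, then run ./R/run-tests.sh"
    return 0
  fi
  missing=""
  for pkg in $pkgs; do
    # requireNamespace() returns TRUE iff the package is installed;
    # exit status 1 from Rscript marks the package as missing.
    Rscript -e "quit(status = as.integer(!requireNamespace('$pkg', quietly = TRUE)))" \
      || missing="$missing $pkg"
  done
  if [ -n "$missing" ]; then
    echo "missing R packages:$missing"
  else
    echo "all required packages installed; run ./R/run-tests.sh"
  fi
}
check_sparkr_test_deps
```

Run it from the Spark root before `./R/run-tests.sh`; on a clean machine it points you back at the install command above instead of failing partway through the test suite.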