articles/hdinsight/r-server/r-server-hdinsight-manage.md
ms.author: hrasheed
ms.reviewer: jasonh
ms.custom: hdinsightactive
ms.topic: conceptual
ms.date: 11/06/2018
---

# Manage ML Services cluster on Azure HDInsight
Note also that the newly added users do not have root privileges on the Linux system.

## Connect remotely to Microsoft ML Services
You can set up access to the HDInsight Spark compute context from a remote instance of ML Client running on your desktop. To do so, you must specify the options (hdfsShareDir, shareDir, sshUsername, sshHostname, sshSwitches, and sshProfileScript) when defining the RxSpark compute context on your desktop. For example:
```r
myNameNode <- "default"
myPort <- 0
```
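Only the first two assignments of the snippet survive here. A minimal sketch of how the remaining options listed above might be wired into an `RxSpark` compute context, assuming the RevoScaleR package and using placeholder host, user, and path values, is:

```r
# Sketch only: the SSH user, host name, share paths, and profile script
# below are placeholders -- substitute the values for your own cluster.
library(RevoScaleR)

mySshUsername  <- "sshuser"                                # placeholder SSH user
mySshHostname  <- "clustername-ssh.azurehdinsight.net"     # placeholder edge node
myShareDir     <- paste("/var/RevoShare", mySshUsername, sep = "/")
myHdfsShareDir <- paste("/user/RevoShare", mySshUsername, sep = "/")

mySparkContext <- RxSpark(
  nameNode         = myNameNode,        # from the snippet above
  port             = myPort,            # from the snippet above
  hdfsShareDir     = myHdfsShareDir,
  shareDir         = myShareDir,
  sshUsername      = mySshUsername,
  sshHostname      = mySshHostname,
  sshSwitches      = "",                # extra ssh options, if any
  sshProfileScript = "/etc/profile"     # placeholder profile script
)
rxSetComputeContext(mySparkContext)
```

After `rxSetComputeContext` is called, subsequent `rx*` computations are dispatched from the desktop to the cluster over SSH.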
A compute context allows you to control whether computation is performed locally or distributed across the nodes of the HDInsight cluster.
```r
summary(modelSpark)
```
## Distribute R code to multiple nodes
With ML Services on HDInsight, you can take existing R code and run it across multiple nodes in the cluster by using `rxExec`. This function is useful when doing a parameter sweep or simulations. The following code is an example of how to use `rxExec`:
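The code sample referenced here did not survive extraction. A minimal sketch consistent with the description that follows, assuming RevoScaleR's `rxExec` and a hypothetical run count of four, is:

```r
# Run an anonymous function on the cluster's worker nodes; each
# invocation returns the name of the node it executed on.
rxExec(function() { Sys.info()["nodename"] }, timesToRun = 4)
```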
If you are still using the Spark context, this command returns the value of `Sys.info()["nodename"]` for each worker node the code runs on. For example, on a four-node cluster, you expect to receive output similar to the following snippet: