Commit 8a9eb05

HeartSaVioR authored and Marcelo Vanzin committed
[SPARK-26606][CORE] Handle driver options properly when submitting to standalone cluster mode via legacy Client
## What changes were proposed in this pull request?

This patch fixes an issue where `ClientEndpoint` in a standalone cluster does not recognize driver options that are passed via `SparkConf` instead of system properties. When `Client` is executed via the CLI, these options must be provided as system properties, but with `spark-submit` they can be provided through `SparkConf`. (`SparkSubmit` calls `ClientApp.start` with a `SparkConf` that may contain these options.)

## How was this patch tested?

Manually tested via the following steps:

1. Set up a standalone cluster (launch master and worker via `./sbin/start-all.sh`).
2. Submit one of the example apps in standalone cluster mode:

```
./bin/spark-submit --class org.apache.spark.examples.SparkPi --master "spark://localhost:7077" --conf "spark.driver.extraJavaOptions=-Dfoo=BAR" --deploy-mode "cluster" --num-executors 1 --driver-memory 512m --executor-memory 512m --executor-cores 1 examples/jars/spark-examples*.jar 10
```

3. Check whether `foo=BAR` is present in the system properties shown in the Spark UI.

<img width="877" alt="Screen Shot 2019-03-21 at 8 18 04 AM" src="https://user-images.githubusercontent.com/1317309/54728501-97db1700-4bc1-11e9-89da-078445c71e9b.png">

Closes apache#24163 from HeartSaVioR/SPARK-26606.

Authored-by: Jungtaek Lim (HeartSaVioR) <[email protected]>
Signed-off-by: Marcelo Vanzin <[email protected]>
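The core of the fix is a lookup that consults system properties first and falls back to `SparkConf`. A minimal sketch of that resolution order, using a plain `Map[String, String]` as a hypothetical stand-in for `SparkConf` (the object and key names below are illustrative, not from the patch):

```scala
// Sketch of the property-resolution precedence the patch introduces:
// system properties win; a SparkConf-style map is the fallback.
object PropertyResolution {
  // Stand-in for SparkConf.getOption: look the key up in a plain map.
  def confOption(conf: Map[String, String], key: String): Option[String] =
    conf.get(key)

  // Mirrors the patch's getProperty: sys.props first, then the conf.
  def getProperty(key: String, conf: Map[String, String]): Option[String] =
    sys.props.get(key).orElse(confOption(conf, key))
}
```

With this ordering, an option supplied only via `--conf` (and thus only present in `SparkConf`) is still found, which is exactly the case the legacy `Client` previously missed.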
1 parent 34e3cc7 commit 8a9eb05

File tree

1 file changed: +8 −3 lines


core/src/main/scala/org/apache/spark/deploy/Client.scala

Lines changed: 8 additions & 3 deletions
```diff
@@ -61,6 +61,10 @@ private class ClientEndpoint(
   private val lostMasters = new HashSet[RpcAddress]
   private var activeMasterEndpoint: RpcEndpointRef = null

+  private def getProperty(key: String, conf: SparkConf): Option[String] = {
+    sys.props.get(key).orElse(conf.getOption(key))
+  }
+
   override def onStart(): Unit = {
     driverArgs.cmd match {
       case "launch" =>
@@ -70,18 +74,19 @@ private class ClientEndpoint(
         val mainClass = "org.apache.spark.deploy.worker.DriverWrapper"

         val classPathConf = config.DRIVER_CLASS_PATH.key
-        val classPathEntries = sys.props.get(classPathConf).toSeq.flatMap { cp =>
+        val classPathEntries = getProperty(classPathConf, conf).toSeq.flatMap { cp =>
           cp.split(java.io.File.pathSeparator)
         }

         val libraryPathConf = config.DRIVER_LIBRARY_PATH.key
-        val libraryPathEntries = sys.props.get(libraryPathConf).toSeq.flatMap { cp =>
+        val libraryPathEntries = getProperty(libraryPathConf, conf).toSeq.flatMap { cp =>
           cp.split(java.io.File.pathSeparator)
         }

         val extraJavaOptsConf = config.DRIVER_JAVA_OPTIONS.key
-        val extraJavaOpts = sys.props.get(extraJavaOptsConf)
+        val extraJavaOpts = getProperty(extraJavaOptsConf, conf)
           .map(Utils.splitCommandString).getOrElse(Seq.empty)
+
         val sparkJavaOpts = Utils.sparkJavaOpts(conf)
         val javaOpts = sparkJavaOpts ++ extraJavaOpts
         val command = new Command(mainClass,
```
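The `.toSeq.flatMap { cp => cp.split(...) }` pattern the diff preserves expands an optional path string into individual entries using the platform path separator, collapsing to an empty sequence when the option is absent. A small self-contained sketch (the object and file names are illustrative):

```scala
// Sketch of the class-path expansion pattern kept by the patch:
// Option[String] -> Seq of entries split on java.io.File.pathSeparator,
// with None yielding an empty Seq (no special-casing needed).
object ClassPathSplit {
  def entries(classPath: Option[String]): Seq[String] =
    classPath.toSeq.flatMap(_.split(java.io.File.pathSeparator).toSeq)
}
```

Because the new `getProperty` returns the same `Option[String]` shape as `sys.props.get`, the surrounding splitting logic needed no changes.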

0 commit comments
