[SPARK-53176][DEPLOY] Spark launcher should respect --load-spark-defaults
### What changes were proposed in this pull request?
SPARK-48392 introduced `--load-spark-defaults`, but the option is not applied correctly by the Spark launcher process. This mainly affects the driver when Spark runs in local/client mode.
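For context, the semantics introduced by SPARK-48392 are: when `--load-spark-defaults` is passed together with `--properties-file`, properties from the user-supplied file take precedence and `conf/spark-defaults.conf` only fills in anything not set there. A minimal Scala sketch of that merge (the names below are illustrative, not the actual launcher API):
```scala
// Illustrative sketch of the intended precedence; not the real launcher code.
object EffectiveConfSketch {
  def effectiveConf(
      userProps: Map[String, String],      // entries from --properties-file
      sparkDefaults: Map[String, String],  // entries from conf/spark-defaults.conf
      loadSparkDefaults: Boolean): Map[String, String] = {
    // User-supplied properties win; spark-defaults.conf only fills the gaps,
    // and only when --load-spark-defaults is requested.
    if (loadSparkDefaults) sparkDefaults ++ userProps else userProps
  }

  def main(args: Array[String]): Unit = {
    val merged = effectiveConf(
      userProps = Map("spark.master" -> "local[4]"),
      sparkDefaults = Map("spark.driver.memory" -> "4g"),
      loadSparkDefaults = true)
    println(merged) // contains both spark.master=local[4] and spark.driver.memory=4g
  }
}
```
The bug is that the Spark application itself sees this merged config, while the launcher process that builds the driver JVM command line does not, as the example below shows.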
Let's say we have:
```
$ cat > conf/spark-defaults.conf <<EOF
spark.driver.memory=4g
EOF
$ cat > conf/spark-local.conf <<EOF
spark.master=local[4]
EOF
```
```
$ bin/spark-shell --properties-file conf/spark-local.conf --load-spark-defaults
...
scala> spark.sql("SET spark.driver.memory").show()
+-------------------+-----+
| key|value|
+-------------------+-----+
|spark.driver.memory| 4g|
+-------------------+-----+
```
Even though the Spark conf reports that the driver uses 4 GiB of heap memory, checking the Java process shows that the config does not actually take effect; the default 1 GiB is used instead.
```
$ jinfo <spark-submit-pid>
...
VM Arguments:
jvm_args: -Dscala.usejavacp=true -Xmx1g ...
```
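Note: in a standard Spark distribution, setting `SPARK_PRINT_LAUNCH_COMMAND=1` before running `bin/spark-shell` makes `bin/spark-class` print the full launcher-built command, which is another way to inspect the `-Xmx` value the launcher emitted.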
### Why are the changes needed?
Bug fix.
### Does this PR introduce _any_ user-facing change?
Yes, bug fix.
### How was this patch tested?
An existing UT is modified to cover the change, plus manual tests for the case above.
```
$ jinfo <spark-submit-pid>
...
VM Arguments:
jvm_args: -Dscala.usejavacp=true -Xmx4g ...
```
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #51905 from pan3793/SPARK-53176.
Authored-by: Cheng Pan <[email protected]>
Signed-off-by: Liang-Chi Hsieh <[email protected]>