Commit b1e1744

Make some corrections re context-per-jvm in README
1 parent: 80ca72c


README.md

Lines changed: 4 additions & 3 deletions
@@ -121,7 +121,7 @@ Alternatives:
 * EC2 Deploy scripts - follow the instructions in [EC2](doc/EC2.md) to spin up a Spark cluster with job server and an example application.
 * EMR Deploy instruction - follow the instruction in [EMR](doc/EMR.md)

-NOTE: Spark Job Server can optionally run `SparkContext`s in their own, forked JVM process when the config option `spark.jobserver.context-per-jvm` is set to `true`. In local development mode, this is set to false by default, while the deployment templates have this set to true for production deployment. See [Deployment](#deployment) section for more info.
+NOTE: Spark Job Server can optionally run `SparkContext`s in their own, forked JVM process when the config option `spark.jobserver.context-per-jvm` is set to `true`. This option does not currently work for SBT/local dev mode. See [Deployment](#deployment) section for more info.

 ## Development mode

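For context, `spark.jobserver.context-per-jvm` is a Typesafe Config (HOCON) setting. A minimal sketch of what enabling it might look like in a deployment config file follows; the file layout and surrounding keys here are illustrative assumptions, not part of this commit:

```hocon
# Hypothetical deployment config snippet (assumed layout; not from this commit).
spark {
  jobserver {
    # Fork each SparkContext into its own JVM process launched via spark-submit.
    # Keep this false for SBT/local dev mode, where the option does not currently work.
    context-per-jvm = true
  }
}
```
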
@@ -452,8 +452,9 @@ NOTE: by default the assembly jar from `job-server-extras`, which includes suppo

 ### Context per JVM

-NOTE: Each context can be a separate process launched using spark-submit, via the included `manager_start.sh` script, if `context-per-jvm` is set to true.
-You may want to set `deploy.manager-start-cmd` to the correct path to your start script and customize the script.
+Each context can be a separate process launched using spark-submit, via the included `manager_start.sh` script, if `context-per-jvm` is set to true.
+You may want to set `deploy.manager-start-cmd` to the correct path to your start script and customize the script. This can be especially desirable when you want to run many contexts at once, or for certain types of contexts such as StreamingContexts which really need their own processes.
+
 Also, the extra processes talk to the master HTTP process via random ports using the Akka Cluster gossip protocol. If for some reason the separate processes causes issues, set `spark.jobserver.context-per-jvm` to `false`, which will cause the job server to use a single JVM for all contexts.

 Among the known issues:
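
And a similarly hedged sketch of the settings mentioned in this hunk; the path shown is a placeholder, and the exact nesting of the `deploy` block may differ between deployment templates:

```hocon
# Hypothetical snippet (assumed nesting; adjust to your deployment template).
deploy {
  # Placeholder path: point this at your customized copy of manager_start.sh.
  manager-start-cmd = "/opt/spark-jobserver/manager_start.sh"
}

# Fallback if the separate per-context processes cause issues:
spark.jobserver.context-per-jvm = false
```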
