
Commit 286ec60

Reformat streamingContext docs / config options
1 parent: 335b217

File tree: 3 files changed (+19 / −15 lines)


doc/contexts.md

Lines changed: 7 additions & 5 deletions

```diff
@@ -40,9 +40,11 @@ This can be done easily by extending the `SparkContextFactory` trait, like `SQLC
 
 If you wish to use the `SQLContext` or `HiveContext`, be sure to pull down the job-server-extras package.
 
-# StreamingContext
-job-server-extras provides a context to run Spark Streaming jobs, there are a couple of configurations you can change in job-server's .conf file
-streaming.batch_interval: the streaming batch in millis
-streaming.stopGracefully: if true, stops gracefully by waiting for the processing of all received data to be completed
-streaming.stopSparkContext: if true, stops the SparkContext with the StreamingContext. The underlying SparkContext will be stopped regardless of whether the StreamingContext has been started.
+## StreamingContext
+
+`job-server-extras` provides a context to run Spark Streaming jobs. There are a couple of configurations you can change in job-server's .conf file:
+
+* `streaming.batch_interval`: the streaming batch in millis
+* `streaming.stopGracefully`: if true, stops gracefully by waiting for the processing of all received data to be completed
+* `streaming.stopSparkContext`: if true, stops the SparkContext with the StreamingContext. The underlying SparkContext will be stopped regardless of whether the StreamingContext has been started.
 
```
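Taken together, the documented options and the `application.conf` change in this commit suggest the following shape for a user's `.conf` file. This is a hedged sketch, not verbatim from the shipped defaults: the values shown are the defaults from this commit's `application.conf`, and the exact nesting under the `spark` block is assumed to match that file.

```hocon
# Sketch of job-server's .conf file (assumed layout, mirroring this
# commit's application.conf); HOCON treats `streaming { batch_interval = ... }`
# and the dotted form `streaming.batch_interval = ...` as equivalent.
spark {
  streaming {
    # streaming batch interval in milliseconds
    batch_interval = 1000

    # if true, wait for processing of all received data to complete before stopping
    stopGracefully = true

    # if true, also stop the underlying SparkContext when the StreamingContext stops
    stopSparkContext = true
  }
}
```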

job-server-extras/src/spark.jobserver/StreamingTestJob.scala

Lines changed: 0 additions & 3 deletions

```diff
@@ -7,9 +7,6 @@ import org.apache.spark.streaming.StreamingContext
 
 import scala.collection.mutable
 
-/** :: TestObject ::
-  * A Streaming job for testing, will
-  */
 @VisibleForTesting
 object StreamingTestJob extends SparkStramingJob {
   def validate(ssc: StreamingContext, config: Config): SparkJobValidation = SparkJobValid
```

job-server/src/main/resources/application.conf

Lines changed: 12 additions & 7 deletions

```diff
@@ -54,13 +54,18 @@ spark {
   # Determines the type of jobs that can run in a SparkContext
   context-factory = spark.jobserver.context.DefaultSparkContextFactory
 
-  # Default batch interval for Spark Streaming contexts in milliseconds
-  streaming.batch_interval = 1000
-  # if true, stops gracefully by waiting for the processing of all received data to be completed
-  streaming.stopGracefully = true
-  # if true, stops the SparkContext with the StreamingContext. The underlying SparkContext will be
-  # stopped regardless of whether the StreamingContext has been started.
-  streaming.stopSparkContext = true
+
+  streaming {
+    # Default batch interval for Spark Streaming contexts in milliseconds
+    batch_interval = 1000
+
+    # if true, stops gracefully by waiting for the processing of all received data to be completed
+    stopGracefully = true
+
+    # if true, stops the SparkContext with the StreamingContext. The underlying SparkContext will be
+    # stopped regardless of whether the StreamingContext has been started.
+    stopSparkContext = true
+  }
 
   # uris of jars to be loaded into the classpath for this context. Uris is a string list, or a string separated by commas ','
   # dependent-jar-uris = ["file:///some/path/present/in/each/mesos/slave/somepackage.jar"]
```
