 - [Package Jar - Send to Server](#package-jar---send-to-server)
 - [Ad-hoc Mode - Single, Unrelated Jobs (Transient Context)](#ad-hoc-mode---single-unrelated-jobs-transient-context)
-- [Persistent Context Mode - Faster & Required for Related Jobs](#persistent-context-mode---faster-&-required-for-related-jobs)
+- [Persistent Context Mode - Faster & Required for Related Jobs](#persistent-context-mode---faster--required-for-related-jobs)
+- [Debug mode](#debug-mode)
 - [Create a Job Server Project](#create-a-job-server-project)
 - [Creating a project from scratch using giter8 template](#creating-a-project-from-scratch-using-giter8-template)
 - [Creating a project manually assuming that you already have sbt project structure](#creating-a-project-manually-assuming-that-you-already-have-sbt-project-structure)
@@ -40,7 +41,8 @@ Also see [Chinese docs / 中文](doc/chinese/job-server.md).
 - [Chef](#chef)
 - [Architecture](#architecture)
 - [API](#api)
-  - [Jars](#jars)
+  - [Binaries](#binaries)
+  - [Jars (deprecated)](#jars-deprecated)
   - [Contexts](#contexts)
   - [Jobs](#jobs)
   - [Data](#data)
@@ -242,6 +244,44 @@ Now let's run the job in the context and get the results back right away:
 Note the addition of `context=` and `sync=true`.
+
+### Debug mode
+
+Spark Job Server is started using SBT Revolver (which forks a new JVM), so debugging directly in an IDE is not feasible.
+To enable debugging, start Spark Job Server from the SBT shell with the following Java options:
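The original command block did not survive extraction. The standard way to open a JVM remote-debug port is the JDWP agent flag; the sketch below is an assumption, derived only from the surrounding text (port 15000, JVM suspended until a client attaches), and the `JAVA_OPTS` variable name is likewise an assumption about how SBT Revolver's forked JVM picks up options:

```shell
# Hypothetical reconstruction: standard JDWP remote-debug flag.
# server=y  -> the JVM listens for a debugger client
# suspend=y -> the JVM does not start until a client attaches
# address=15000 -> the exposed debugging port
export JAVA_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=15000"
```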
+The above command starts a remote debugging server on port 15000. Spark Job Server is not started until a debugging client
+(IntelliJ, Eclipse, telnet, ...) connects to the exposed port.
+
+In your IDE, create a Remote debugging run configuration that uses the port defined above. Once the client connects to the debugging server, Spark Job Server starts, and you can add breakpoints and debug requests.
+
+Note that you might need to adjust some server parameters to avoid hitting short Spray/Akka/Spark timeouts; in your `dev.conf`, add the following values:
+
+```bash
+spark {
+  jobserver {
+    # Dev debug timeouts
+    context-creation-timeout = 1000000 s
+    yarn-context-creation-timeout = 1000000 s
+    default-sync-timeout = 1000000 s
+  }
+
+  context-settings {
+    # Dev debug timeout
+    context-init-timeout = 1000000 s
+  }
+}
+
+spray.can.server {
+  # Debug timeouts
+  idle-timeout = infinite
+  request-timeout = infinite
+}
+```
+
+Additionally, you might have to increase the Akka timeouts by adding the query parameter `timeout=1000000` to your HTTP requests:
+
+```bash
+curl -d "input.string = a b c a b see" "localhost:8090/jobs?appName=test&classPath=spark.jobserver.WordCountExample&sync=true&timeout=1000000"
+```
+
## Create a Job Server Project
### Creating a project from scratch using giter8 template