
Commit 35700bb

Hieu Huynh authored and tgravescs committed
[SPARK-24981][CORE] ShutdownHook timeout causes job to fail when succeeded when SparkContext stop() not called by user program
**Description** The issue is described in [SPARK-24981](https://issues.apache.org/jira/browse/SPARK-24981).

**How does this PR fix the issue?** This PR catches the exception thrown while SparkContext.stop() is running (when it is called by the ShutdownHookManager), so the shutdown hook's failure does not mark an otherwise successful job as failed.

**How was this patch tested?** I manually tested it by adding a 60-second delay inside stop(). This makes the ShutdownHookManager interrupt the thread running stop(). The InterruptedException was caught and the job succeeded.

Author: Hieu Huynh <“[email protected]”>
Author: Hieu Tri Huynh <[email protected]>

Closes apache#21936 from hthuynh2/SPARK_24981.
1 parent c1760da commit 35700bb

File tree

1 file changed

+6
-1
lines changed


core/src/main/scala/org/apache/spark/SparkContext.scala

Lines changed: 6 additions & 1 deletion
@@ -571,7 +571,12 @@ class SparkContext(config: SparkConf) extends Logging {
     _shutdownHookRef = ShutdownHookManager.addShutdownHook(
       ShutdownHookManager.SPARK_CONTEXT_SHUTDOWN_PRIORITY) { () =>
       logInfo("Invoking stop() from shutdown hook")
-      stop()
+      try {
+        stop()
+      } catch {
+        case e: Throwable =>
+          logWarning("Ignoring Exception while stopping SparkContext from shutdown hook", e)
+      }
     }
   } catch {
     case NonFatal(e) =>
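The pattern the patch applies can be sketched outside of Spark: run cleanup inside a shutdown hook, but never let an exception (such as an InterruptedException raised when the hook times out) escape the hook and surface as a failure. Below is a minimal, self-contained Java sketch of that pattern (Spark's code is Scala, but the JVM mechanics are the same); the names `stop` and `runHook` are illustrative, not Spark's API.

```java
public class ShutdownHookSketch {

    // Stand-in for SparkContext.stop(): simulates being interrupted
    // when the shutdown-hook manager times out the hook thread,
    // as described in SPARK-24981.
    static void stop() throws InterruptedException {
        throw new InterruptedException("shutdown hook timed out");
    }

    // Stand-in for the shutdown hook body. Mirrors the patch: catch
    // everything thrown by stop(), log/report it, and swallow it so
    // an already-successful job is not reported as failed.
    static String runHook() {
        try {
            stop();
            return "stopped cleanly";
        } catch (Throwable e) {
            return "ignored: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(runHook()); // prints "ignored: shutdown hook timed out"
    }
}
```

Catching `Throwable` here is deliberate: inside a shutdown hook there is no caller left to handle the error, so propagating it only changes the job's reported outcome without making the shutdown any cleaner.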
