
Commit 26e6645

icexelloss authored and HyukjinKwon committed
[SPARK-22655][PYSPARK] Throw exception rather than exit silently in PythonRunner when Spark session is stopped
## What changes were proposed in this pull request?

During Spark shutdown, active tasks sometimes complete with incorrect results. The issue is in PythonRunner, which returns a partial result instead of throwing an exception when the Spark session is being shut down. This patch makes those tasks fail instead of completing with partial results.

## How was this patch tested?

Existing tests.

Author: Li Jin <[email protected]>

Closes #19852 from icexelloss/python-runner-shutdown.
1 parent f28b1a4 · commit 26e6645

File tree

1 file changed: +0 −4 lines changed


core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala

Lines changed: 0 additions & 4 deletions
@@ -317,10 +317,6 @@ private[spark] abstract class BasePythonRunner[IN, OUT](
         logDebug("Exception thrown after task interruption", e)
         throw new TaskKilledException(context.getKillReason().getOrElse("unknown reason"))

-      case e: Exception if env.isStopped =>
-        logDebug("Exception thrown after context is stopped", e)
-        null.asInstanceOf[OUT] // exit silently
-
       case e: Exception if writerThread.exception.isDefined =>
         logError("Python worker exited unexpectedly (crashed)", e)
         logError("This may have been caused by a prior exception:", writerThread.exception.get)
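The removed branch swallowed any exception thrown while the environment was stopped and returned a null result, which the reader's caller treated as a normal (but partial) output. A minimal standalone sketch of the behavior change, using hypothetical names (`Env`, `readBefore`, `readAfter` are illustrations, not Spark's actual classes):

```scala
object ShutdownHandling {
  // Stand-in for SparkEnv's stopped flag.
  final case class Env(isStopped: Boolean)

  // Pre-patch behavior: if the env is stopped, swallow the failure and
  // return a null "result", so the task appears to complete.
  def readBefore[OUT](env: Env)(body: => OUT): OUT =
    try body
    catch {
      case _: Exception if env.isStopped =>
        null.asInstanceOf[OUT] // exit silently
    }

  // Post-patch behavior: no special case for a stopped env; the exception
  // propagates, so the task fails instead of emitting a partial result.
  def readAfter[OUT](env: Env)(body: => OUT): OUT = body
}
```

With the pre-patch handler, a failure during shutdown yields `null`; with the post-patch handler, the same failure surfaces to the caller and the task is marked failed.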
