Commit 7522b5f

Fix RDDOperationScope serialization issues (apache-spark-on-k8s#379)
Follow-up to the Jackson 2.9.5 upgrade, after which RDDOperationScope cannot find a serializer for the Option-typed parent field and falls back to default serialization. This forces Jackson to use static typing, i.e. to serialize using the declared class types rather than types determined via reflection at runtime.
Parent: d5e10da · Commit: 7522b5f

File tree

1 file changed (+2, −0 lines)

core/src/main/scala/org/apache/spark/rdd/RDDOperationScope.scala

Lines changed: 2 additions & 0 deletions

@@ -22,6 +22,7 @@ import java.util.concurrent.atomic.AtomicInteger
 import com.fasterxml.jackson.annotation.{JsonIgnore, JsonInclude, JsonPropertyOrder}
 import com.fasterxml.jackson.annotation.JsonInclude.Include
 import com.fasterxml.jackson.databind.ObjectMapper
+import com.fasterxml.jackson.databind.annotation.JsonSerialize
 import com.fasterxml.jackson.module.scala.DefaultScalaModule
 import com.google.common.base.Objects

@@ -45,6 +46,7 @@ import org.apache.spark.internal.Logging
 @JsonPropertyOrder(Array("id", "name", "parent"))
 private[spark] class RDDOperationScope(
     val name: String,
+    @JsonSerialize(typing = JsonSerialize.Typing.STATIC)
     val parent: Option[RDDOperationScope] = None,
     val id: String = RDDOperationScope.nextScopeId().toString) {
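The change above can be sketched in isolation. The following is a hedged, reduced example (the `Scope` and `Demo` names are hypothetical, not from the commit) assuming jackson-databind and jackson-module-scala are on the classpath; it shows the annotation that tells Jackson to serialize the field from its declared type rather than a runtime-reflected one:

```scala
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.databind.annotation.JsonSerialize
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Reduced stand-in for RDDOperationScope.
class Scope(
    val name: String,
    // Typing.STATIC makes Jackson pick the serializer from the declared
    // type Option[Scope], instead of inspecting the runtime value, which
    // is what broke after the 2.9.5 upgrade.
    @JsonSerialize(typing = JsonSerialize.Typing.STATIC)
    val parent: Option[Scope] = None)

object Demo {
  def main(args: Array[String]): Unit = {
    val mapper = new ObjectMapper().registerModule(DefaultScalaModule)
    // Serializes the scope chain as JSON using the statically-typed field.
    println(mapper.writeValueAsString(new Scope("map", Some(new Scope("root")))))
  }
}
```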

0 commit comments