[SPARK-24547][K8S] Allow building Spark-on-K8s Docker images without cache, and also push the spark-py container.
## What changes were proposed in this pull request?
https://issues.apache.org/jira/browse/SPARK-24547
TL;DR from the JIRA issue:
- The first time I generated images for 2.4.0, Docker used its cache, so old jars were still baked into the image when running jobs (see the no-cache build sketch after this list). This produced errors like the following in the executors:
`java.io.InvalidClassException: org.apache.spark.storage.BlockManagerId; local class incompatible: stream classdesc serialVersionUID = 6155820641931972169, local class serialVersionUID = -3720498261147521051`
- The second problem was that the spark container was pushed, but the spark-py container was not; this was simply overlooked in the initial PR (see the push sketch after this list).
- A third problem I ran into, because I was on an older Docker version, is apache#21551; a fix for that is not included in this ticket.
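
For the cache problem, here is a minimal sketch of the intended workflow, assuming the image tool gains a flag that forwards Docker's `--no-cache` option (the `-n` flag and the repo/tag values below are illustrative):

```bash
# Build fresh Spark and spark-py images, bypassing Docker's layer cache
# so jars from an earlier build cannot leak into the new images.
# -r sets the image repository, -t the tag; -n (assumed here) passes
# --no-cache through to `docker build`.
./bin/docker-image-tool.sh -r docker.example.com/myrepo -t v2.4.0 -n build
```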
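
For the push problem, a sketch of the fix in the script's push path; the `image_ref` helper is assumed to resolve `<repo>/<name>:<tag>` from the `-r` and `-t` arguments:

```bash
# Push both images instead of only the JVM one.
function push {
  docker push "$(image_ref spark)"
  docker push "$(image_ref spark-py)"   # previously missing
}
```

This would be invoked as `./bin/docker-image-tool.sh -r docker.example.com/myrepo -t v2.4.0 push` after the build step.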
## How was this patch tested?
I've tested it on my own Spark on k8s deployment.
Author: Ray Burgemeestre <[email protected]>
Closes apache#21555 from rayburgemeestre/SPARK-24547.