Class not found: org.apache.spark.Logging (due to mismatch in spark version?) #134
Description
For a proof of concept, I created a Docker instance of RabbitMQ 3.8.4 and submitted some hello-world-style messages to the queue "hello". I would like to consume those messages in Spark.
Using RabbitMQUtils.createStream, I get the exception below. According to this Stack Overflow question (https://stackoverflow.com/questions/40287289/java-lang-noclassdeffounderror-org-apache-spark-logging), this happens because org.apache.spark.Logging only exists up to Spark 1.5.2 and was removed in later versions. I had expected that, since the com.stratio.receiver:spark-rabbitmq_1.6:0.4.0 package should support Spark 2.0+ and I am using Spark 2.4.4, I would not face this issue.
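As a quick sanity check (a minimal sketch, not part of the original report), the missing class can be looked up by name inside the shell; on a stock Spark 2.x installation this lookup is expected to fail, which matches the trace below:

import scala.util.Try
// org.apache.spark.Logging existed in Spark 1.x but is not on the
// classpath of a Spark 2.4.4 shell, so this should return false.
Try(Class.forName("org.apache.spark.Logging")).isSuccess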
This is the exact error:
java.lang.NoClassDefFoundError: org/apache/spark/Logging
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.streaming.rabbitmq.RabbitMQUtils$.createStream(RabbitMQUtils.scala:61)
... 61 elided
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
This is how I start the Spark shell (Spark 2.4.4 on Scala 2.11.12):
spark-shell --packages com.stratio.receiver:spark-rabbitmq_1.6:0.4.0
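For completeness, the versions can be confirmed from inside the running shell (a minimal sketch; the exact output depends on the installation):

// Print the Spark and Scala versions of the running shell to confirm
// they differ from the receiver's build target (Spark 1.6).
println(s"Spark: ${spark.version}")                        // expected: 2.4.4
println(s"Scala: ${scala.util.Properties.versionString}")  // expected: version 2.11.12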
This is the code entered in the spark-shell:
import org.apache.spark.streaming.rabbitmq.RabbitMQUtils
import org.apache.spark._
import org.apache.spark.streaming._
val ssc = new StreamingContext(sc, Seconds(1))
//RabbitMQ parameters
val host = "localhost"
val port = "5672"
val queueName = "hello"
val receiverStream = RabbitMQUtils.createStream[String](ssc, Map(
  "host" -> host,
  "port" -> port,
  "queueName" -> queueName
))
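For context, the intended follow-up is the standard Spark Streaming boilerplate (a minimal sketch; it is never reached because createStream already throws the error above):

// Print a few received messages per batch and start the streaming job.
receiverStream.print()
ssc.start()
ssc.awaitTermination()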