I am using the Apache Spark Java API and the MongoDB Spark Connector in Play Framework 2.8.7:
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "3.0.1",
"org.apache.spark" %% "spark-sql" % "3.0.1",
"org.apache.spark" %% "spark-mllib" % "3.0.1",
"org.apache.spark" %% "spark-streaming" % "3.0.1" % "provided",
"org.mongodb.spark" %% "mongo-spark-connector" % "3.0.1" exclude ("org.mongodb", "mongo-java-driver"),
)
After updating mongo-spark-connector from version 3.0.0 to version 3.0.1, I get a "local class incompatible" error:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 10.0 failed 4 times, most recent failure: Lost task 0.3 in stage 10.0 (TID 26, 172.18.0.2, executor 1): java.io.InvalidClassException: com.mongodb.spark.MongoSpark$; local class incompatible: stream classdesc serialVersionUID = -148646310337786170, local class serialVersionUID = -3005450305892693805
This error normally happens when a class that implements Serializable doesn't declare an explicit serialVersionUID, so the JVM derives one from the class structure and different builds of the class end up with different IDs.
But the error doesn't occur with Spark Connector version 3.0.0. I'd appreciate any hints or help.
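For reference, the serialVersionUID mechanism behind the exception can be sketched in plain Java. This is a minimal, self-contained illustration (the `Payload` class is hypothetical, not part of the connector):

```java
import java.io.*;

// Minimal sketch of why mismatched serialVersionUIDs break deserialization.
// When a Serializable class omits serialVersionUID, the JVM derives one from
// the class structure, so two builds of the "same" class can disagree.
public class SerialVersionDemo {

    // Declaring serialVersionUID explicitly pins the stream identity of the
    // class, so a recompile (or a new library release) stays compatible.
    static class Payload implements Serializable {
        private static final long serialVersionUID = 1L;
        final String value;
        Payload(String value) { this.value = value; }
    }

    public static void main(String[] args) throws Exception {
        // Round-trip a Payload through Java serialization.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new Payload("ok"));
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            Payload read = (Payload) in.readObject();
            System.out.println(read.value);
        }
        // If the reading side loaded a Payload class whose serialVersionUID
        // differed, readObject would throw java.io.InvalidClassException --
        // the same failure the Spark executor reports for MongoSpark$ when
        // driver and executor run different connector jars.
    }
}
```

In the Spark case the writer and reader are the driver and the executor, so the usual cause is that one side still has the 3.0.0 jar on its classpath while the other uses 3.0.1.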