Post by use*_*570

Difference between the Maven artifacts spark-core_2.10 and spark-core_2.11

Does anyone know the difference between the two spark-core artifacts? Both support Spark up to version 1.5.1. However, when I use spark-core_2.11 in my Java application and try to connect to a standalone cluster (Spark 1.5.1) via the Spark URL (spark://masterIP:Port), I always end up with the same error:

WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkMaster@192.168.188.20:7077] has failed, address is now gated for [5000] ms. Reason: [Disassociated] 
ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[appclient-registration-retry-thread,5,main]

The master also logs a compatibility issue:

ERROR Remoting: org.apache.spark.deploy.DeployMessages$RegisterApplication; local class incompatible: stream classdesc serialVersionUID = 352674063933172066, local class serialVersionUID = -5495080032843259921
java.io.InvalidClassException: org.apache.spark.deploy.DeployMessages$RegisterApplication; local class incompatible: stream classdesc serialVersionUID = 352674063933172066, local class serialVersionUID = -5495080032843259921
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:621)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1623)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
    at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
    at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
    at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104) …
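For context, the suffix on the artifact name is the Scala binary version the library was compiled against (2.10 vs. 2.11), and the driver must use the same Scala line as the one the standalone cluster was built with; the `InvalidClassException`/`serialVersionUID` mismatch above is the typical symptom when they differ. A minimal sketch of the dependency declaration, assuming the cluster is the default prebuilt Spark 1.5.1 download (which is built against Scala 2.10):

```xml
<!-- Sketch: pin spark-core to the Scala 2.10 line so the driver matches a
     cluster built against Scala 2.10 (the default for prebuilt Spark 1.5.1).
     If the cluster was built for Scala 2.11, use spark-core_2.11 instead. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.5.1</version>
</dependency>
```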
