Asked by use*_*470 · 22 · Tags: scala, intellij-idea, apache-spark
I am new to Spark and Scala. I created an IntelliJ Scala project using SBT and added a few lines to build.sbt:
name := "test-one"
version := "1.0"
scalaVersion := "2.11.2"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"
My Scala version is 2.10.4, but the problem also occurs with 2.11.2:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at akka.util.Collections$EmptyImmutableSeq$.<init>(Collections.scala:15)
at akka.util.Collections$EmptyImmutableSeq$.<clinit>(Collections.scala)
at akka.japi.Util$.immutableSeq(JavaAPI.scala:209)
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:153)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
at TweeProcessor$.main(TweeProcessor.scala:10)
at TweeProcessor.main(TweeProcessor.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 23 more
Searching online, most answers point to a mismatch between the API's Scala version and the project's Scala version, but none of them are specific to Spark.
Answered by lmm*_*lmm · 23
spark-core_2.10 is built for the 2.10.x versions of Scala. You should use
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"
which selects the correct _2.10 or _2.11 variant of the artifact for your Scala version automatically.
Also make sure the Scala and Spark versions you compile against match the ones running on your cluster.
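For reference, a complete build.sbt along these lines might look like the sketch below (assuming you stay on Scala 2.10.4 to match Spark 1.1.0's default build; adjust the versions to whatever your cluster actually runs):

name := "test-one"

version := "1.0"

scalaVersion := "2.10.4"

// %% appends the Scala binary version to the artifact name,
// so this resolves to spark-core_2.10 without a hard-coded suffix
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"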
Answered by Bli*_*ieg · 10
Downgrade the Scala version to 2.10.4:
name := "test-one"
version := "1.0"
//scalaVersion := "2.11.2"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"
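As a quick sanity check after the downgrade, you can print the Scala version your program actually runs on and confirm it matches the _2.10 suffix of the spark-core dependency. A minimal diagnostic sketch using the standard library's scala.util.Properties (the object name here is just an example):

// Prints the Scala runtime version, e.g. "version 2.10.4"
object VersionCheck {
  def main(args: Array[String]): Unit = {
    println(scala.util.Properties.versionString)
  }
}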
Views: 27249