Tags: scala, intellij-idea, sbt, apache-spark
Why am I getting this error? Initially the Scala version in the IDE plugin was 2.12.3, but since I am using Spark 2.2.0, I manually changed it to Scala 2.11.11.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/09/19 12:08:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
at scala.xml.Null$.<init>(Null.scala:23)
at scala.xml.Null$.<clinit>(Null.scala)
at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:39)
at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:38)
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:67)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:84)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:221)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:163)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
at sparkEnvironment$.<init>(Ticket.scala:33)
at sparkEnvironment$.<clinit>(Ticket.scala)
at Ticket$.main(Ticket.scala:39)
at Ticket.main(Ticket.scala)
Answer:
You cannot use the Scala 2.12 series with Spark 2.2.0 (no Spark release at the time was built for Scala 2.12); the NoSuchMethodError above is the typical symptom of this binary incompatibility.
Use a Scala 2.11.x version instead, and make sure the Spark artifact you depend on matches that Scala version, i.e.
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
As you can see in this dependency, the artifact name spark-core_2.11 is tied to Scala version 2.11.
Alternatively you can write libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"; the %% operator makes sbt append the project's scalaVersion suffix automatically, as in the sketch below.
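For reference, here is a minimal build.sbt sketch that keeps the Scala version and the Spark artifact in sync; the project name and version are hypothetical, not from the original post:

name := "spark-ticket"              // hypothetical project name
version := "0.1.0"
scalaVersion := "2.11.11"           // must be a 2.11.x release to match Spark 2.2.0

// Explicit artifact suffix: you must keep _2.11 in sync with scalaVersion yourself.
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"

// Equivalent form: %% appends the Scala binary suffix derived from scalaVersion.
// libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"

After changing the Scala version, reimport the sbt project in IntelliJ so the old 2.12 Scala SDK is no longer on the classpath.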
Hope this makes it clear. Thanks.