NoSuchMethodError when using Spark and IntelliJ

lar*_*ars 4 jvm scala intellij-idea apache-spark

I am new to Scala and Spark, and I am frustrated by how hard it has been to get things working in IntelliJ. Right now I cannot run the code below. I'm sure it's something simple, but I can't get it to work.

I am trying to run:

import org.apache.spark.{SparkConf, SparkContext}

object TestScala {
  def main(args: Array[String]): Unit = {
    // Configure a local Spark application with two worker threads
    val conf = new SparkConf()
    conf.setAppName("Datasets Test")
    conf.setMaster("local[2]")
    // Constructing the SparkContext is where the error below is thrown
    val sc = new SparkContext(conf)
    println(sc)
  }
}

The error I get is:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at org.apache.spark.util.Utils$.getCallSite(Utils.scala:1413)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:77)
at TestScala$.main(TestScala.scala:13)
at TestScala.main(TestScala.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)

My build.sbt file:

name := "sparkBook"

version := "1.0"

scalaVersion := "2.12.1"

You*_*ice 5

Spark 2.0.2 artifacts such as spark-core_2.11 are built against Scala 2.11, and Scala 2.11 and 2.12 are not binary compatible, which is why constructing a SparkContext fails at runtime with a NoSuchMethodError. Change scalaVersion to 2.11.8 and add the Spark dependency to your build.sbt:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"
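
For reference, a complete build.sbt along these lines might look like the following sketch (project name and version carried over from the question). Using %% instead of hard-coding the _2.11 suffix lets sbt append the project's Scala binary version to the artifact name automatically, so the dependency always matches scalaVersion:

name := "sparkBook"

version := "1.0"

scalaVersion := "2.11.8"

// %% resolves to spark-core_2.11, matching the scalaVersion above
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"

After editing build.sbt, refresh the sbt project in IntelliJ so the new Scala version and the Spark jars are actually picked up on the classpath.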