Scala Spark version compatibility


I am trying to configure Scala in the IntelliJ IDE.

Scala and Spark versions on my machine:

Welcome to Scala 2.12.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_121).

apache-spark/2.2.1

My sbt build file:

scalaVersion := "2.12.5"
resolvers  += "MavenRepository" at "http://central.maven.org/maven2"

libraryDependencies ++= {
  val sparkVersion = "2.2.1"
    Seq( "org.apache.spark" %% "spark-core" % sparkVersion)
}

The error I get:

Error while importing SBT project:
...
[info] Resolving jline#jline;2.14.5 ...
[error] (*:ssExtractDependencies) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.2.1: not found
[error] unresolved dependency: org.apache.spark#spark-core_2.12;1.4.0: not found
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.2.1: not found
[error] unresolved dependency: org.apache.spark#spark-core_2.12;1.4.0: not found


The spark-core version you defined in your sbt project is not available for download for Scala 2.12. You can check the Maven repository to see which versions are actually published.

As you can see, for spark-core version 2.2.1, the latest published artifact is compiled against Scala 2.11.
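
For context, sbt's %% operator appends the project's Scala binary version to the artifact name, which is why the error message mentions spark-core_2.12. A minimal illustration, using the build settings from the question:

// With scalaVersion := "2.12.5", the %% operator expands the artifact name:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.1"
// is equivalent to hard-coding the Scala binary version suffix:
libraryDependencies += "org.apache.spark" % "spark-core_2.12" % "2.2.1"
// No spark-core_2.12 artifact was ever published for Spark 2.2.1,
// hence the "not found" resolution error.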

So either change your sbt build file to:

scalaVersion := "2.11.8"
resolvers  += "MavenRepository" at "http://central.maven.org/maven2"

libraryDependencies ++= {
  val sparkVersion = "2.2.1"
    Seq( "org.apache.spark" %% "spark-core" % sparkVersion)
}

or define the dependency with the Scala suffix hard-coded:

libraryDependencies ++= {
  val sparkVersion = "2.2.1"
  Seq("org.apache.spark" % "spark-core_2.11" % sparkVersion)
}
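
Note that even with the hard-coded _2.11 suffix, the project's scalaVersion should still be a 2.11.x release, since Scala 2.11 and 2.12 are not binary compatible.

Once the build resolves, a minimal smoke test along these lines (a sketch; the object name VersionCheck is made up) can confirm which versions are actually on the classpath:

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical entry point to verify the Spark and Scala versions in use.
object VersionCheck {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("VersionCheck").setMaster("local[*]")
    val sc = new SparkContext(conf)
    println(s"Spark version: ${sc.version}")                          // expect 2.2.1
    println(s"Scala version: ${scala.util.Properties.versionString}") // expect 2.11.x
    sc.stop()
  }
}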

I hope the answer helps.