Why does Spark with Play fail with "NoClassDefFoundError: Could not initialize class org.apache.spark.SparkConf$"?

Fel*_*ipe 2 scala playframework apache-spark

I'm trying to use this project (https://github.com/alexmasselot/spark-play-activator) as an example of integrating Play and Spark so that I can do the same in my own project. I created an object that starts Spark and a Controller that reads a JSON file through an RDD. Below is my object that starts Spark (a sketch of the controller side follows the snippet):

package bootstrap

import org.apache.spark.sql.SparkSession

object SparkCommons {
  // Single shared SparkSession for the whole Play application
  val sparkSession = SparkSession
    .builder
    .master("local")
    .appName("ApplicationController")
    .getOrCreate()
}
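For context, here is a minimal sketch of the kind of Play controller that would call SparkCommons, assuming Play 2.6-style injected controllers; the class, action and route names are hypothetical, and only the read.json call is taken from the question.

package controllers

import javax.inject.{Inject, Singleton}
import play.api.mvc.{AbstractController, ControllerComponents}
import bootstrap.SparkCommons

// Hypothetical controller: class and action names are assumptions;
// only the read.json call mirrors the question.
@Singleton
class TweetController @Inject()(cc: ControllerComponents) extends AbstractController(cc) {

  def count = Action {
    // Read the JSON files into a DataFrame through the shared SparkSession
    val df = SparkCommons.sparkSession.read.json("downloads/tweet-json")
    Ok(s"Loaded ${df.count()} records")
  }
}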

My build.sbt looks like this:

import play.sbt.PlayImport._

name := """crypto-miners-demo"""    
version := "1.0-SNAPSHOT"    
lazy val root = (project in file(".")).enablePlugins(PlayScala)    
scalaVersion := "2.12.4"

libraryDependencies += guice
libraryDependencies += evolutions
libraryDependencies += jdbc
libraryDependencies += filters
libraryDependencies += ws

libraryDependencies += "com.h2database" % "h2" % "1.4.194"
libraryDependencies += "com.typesafe.play" %% "anorm" % "2.5.3"
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.0" % Test

libraryDependencies += "com.typesafe.play" %% "play-slick" % "3.0.0"
libraryDependencies += "com.typesafe.play" %% "play-slick-evolutions" % "3.0.0"
libraryDependencies += "org.xerial" % "sqlite-jdbc" % "3.19.3"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0"

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"

But when I try to call the controller that uses the RDD, I get this error from the Play framework:

java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.SparkConf$

I am using the RDD like this: val rdd = SparkCommons.sparkSession.read.json("downloads/tweet-json"). The application whose configuration I tried to copy runs fine. The only thing I could bring over into my build.sbt was the jackson-databind lib; I get an error as soon as I copy libraryDependencies ++= Dependencies.sparkAkkaHadoop and ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) } into my build.sbt.

Fel*_*ipe 6

I will write it on the blackboard 100,000 times so I never forget it: Spark 2.2.0 still does not work with Scala 2.12. I also changed the Jackson lib version. Below is my build.sbt.

import play.sbt.PlayImport._

name := """crypto-miners-demo"""

version := "1.0-SNAPSHOT"

lazy val root = (project in file(".")).enablePlugins(PlayScala)

scalaVersion := "2.11.8"

libraryDependencies += guice
libraryDependencies += evolutions
libraryDependencies += jdbc
libraryDependencies += filters
libraryDependencies += ws

libraryDependencies += "com.h2database" % "h2" % "1.4.194"
libraryDependencies += "com.typesafe.play" %% "anorm" % "2.5.3"
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.0" % Test

libraryDependencies += "com.typesafe.play" %% "play-slick" % "3.0.0"
libraryDependencies += "com.typesafe.play" %% "play-slick-evolutions" % "3.0.0"
libraryDependencies += "org.xerial" % "sqlite-jdbc" % "3.19.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5"
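A note on the key change: with % the Scala suffix is hard-coded into the artifact name, while %% lets sbt append the project's Scala binary version, so the Spark artifacts can no longer disagree with the Scala version Play is compiled against. A minimal illustration of how the coordinates resolve:

// With scalaVersion := "2.11.8", %% appends the Scala binary version:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
// resolves to the artifact spark-core_2.11, matching the compiler.

// The original build hard-coded the suffix while compiling with Scala 2.12:
//   "org.apache.spark" % "spark-core_2.11" % "2.2.0"   with   scalaVersion := "2.12.4"
// so Spark classes built for Scala 2.11 were loaded into a 2.12 runtime,
// which surfaces as NoClassDefFoundError for org.apache.spark.SparkConf$.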