Post by Mah*_*esh

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext

I am using IntelliJ IDEA 2016.3.

import sbt.Keys._
import sbt._

object ApplicationBuild extends Build {

  object Versions {
    val spark = "1.6.3"
  }

  val projectName = "example-spark"

  val common = Seq(
    version := "1.0",
    scalaVersion := "2.11.7"
  )

  val customLibraryDependencies = Seq(
    "org.apache.spark" %% "spark-core" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-sql" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-hive" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-streaming" % Versions.spark % "provided",

    "org.apache.spark" %% "spark-streaming-kafka" % Versions.spark
      exclude("log4j", "log4j")
      exclude("org.spark-project.spark", "unused"),

    "com.typesafe.scala-logging" %% "scala-logging" …
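A likely cause, given the build file above: every Spark dependency is marked `"provided"`, which keeps those jars off the runtime classpath. That works for `spark-submit` (the cluster supplies Spark), but running the `main` class directly from IntelliJ or `sbt run` then fails with exactly this `NoClassDefFoundError`. A common workaround for sbt 0.13 (a sketch, not from the original post) is to put the provided dependencies back on the classpath used by `run`, since the Compile classpath still includes them:

```scala
// Inside the same Build.scala: make `sbt run` use the Compile classpath
// (which includes "provided" dependencies) instead of the Runtime one.
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated
```

Alternatively, when launching from IntelliJ IDEA, the run configuration option "Include dependencies with 'Provided' scope" (available in later IDEA versions) achieves the same effect without changing the build.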

noclassdeffounderror apache-spark apache-spark-sql apache-spark-1.6

Score: 2 · 1 answer · 4664 views