Spark word count in IntelliJ fails due to Scala 2.12.1

use*_*guy 1 scala intellij-idea apache-spark

线程"main"中的异常java.lang.NoClassDefFoundError:org.apache.spark.SparkConf中的scala/Product $ class $ DeprecatedConfig.(SparkConf.scala:762)org.apache.spark.SparkConf $.(SparkConf.scala: 615)org.apache.spark.SparkConf $.(SparkConf.scala)atg.apache.spark.SparkConf.set(SparkConf.scala:84)at org.apache.spark.SparkConf.set(SparkConf.scala:73 )org.apache.spark.SparkConf.setMaster(SparkConf.scala:105)

package bigdata.spark_applications

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object WordCount {
  def main(args: Array[String]): Unit = {
    // Run locally on a single thread
    val conf = new SparkConf().setMaster("local").setAppName("WordCount")
    val sc = new SparkContext(conf)
    // Split each line into words, pair each word with 1, and sum the counts per word
    val data = sc.textFile("C:\\Users\\scala.txt")
    val result = data.flatMap(_.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
    result.collect.foreach(println)
  }
}
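A quick way to confirm which Scala runtime is actually on the classpath (a diagnostic sketch, not part of the original question) is to print the library version before constructing the SparkConf:

// Prints e.g. "version 2.12.1"; Spark 2.1.0 jars expect a 2.11.x runtime
println(scala.util.Properties.versionString)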

use*_*guy 5

Hi, I found the answer.

I was using Scala 2.12.1, but spark-core is not published for 2.12.1, so I switched the project to Scala 2.11.8 and changed the spark-core dependency to the 2.11 build:

版本:="1.0"

scalaVersion:="2.11.8"

libraryDependencies + ="org.apache.spark"%"spark-core_2.11"%"2.1.0"libraryDependencies + ="org.apache.spark"%"spark-sql_2.11"%"2.1.0"
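An alternative worth noting: sbt's %% operator appends the project's Scala binary version to the artifact name for you, so the dependency and scalaVersion cannot drift apart. A minimal sketch of the same dependencies using %%:

// %% expands to spark-core_2.11 because scalaVersion is 2.11.8
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.0"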