
SparkContext not initialized after sbt run

My build.sbt file is as follows:

name := "hello"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "1.6.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
)
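
If SparkContext fails to initialize when the program is launched via sbt run, one commonly suggested remedy (an assumption here, not something stated in the question) is to fork the run task into a separate JVM, since Spark's classloading can clash with sbt's own JVM. A minimal sketch for this build.sbt, using sbt 0.13-era syntax:

fork in run := true                   // hypothetical fix: run the app in its own JVM
javaOptions in run ++= Seq("-Xmx2g")  // optional extra heap for a local Spark run (assumption)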

I also have example.scala at src/main/scala/example.scala:

import org.apache.spark._
import org.apache.spark.SparkContext._

object WordCount {
    def main(args: Array[String]) {
      val conf = new SparkConf().setAppName("wordCount").setMaster("local")
      val sc = new SparkContext(conf)
      val input = sc.textFile("food.txt")                 // read food.txt from the working directory
      val words = input.flatMap(line => line.split(" "))  // split each line into words
      val counts = words.map(word => (word, 1)).reduceByKey { case (x, y) => x + y }  // sum counts per word
      counts.foreach(println)                             // assumption: the truncated original ran some action here
      sc.stop()
    }
}
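
For reference, the map/reduceByKey pair above computes the classic word count. A plain-Scala sketch (no SparkContext required) shows the expected result; the sample lines are an assumed stand-in for food.txt:

object WordCountLocal {
  def main(args: Array[String]): Unit = {
    val lines = Seq("apple banana apple", "banana apple")  // assumed stand-in for food.txt
    val counts = lines
      .flatMap(_.split(" "))   // same as the RDD flatMap
      .groupBy(identity)       // group occurrences of each word
      .mapValues(_.size)       // count them; matches reduceByKey { case (x, y) => x + y }
    counts.foreach(println)    // prints e.g. (banana,2) and (apple,3)
  }
}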

Tags: scala, sbt
