
Why does the Scala compiler fail with "object SparkConf in package spark cannot be accessed in package org.apache.spark"?

I cannot access SparkConf, even though I have already added import org.apache.spark.SparkConf. My code is:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD

import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._

object SparkStreaming {
    def main(arg: Array[String]) = {

        val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
        val ssc = new StreamingContext( conf, Seconds(1) )

        val lines = ssc.socketTextStream("localhost", 9999)
        val words = lines.flatMap(_.split(" "))
        val pairs_new = words.map( w => (w, 1) )
        val wordsCount = pairs_new.reduceByKey(_ + _)
        wordsCount.print() 

        ssc.start() // Start the computation
        ssc.awaitTermination() // Wait for the computation to terminate
    }
}
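For context, the failure most likely comes from the line val conf = new SparkConf.setMaster("local[2]"): without parentheses, Scala parses SparkConf.setMaster as a path to a type, so the compiler tries to resolve the companion object org.apache.spark.SparkConf, which is private to the spark package, and reports it as inaccessible. A minimal corrected sketch, assuming the standard Spark Streaming API (only the construction of conf changes):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SparkStreaming {
    def main(args: Array[String]): Unit = {
        // new SparkConf() invokes the constructor first; the builder
        // methods are then chained on the resulting instance.
        val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
        val ssc = new StreamingContext(conf, Seconds(1))

        val lines = ssc.socketTextStream("localhost", 9999)
        val counts = lines.flatMap(_.split(" ")).map(w => (w, 1)).reduceByKey(_ + _)
        counts.print()

        ssc.start()              // Start the computation
        ssc.awaitTermination()   // Wait for the computation to terminate
    }
}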

Tags: scala · sbt · apache-spark

2 votes · 1 answer · 5307 views
