Following the Spark Quick Start guide, I ran this simple piece of code written in Java:
import java.util.Arrays;
import java.util.List;
import org.apache.spark.Accumulator;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SimpleApp {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("Simple Application").setMaster("local[4]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        Accumulator<Integer> counter = sc.accumulator(0);
        List<Integer> data = Arrays.asList(1, 2, 3, 4, 5);
        JavaRDD<Integer> rdd = sc.parallelize(data);
        rdd.foreach(counter::add);
        System.out.println("Counter value " + counter);
    }
}
它"Counter value 15"按预期打印.我有用Scala写的相同逻辑的代码:
import org.apache.spark.{SparkConf, SparkContext}

object Counter extends App {
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local[4]")
    val sc = new SparkContext(conf)
    val counter = sc.accumulator(0)
    val data = Array(1, 2, 3, 4, 5)
    val rdd = sc.parallelize(data)
    rdd.foreach(x => counter += x)
    println(s"Counter value: $counter")
}
But every time it prints an incorrect result (less than 15). What is wrong with my Scala code?

Java Spark dependency: "org.apache.spark:spark-core_2.10:1.6.1"
Scala Spark dependency: "org.apache.spark" %% "spark-core" % "1.6.1"
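For reference, a minimal build.sbt sketch that declares the Scala dependency quoted above (sbt and the 2.10.6 Scala version are assumptions here, chosen so that %% resolves to the same spark-core_2.10 artifact used on the Java side):

name := "simple-application"    // project name is arbitrary
scalaVersion := "2.10.6"        // assumption: matches the _2.10 artifact
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"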
The advice in the Quick Start documentation says:
Note that applications should define a main() method instead of extending scala.App. Subclasses of scala.App may not work correctly.
Maybe that is the problem?
Try:
import org.apache.spark.{SparkConf, SparkContext}

object Counter {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("Simple Application").setMaster("local[4]")
      val sc = new SparkContext(conf)
      val counter = sc.accumulator(0)
      val data = Array(1, 2, 3, 4, 5)
      val rdd = sc.parallelize(data)
      rdd.foreach(x => counter += x)
      println(s"Counter value: $counter")
    }
}
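
Not part of the original answer, just a hedged aside on why the App version misbehaves: scala.App is implemented with DelayedInit, so vals written directly in the object body are compiled to fields whose assignment is deferred until main() runs, and the statements in the body execute as part of main() rather than in a normal constructor. That interacts badly with Spark serializing the foreach closure together with its enclosing object, which is plausibly why some or all of the accumulator updates go missing. A minimal, Spark-free sketch of the deferred initialization (the object and field names below are made up for illustration):

object AppPitfall extends App {
  val value = 42  // becomes a field; its assignment is deferred until main() runs
}

object ShowPitfall {
  def main(args: Array[String]): Unit = {
    // The App body has not run yet, so the field still holds its default (0).
    println(AppPitfall.value)      // prints 0
    AppPitfall.main(Array.empty)   // executes the delayed initialization body
    println(AppPitfall.value)      // prints 42
  }
}

Moving the Spark code into an explicit main() method, as shown above, sidesteps this entirely because locals inside main() are initialized in the ordinary way before the closure is created.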