Spark Streaming window operations


Here is simple code that gets word counts over a window of 30 seconds with a slide interval of 10 seconds.

import org.apache.spark.SparkConf
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._
import org.apache.spark.api.java.function._
import org.apache.spark.streaming.api._
import org.apache.spark.storage.StorageLevel

val ssc = new StreamingContext(sc, Seconds(5))

// read from text file
val lines0 = ssc.textFileStream("test")
val words0 = lines0.flatMap(_.split(" "))

// read from socket
val lines1 = ssc.socketTextStream("localhost", 9999, StorageLevel.MEMORY_AND_DISK_SER)
val words1 = lines1.flatMap(_.split(" "))

val words = words0.union(words1)
val wordCounts = words.map((_, 1)).reduceByKeyAndWindow(_ + _, Seconds(30), Seconds(10))

wordCounts.print()
ssc.checkpoint(".")
ssc.start()
ssc.awaitTermination()

However, I get an error from this line:

val wordCounts = words.map((_, 1)).reduceByKeyAndWindow(_ + _, Seconds(30), Seconds(10))

Specifically, it comes from _ + _. The error is:

51: error: missing parameter type for expanded function ((x$2, x$3) => x$2.$plus(x$3))

Can anyone tell me what the problem is? Thanks!

Answer (aar*_*man):

This is very easy to fix; just make the parameter types explicit:
val wordCounts = words.map((_, 1)).reduceByKeyAndWindow((a:Int,b:Int)=>a+b, Seconds(30), Seconds(10))

The reason Scala cannot infer the types in this case is explained in this answer.
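In short, reduceByKeyAndWindow has several overloads, and with more than one candidate signature the Scala 2 compiler has no single expected function type against which to expand _ + _, so it reports "missing parameter type". A minimal sketch of the same situation, using a hypothetical overloaded method (no Spark required):

```scala
object OverloadDemo {
  // Two overloads, analogous to reduceByKeyAndWindow's multiple signatures.
  def combine(f: (Int, Int) => Int): Int = f(1, 2)
  def combine(f: (String, String) => String): String = f("a", "b")

  // combine(_ + _)  // does not compile:
  //                 // "missing parameter type for expanded function"

  // Annotating the parameter types selects the Int overload:
  val sum: Int = combine((a: Int, b: Int) => a + b)

  def main(args: Array[String]): Unit = println(sum)
}
```

A named method with declared parameter types (e.g. def add(a: Int, b: Int) = a + b, then passing add _) avoids the problem for the same reason: the function's type is already fully known.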