Spark NullPointerException

Aks*_*mar 1 nullpointerexception apache-spark

My Spark code looks like this:

val logData = sc.textFile("hdfs://localhost:9000/home/akshat/recipes/recipes/simplyrecipes/*/*/*/*")

def doSomething(line: String): (Long, Long) = {
  val numAs = logData.filter(line => line.contains("a")).count();
  val numBs = logData.filter(line => line.contains("b")).count();
  return (numAs, numBs)
}

val mapper = logData.map(doSomething _)

val save = mapper.saveAsTextFile("hdfs://localhost:9000/home/akshat/output3")

mapper has type org.apache.spark.rdd.RDD[(Long, Long)] = MappedRDD. When I try to perform the saveAsTextFile action, it throws a java.lang.NullPointerException.

What am I doing wrong, and what should I change to fix this exception?
Thanks in advance!

Dav*_*fin 5

You should not reference logData from inside doSomething; that is the problem. I can't tell exactly what you are trying to do, but if all you want is to count the lines containing "a" and the lines containing "b", you don't need a def at all, just do this:

val numAs = logData.filter(line => line.contains("a")).count();
val numBs = logData.filter(line => line.contains("b")).count();
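Locally, the same contains-based predicates behave identically on an ordinary Scala collection, which makes the totals easy to verify before running against the cluster (the sample lines below are made up for illustration):

```scala
// Stand-in for logData: a plain Seq supports the same filter/count pattern
val sampleLines = Seq("abc", "banana", "xyz")
val numAs = sampleLines.count(line => line.contains("a"))  // lines containing "a"
val numBs = sampleLines.count(line => line.contains("b"))  // lines containing "b"
```

On an RDD, count() is an action that triggers a job; on a Seq it just walks the collection, but the predicate logic is the same.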

On the other hand, if you want to count the "a"s and "b"s within each line, and write out one result line per input line, then try this:

def doSomething(line: String): (Int, Int) = {
  // Compare against Char literals: ch.equals("a") compares a Char to a String
  // and is always false, so the original counts would always be zero
  val numAs = line.count(ch => ch == 'a')
  val numBs = line.count(ch => ch == 'b')
  (numAs, numBs)
}
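With this per-line version, the map over logData from the question lines up correctly. Since String supports the same count method locally, the function can be sanity-checked without a cluster (the sample strings below are made up):

```scala
def doSomething(line: String): (Int, Int) = {
  // Count individual characters; note Char literals, not the String "a"
  val numAs = line.count(ch => ch == 'a')
  val numBs = line.count(ch => ch == 'b')
  (numAs, numBs)
}

// Local check on ordinary strings
val results = Seq("abba", "spark", "xyz").map(doSomething)

// On the cluster, the same function plugs into the original pipeline:
// logData.map(doSomething).saveAsTextFile("hdfs://localhost:9000/home/akshat/output3")
```

Because doSomething only touches its String argument, the closure Spark ships to executors no longer captures the RDD, which removes the NullPointerException.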