zor*_*ork 13 scala apache-spark
Is it possible to flatten a list inside an RDD? For example, to convert:
val xxx: org.apache.spark.rdd.RDD[List[Foo]]
into:
val yyy: org.apache.spark.rdd.RDD[Foo]
How can this be done?
Shy*_*nki 15
val rdd = sc.parallelize(Array(List(1,2,3), List(4,5,6), List(7,8,9), List(10, 11, 12)))
// org.apache.spark.rdd.RDD[List[Int]] = ParallelCollectionRDD ...
val rddi = rdd.flatMap(list => list)
// rddi: org.apache.spark.rdd.RDD[Int] = FlatMappedRDD ...
// which is the same as rdd.flatMap(identity)
// identity is a method defined in Predef object.
// def identity[A](x: A): A
rddi.collect()
// res2: Array[Int] = Array(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
maa*_*asg 13
You just need to flatten it, but since there is no explicit `flatten` method on RDD, you can do:
rdd.flatMap(identity)
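The same trick is easy to sanity-check on plain Scala collections, where the standard library does provide `flatten` (a small sketch, no SparkContext needed; `flatMap(identity)` is what emulates `flatten` on an RDD):

```scala
val nested = List(List(1, 2, 3), List(4, 5, 6))

// On ordinary collections, flatten and flatMap(identity) agree:
val a = nested.flatten           // List(1, 2, 3, 4, 5, 6)
val b = nested.flatMap(identity) // List(1, 2, 3, 4, 5, 6)

println(a == b) // true
```

This works because `flatMap` expects a function returning a collection of elements, and `identity` simply hands each inner `List` back to be concatenated.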