How to get data from a specific partition of a Spark RDD?

Vik*_*eek 6 apache-spark rdd

I want to access the data in a specific partition of a Spark RDD. I can get a reference to a partition like this:

myRDD.partitions(0)

But I want to get the data held in the myRDD.partitions(0) partition. I searched the official org.apache.spark documentation but couldn't find anything.

Thanks in advance.

zer*_*323 9

You can use mapPartitionsWithIndex as follows:

// Create (1, 1), (2, 2), ..., (100, 100) dataset
// and partition by key so we know what to expect
val rdd = sc.parallelize((1 to 100) map (i => (i, i)), 16)
  .partitionBy(new org.apache.spark.HashPartitioner(8))

val zeroth = rdd
  // Keep the data only when the partition index is 0, otherwise emit nothing
  .mapPartitionsWithIndex((idx, iter) => if (idx == 0) iter else Iterator())

// Check if we get the expected results: keys 8, 16, ..., 96
// (all multiples of 8, and exactly 12 of them)
assert(zeroth.keys.map(_ % 8 == 0).reduce(_ && _) && zeroth.count == 12)
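
If you only need the contents of a single partition on the driver, an alternative is to run a job restricted to that partition via SparkContext.runJob, which takes the sequence of partition ids to run on. A minimal sketch, reusing the rdd defined above (the name firstPartition is just for illustration):

// Run the job only on partition 0 and collect its rows to the driver.
// runJob returns one result per requested partition, so .head gives
// the array for partition 0.
val firstPartition: Array[(Int, Int)] =
  sc.runJob(rdd, (iter: Iterator[(Int, Int)]) => iter.toArray, Seq(0)).head

// firstPartition should hold the 12 pairs whose keys hash to partition 0
firstPartition.foreach(println)

Unlike the mapPartitionsWithIndex approach, this avoids scheduling tasks for the other seven partitions, but it pulls the data to the driver instead of keeping it in an RDD.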