Tags: scala, apache-spark, apache-spark-sql
I do the following:

val tempDict = sqlContext.sql("""select words.pName_token, collect_set(words.pID) as docids
from words
group by words.pName_token""").toDF()
val wordDocs = tempDict.filter(tempDict("pName_token") === word)
val listDocs = wordDocs.map(t => t(1)).collect()

listDocs: Array[Any] = Array(WrappedArray(123, 234, 205876618, 456))
My question is: how do I iterate over this WrappedArray, or convert it to a List?

The only options autocomplete offers on listDocs are apply, asInstanceOf, clone, isInstanceOf, length, toString, and update.

What should I do?
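As a minimal sketch outside Spark (the data here just mirrors the example output): the collected cell is statically typed Any, but at runtime it is a WrappedArray, which is a Seq, so a downcast followed by toList gives an ordinary list you can iterate:

```scala
// The collected cell is statically Any; at runtime it is a
// WrappedArray (a Seq). Simulate that with a plain Seq here.
val cell: Any = Seq(123, 234, 205876618, 456)

// Downcast to Seq and convert to an immutable List, then iterate.
val docids: List[Int] = cell.asInstanceOf[Seq[Int]].toList
docids.foreach(println)
```

The asInstanceOf cast is unchecked, so it will throw at runtime if the cell does not actually hold a sequence; the pattern-match approach in the answer below this point avoids that.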
Here is one way to solve this.
import org.apache.spark.sql.Row
import org.apache.spark.sql.functions._
import scala.collection.mutable.WrappedArray
val data = Seq((Seq(1,2,3),Seq(4,5,6),Seq(7,8,9)))
val df = sqlContext.createDataFrame(data)
val first = df.first
// use getAs to recover the element type
val mapped = first.getAs[WrappedArray[Int]](0)
// now we can use it like a normal collection
mapped.mkString("\n")

// pattern match the rows to extract the array columns
val rows = df.collect.map {
  case Row(a: Seq[Any], b: Seq[Any], c: Seq[Any]) =>
    (a, b, c)
}
rows.mkString("\n")
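The row-level pattern match above leaves you with plain Seq values, so from there everything is ordinary Scala collections. A sketch, using made-up data shaped like the example above:

```scala
// Simulated result of the pattern match: one tuple of Seq columns per row
val collected = Array((Seq(1, 2, 3), Seq(4, 5, 6), Seq(7, 8, 9)))

// Flatten all three array columns of every row into one immutable list
val flat: List[Int] = collected.toList.flatMap { case (a, b, c) => a ++ b ++ c }
```

At this point flat can be iterated, mapped, or filtered like any List.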