yey*_*ilk 13 scala apache-spark
In my project the external library is spark-assembly-1.3.1-hadoop2.6.0. When I type '.', the IDE suggests toDF(), but at compile time it reports "cannot resolve symbol toDF()". Sorry, I also couldn't find any documentation for toDF() in Apache Spark.
case class Feature(name: String, value: Double, time: String, period: String)

val RESRDD = RDD.map { tuple =>
  val bson = new BasicBSONObject()
  bson.put("name", tuple._1)   // fields taken from the tuple; `name`/`value` alone were unresolved
  bson.put("value", tuple._2)
  (null, bson)
}

RESRDD
  .map(_._2)
  .map(f => Feature(f.get("name").toString, f.get("value").toString.toDouble,
                    f.get("time").toString, f.get("period").toString))  // Feature needs all four fields
  .toDF()
zer*_*323 29
To be able to use toDF you first have to import sqlContext.implicits:
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._

case class Foobar(foo: String, bar: Integer)

val foobarRdd = sc.parallelize(("foo", 1) :: ("bar", 2) :: ("baz", -1) :: Nil).
  map { case (foo, bar) => Foobar(foo, bar) }

val foobarDf = foobarRdd.toDF
foobarDf.limit(1).show
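Applied to the Feature case class from the question, a minimal sketch might look like the following. It assumes an existing SparkContext `sc` and a source RDD of (name, value, time, period) tuples; the field names are illustrative, not from the original post.

```scala
import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)
// The implicits bring in rddToDataFrameHolder, which supplies toDF on an RDD of case classes
import sqlContext.implicits._

case class Feature(name: String, value: Double, time: String, period: String)

// Hypothetical input RDD of raw tuples
val raw = sc.parallelize(Seq(("temp", 21.5, "2015-06-01", "daily")))

// With the import in scope, toDF now resolves
val featureDf = raw.map { case (n, v, t, p) => Feature(n, v, t, p) }.toDF()
featureDf.show()
```

Note that the case class must be defined outside the method where toDF is called (e.g. at top level of the object), otherwise Scala reflection cannot derive the schema.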
Viewed: 21,917 times