Convert a List to a DataFrame in Spark Scala

sen*_*r p 5 scala apache-spark apache-spark-sql spark-dataframe

I have a list of more than 30 strings. How do I convert the list to a DataFrame? Here is what I tried.

For example:

val list = List("a","b","v","b").toDS().toDF()

Output :


+-------+
|  value|
+-------+
|a      |
|b      |
|v      |
|b      |
+-------+


Expected output is:


+---+---+---+---+
| _1| _2| _3| _4|
+---+---+---+---+
|  a|  b|  v|  b|
+---+---+---+---+

Any help on this would be appreciated.

Sri*_*asR 5

List("a","b","c","d") represents a record with a single field, so the result set shows one element in each row.

To get the expected output, the row should have four fields/elements. So we wrap the values in a tuple: the list List(("a","b","c","d")) represents a single row with four fields. In the same way, a list with two rows looks like List(("a1","b1","c1","d1"),("a2","b2","c2","d2")).

scala> val list = sc.parallelize(List(("a", "b", "c", "d"))).toDF()
list: org.apache.spark.sql.DataFrame = [_1: string, _2: string, _3: string, _4: string]

scala> list.show
+---+---+---+---+
| _1| _2| _3| _4|
+---+---+---+---+
|  a|  b|  c|  d|
+---+---+---+---+


scala> val list = sc.parallelize(List(("a1","b1","c1","d1"),("a2","b2","c2","d2"))).toDF
list: org.apache.spark.sql.DataFrame = [_1: string, _2: string, _3: string, _4: string]

scala> list.show
+---+---+---+---+
| _1| _2| _3| _4|
+---+---+---+---+
| a1| b1| c1| d1|
| a2| b2| c2| d2|
+---+---+---+---+
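One caveat for the question's actual case of 30+ strings: Scala tuples stop at 22 elements, so the tuple approach above does not scale that far. A minimal sketch of an alternative that builds the single row and its schema explicitly with Row and StructType (the object name and column-naming scheme here are made up for illustration; assumes Spark 2.x running locally):

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

object ListToSingleRow extends App {
  val spark = SparkSession.builder.master("local[*]").appName("ListToSingleRow").getOrCreate()

  val list = List("a", "b", "v", "b") // imagine 30+ elements here
  // One StringType column per element, named _1, _2, ..., _n
  val schema = StructType(list.indices.map(i => StructField(s"_${i + 1}", StringType)))
  // A single Row holding all elements, paired with the generated schema
  val df = spark.createDataFrame(
    spark.sparkContext.parallelize(Seq(Row(list: _*))),
    schema
  )
  df.show()
  spark.stop()
}
```

This works for any number of elements, since the schema is generated from the list's length instead of being fixed by a tuple's arity.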


小智 5

To use toDF, we must first import the implicits:

import spark.sqlContext.implicits._

Please refer to the code below:

val spark = SparkSession
  .builder
  .master("local[*]")
  .appName("Simple Application")
  .getOrCreate()

import spark.sqlContext.implicits._

val lstData = List(List("vks",30),List("harry",30))
val mapLst = lstData.map{case List(a:String,b:Int) => (a,b)}
val lstToDf = spark.sparkContext.parallelize(mapLst).toDF("name","age")
lstToDf.show

val llist = Seq(("bob", "2015-01-13", 4), ("alice", "2015-04-23", 10)).toDF("name","date","duration")
llist.show
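As the last two lines above hint, parallelize is not strictly needed for small local collections: once the implicits are in scope, toDF is available directly on a local Seq or List of tuples. A minimal sketch under the same local-SparkSession assumption (the object name is made up):

```scala
import org.apache.spark.sql.SparkSession

object LocalListToDf extends App {
  val spark = SparkSession.builder.master("local[*]").appName("LocalListToDf").getOrCreate()
  import spark.implicits._ // brings toDF into scope for local collections

  // No parallelize needed: toDF works directly on the local List of tuples
  val df = List(("vks", 30), ("harry", 30)).toDF("name", "age")
  df.show()
  spark.stop()
}
```

Going through sparkContext.parallelize first is only useful when you want explicit control over the RDD (e.g. its partitioning); for converting a small in-memory list, calling toDF directly is simpler.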