java apache-spark spark-streaming apache-spark-sql spark-dataframe
I am trying to convert a JavaRDD<String> (where each String is a JSON record) into a DataFrame and display it. I am doing something like the following,
public void call(JavaRDD<String> rdd, Time time) throws Exception {
    if (rdd.count() > 0) {
        JavaRDD<String> filteredRDD = rdd.filter(x -> x.length() > 0);
        sqlContext = SQLContextSingleton.getInstance(filteredRDD.context());
        DataFrame df = sqlContext.read().schema(SchemaBuilder.buildSchema()).json(filteredRDD);
        df.show();
    }
}
The schema looks like this,
public static StructType buildSchema() {
    StructType schema = new StructType(new StructField[] {
            DataTypes.createStructField("student_id", DataTypes.StringType, false),
            DataTypes.createStructField("school_id", DataTypes.IntegerType, false),
            DataTypes.createStructField("teacher", DataTypes.StringType, true),
            DataTypes.createStructField("rank", DataTypes.StringType, true),
            DataTypes.createStructField("created", DataTypes.TimestampType, true),
            DataTypes.createStructField("created_user", DataTypes.StringType, true),
            DataTypes.createStructField("notes", DataTypes.StringType, true),
            DataTypes.createStructField("additional_data", DataTypes.StringType, true),
            DataTypes.createStructField("datetime", DataTypes.TimestampType, true) });
    return schema;
}
The code above gives me,
+----------+---------+-------+----+-------+------------+-----+---------------+--------+
|student_id|school_id|teacher|rank|created|created_user|notes|additional_data|datetime|
+----------+---------+-------+----+-------+------------+-----+---------------+--------+
|      null|     null|   null|null|   null|        null| null|           null|    null|
+----------+---------+-------+----+-------+------------+-----+---------------+--------+
However, when I create the DataFrame without specifying a schema,
DataFrame df = sqlContext.read().json(filteredRDD);
it gives me the following result,
+----------+---------+-------+----+-----------------------+------------+-----+---------------+-----------------------+
|student_id|school_id|teacher|rank|                created|created_user|notes|additional_data|               datetime|
+----------+---------+-------+----+-----------------------+------------+-----+---------------+-----------------------+
|         1|      123|    xxx|   3|2017-06-02 23:49:10.410|        yyyy| NULL| good academics|2017-06-02 23:49:10.410|
+----------+---------+-------+----+-----------------------+------------+-----+---------------+-----------------------+
Sample JSON record:
{"student_id": "1","school_id": "123","teacher": "xxx","rank": "3","created": "2017-06-02 23:49:10.410","created_user":"yyyy","notes": "NULL","additional_date":"good academics","datetime": "2017-06-02 23:49:10.410"}
Any help on what I am doing wrong?
The problem was that in my JSON record school_id arrives as a string, and Spark does not implicitly cast a String to an Integer; when the value cannot be parsed against the declared type, it treats the whole record as null. I changed the schema to declare school_id as StringType, and that solved my problem. There is a good explanation of this behaviour at http://blog.antlypls.com/blog/2016/01/30/processing-json-data-with-sparksql/
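For reference, a minimal sketch of the corrected schema is below, assuming the other fields stay exactly as in the question. The withColumn cast at the end is not part of the original fix; it is just one hedged way to recover a numeric school_id after the JSON has been parsed, using the standard Column.cast API.

// Corrected schema: school_id is now StringType so it matches the JSON,
// where the value is a quoted string ("school_id": "123").
public static StructType buildSchema() {
    return new StructType(new StructField[] {
            DataTypes.createStructField("student_id", DataTypes.StringType, false),
            DataTypes.createStructField("school_id", DataTypes.StringType, false), // was IntegerType
            DataTypes.createStructField("teacher", DataTypes.StringType, true),
            DataTypes.createStructField("rank", DataTypes.StringType, true),
            DataTypes.createStructField("created", DataTypes.TimestampType, true),
            DataTypes.createStructField("created_user", DataTypes.StringType, true),
            DataTypes.createStructField("notes", DataTypes.StringType, true),
            DataTypes.createStructField("additional_data", DataTypes.StringType, true),
            DataTypes.createStructField("datetime", DataTypes.TimestampType, true) });
}

// Optional: if an integer column is still needed downstream, cast explicitly
// after parsing instead of declaring IntegerType in the schema.
DataFrame df = sqlContext.read().schema(SchemaBuilder.buildSchema()).json(filteredRDD);
DataFrame typed = df.withColumn("school_id", df.col("school_id").cast(DataTypes.IntegerType));
typed.show();

The general idea is to let the schema describe the data as it actually appears in the JSON, and to convert types explicitly afterwards rather than relying on an implicit cast during parsing.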