How do I count the number of columns in a PySpark DataFrame?

Sus*_*rti -1 machine-learning apache-spark pyspark pyspark-sql

I have a DataFrame with 15 columns (4 categorical and the rest numeric).

I created dummy variables for each categorical variable, and now I want to find the number of columns in the new DataFrame.

I tried to take the length of printSchema(), but it is NoneType:

print type(df.printSchema())  # <type 'NoneType'>
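(The question doesn't show how the dummy variables were created. For context, here is a hypothetical sketch of one way to build them, using a made-up cat_cols list for the four categorical column names and when/otherwise indicator columns:)

from pyspark.sql import functions as F

cat_cols = ["cat1", "cat2", "cat3", "cat4"]  # hypothetical categorical column names
dummied = df
for c in cat_cols:
    # add one 0/1 indicator column per distinct value of the categorical column
    for (v,) in df.select(c).distinct().collect():
        dummied = dummied.withColumn("%s_%s" % (c, v),
                                     F.when(F.col(c) == v, 1).otherwise(0))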

Rak*_*mar 9

You are going about it the wrong way. Here is an example of how printSchema works:

# sqlContext is available by default in the PySpark shell
df = sqlContext.createDataFrame([
    (1, "A", "X1"),
    (2, "B", "X2"),
    (3, "B", "X3"),
    (1, "B", "X3"),
    (2, "C", "X2"),
    (3, "C", "X2"),
    (1, "C", "X1"),
    (1, "B", "X1"),
], ["ID", "TYPE", "CODE"])



print len(df.columns)  # 3

columns returns a list of all the column names, so we can just take its len. printSchema, on the other hand, prints the DataFrame's schema (the columns and their data types) to stdout and returns None, which is why type(df.printSchema()) came back as NoneType. The printed schema looks like this:

root
 |-- ID: long (nullable = true)
 |-- TYPE: string (nullable = true)
 |-- CODE: string (nullable = true)
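Since the original DataFrame mixes categorical and numeric columns, it can also be handy to count columns per type. df.dtypes returns (name, type) pairs as strings, so a minimal sketch against the example df above (checking the common Spark SQL type names) would be:

numeric_cols = [c for c, t in df.dtypes if t in ("int", "bigint", "float", "double")]
string_cols = [c for c, t in df.dtypes if t == "string"]
print len(df.columns), len(numeric_cols), len(string_cols)  # 3 1 2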