Get the mode (most frequent) value in a Spark column using groupBy

Gau*_*sal 4 pyspark sparkr spark-dataframe

I have a SparkR DataFrame and I want to get the mode (most common) value of value for each unique name. How can I do this? There doesn't seem to be a built-in mode function. Either a SparkR or a PySpark solution would be fine.

# Create an example SparkR DataFrame
library(SparkR)
df <- data.frame(name = c("Thomas", "Thomas", "Thomas", "Bill", "Bill", "Bill"),
  value = c(5, 5, 4, 3, 3, 7))
DF <- createDataFrame(df)

name   | value
-----------------
Thomas |  5
Thomas |  5
Thomas |  4
Bill   |  3
Bill   |  3
Bill   |  7

#What I want to get
name   | mode(value)
-----------------
Thomas |   5
Bill   |   3 

Kon*_*ewa 5

You can achieve this with a combination of .groupBy() and a window function, e.g.:

from pyspark.sql import Window
from pyspark.sql.functions import col, desc, row_number

# Count occurrences of each (name, value) pair, then keep the
# most frequent value per name using a row_number window
grouped = df.groupBy('name', 'value').count()
window = Window.partitionBy("name").orderBy(desc("count"))
grouped\
    .withColumn('order', row_number().over(window))\
    .where(col('order') == 1)\
    .show()

Output:

+------+-----+-----+-----+
|  name|value|count|order|
+------+-----+-----+-----+
|  Bill|    3|    2|    1|
|Thomas|    5|    2|    1|
+------+-----+-----+-----+
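
The question also asked for a SparkR option. Below is a minimal sketch of the same groupBy-plus-window approach in SparkR (an untested adaptation, assuming the SparkR >= 2.0 window API and the DF created in the question):

# Count occurrences of each (name, value) pair
grouped <- count(groupBy(DF, "name", "value"))
# Rank the values within each name by descending count
ws <- orderBy(windowPartitionBy("name"), desc(grouped$count))
withOrder <- withColumn(grouped, "order", over(row_number(), ws))
# Keep only the top-ranked (most frequent) value per name
showDF(where(withOrder, withOrder$order == 1))

Note that if two values are tied for the highest count within a name, row_number() keeps an arbitrary one; add a secondary ordering column to the window spec if you need a deterministic tie-break.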