Select the latest record from a Spark DataFrame

use*_*587 4 apache-spark-sql

I have a DataFrame that looks like this:

+-------+---------+
|email  |timestamp|
+-------+---------+
|x@y.com|        1|
|y@m.net|        2|
|z@c.org|        3|
|x@y.com|        4|
|y@m.net|        5|
|    .. |       ..|
+-------+---------+

For each email I want to keep only the most recent record, so the result would be:

+-------+---------+
|email  |timestamp|
+-------+---------+
|x@y.com|        4|
|y@m.net|        5|
|z@c.org|        3|
|    .. |       ..|
+-------+---------+

How can I achieve this? I'm new to Spark and DataFrames.
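For reference, here is a minimal sketch that builds a DataFrame matching the sample above (the name yourDF is just a placeholder that the answer below also uses):

from pyspark.sql import SparkSession

# Build a small DataFrame with the example email/timestamp rows
spark = SparkSession.builder.getOrCreate()
yourDF = spark.createDataFrame(
    [("x@y.com", 1), ("y@m.net", 2), ("z@c.org", 3), ("x@y.com", 4), ("y@m.net", 5)],
    ["email", "timestamp"],
)
yourDF.show()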

Tim*_*sen 7

Here is a generic ANSI SQL query that works in Spark SQL:

SELECT email, timestamp
FROM
(
    SELECT t.*, ROW_NUMBER() OVER (PARTITION BY email ORDER BY timestamp DESC) rn
    FROM yourTable t
) t
WHERE rn = 1;
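If the DataFrame is registered as a temporary view, the same query can be run directly from PySpark. A sketch, assuming the view name yourTable:

# Register the DataFrame as a temp view and run the ANSI SQL query above
yourDF.createOrReplaceTempView("yourTable")

latest = spark.sql("""
    SELECT email, timestamp
    FROM (
        SELECT t.*, ROW_NUMBER() OVER (PARTITION BY email ORDER BY timestamp DESC) rn
        FROM yourTable t
    ) t
    WHERE rn = 1
""")
latest.show()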

For PySpark DataFrame code, try something like this:

from pyspark.sql import functions as F
from pyspark.sql.window import Window

# Number the records within each email, newest timestamp first
w = Window.partitionBy("email").orderBy(F.col("timestamp").desc())
df = yourDF.withColumn("rn", F.row_number().over(w))

# Keep only the newest record per email and drop the helper column
df = df.filter(F.col("rn") == 1).drop("rn")
df.show()
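If you only need each email together with its latest timestamp (no other columns to carry along), a plain aggregation is a simpler alternative. A sketch, assuming the same yourDF:

# Alternative: aggregate instead of using a window function
latest = yourDF.groupBy("email").agg(F.max("timestamp").alias("timestamp"))
latest.show()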