ben | apache-spark-sql, pyspark
I need to get the week start date and week end date from a given date, where the week runs from Sunday through Saturday.
I found a post on this, but it assumes Monday as the first day of the week. Is there any built-in functionality in Spark that handles a Sunday-based week?
Find the day of the week, then derive the columns with selectExpr, taking Sunday as the start of the week:
from pyspark.sql import functions as F
df_b = spark.createDataFrame([('1','2020-07-13')],[ "ID","date"])
df_b = df_b.withColumn('day_of_week', F.dayofweek(F.col('date')))
df_b = df_b.selectExpr('*', 'date_sub(date, day_of_week-1) as week_start')
df_b = df_b.selectExpr('*', 'date_add(date, 7-day_of_week) as week_end')
df_b.show()
+---+----------+-----------+----------+----------+
| ID|      date|day_of_week|week_start|  week_end|
+---+----------+-----------+----------+----------+
|  1|2020-07-13|          2|2020-07-12|2020-07-18|
+---+----------+-----------+----------+----------+
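The arithmetic above (`week_start = date_sub(date, day_of_week - 1)`, `week_end = date_add(date, 7 - day_of_week)`) can be sanity-checked without a Spark session. This is a minimal stdlib sketch, assuming Spark's `dayofweek()` convention of Sunday = 1 through Saturday = 7:

```python
from datetime import date, timedelta

def week_bounds(d: date) -> tuple[date, date]:
    # Convert Python's isoweekday (Mon=1..Sun=7) to Spark's dayofweek (Sun=1..Sat=7)
    dow = d.isoweekday() % 7 + 1
    week_start = d - timedelta(days=dow - 1)  # date_sub(date, day_of_week - 1)
    week_end = d + timedelta(days=7 - dow)    # date_add(date, 7 - day_of_week)
    return week_start, week_end

# 2020-07-13 is a Monday, so dow = 2
print(week_bounds(date(2020, 7, 13)))  # (datetime.date(2020, 7, 12), datetime.date(2020, 7, 18))
```

This reproduces the row in the output above: week_start 2020-07-12 (Sunday), week_end 2020-07-18 (Saturday).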
Update for Spark SQL
First create a temporary view from the dataframe:
df_a.createOrReplaceTempView("df_a_sql")
Then run the query:
%sql
select *, date_sub(date, dayofweek-1) as week_start,
       date_add(date, 7-dayofweek) as week_end
from
(select *, dayofweek(date) as dayofweek
from df_a_sql) T
Output:
+---+----------+---------+----------+----------+
| ID|      date|dayofweek|week_start|  week_end|
+---+----------+---------+----------+----------+
|  1|2020-07-13|        2|2020-07-12|2020-07-18|
+---+----------+---------+----------+----------+
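One pitfall worth flagging in the SQL version: `week_end` must use `date_add`, not `date_sub`, otherwise the offset is subtracted and the result lands in the previous week. A quick stdlib check (no Spark session needed, using Spark's Sunday = 1 `dayofweek` convention) shows the difference:

```python
from datetime import date, timedelta

d = date(2020, 7, 13)                       # a Monday
dow = d.isoweekday() % 7 + 1                # Spark dayofweek: Sun=1..Sat=7, so 2 here

week_end_add = d + timedelta(days=7 - dow)  # date_add(date, 7-dayofweek) -> 2020-07-18
week_end_sub = d - timedelta(days=7 - dow)  # date_sub(date, 7-dayofweek) -> 2020-07-08 (wrong)

print(week_end_add, week_end_sub)  # 2020-07-18 2020-07-08
```

Only `date_add` produces the Saturday shown in the output table.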