PySpark implementation of DATEADD
【Title】PySpark implementation of DATEADD 【Posted】2019-02-22 11:17:30 【Question】My T-SQL code looks like this:
cast(dateadd(minute, -240, tmp_view_tos_lenelgate_qry11.eventdate) as date)
How can I implement the DATE_ADD function in PySpark?
【Comments】:
【Answer 1】:

# Creating the DataFrame
from pyspark.sql.functions import col, unix_timestamp
df = spark.createDataFrame([('2014-02-13 12:36:52.721',),('2018-01-01 00:30:50.001',)], ['eventdate'])
df = df.withColumn('eventdate', col('eventdate').cast('timestamp'))
df.show(truncate=False)
+-----------------------+
|eventdate |
+-----------------------+
|2014-02-13 12:36:52.721|
|2018-01-01 00:30:50.001|
+-----------------------+
df.printSchema()
root
|-- eventdate: timestamp (nullable = true)
# Subtract 240 minutes (240*60 = 14400 seconds) from 'eventdate'
from pyspark.sql.functions import col, unix_timestamp
df = df.withColumn('eventdate_new', (unix_timestamp('eventdate') - 240*60).cast('timestamp'))
df.show(truncate=False)
+-----------------------+-------------------+
|eventdate |eventdate_new |
+-----------------------+-------------------+
|2014-02-13 12:36:52.721|2014-02-13 08:36:52|
|2018-01-01 00:30:50.001|2017-12-31 20:30:50|
+-----------------------+-------------------+
【Comments】: