This article covers how to subtract days from a timestamp column in Apache Spark while keeping the full datetime.

Problem Description

I am using a Spark Dataset and am having trouble subtracting days from a timestamp column.

I would like to subtract days from a timestamp column and get a new column with the full datetime format. Example:

2017-09-22 13:17:39.900 - 10 ----> 2017-09-12 13:17:39.900

With the date_sub function I get 2017-09-12 without 13:17:39.900.
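The desired behavior is to shift the value by whole days while leaving the time-of-day (including fractional seconds) untouched; date_sub instead returns a DateType, which drops the time component. A minimal sketch of the intended semantics, using plain java.time with no Spark dependency:

```scala
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter

// Parse the example timestamp, shift it back 10 whole days,
// and format it again: the time-of-day is preserved.
val fmt     = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS")
val ts      = LocalDateTime.parse("2017-09-22 13:17:39.900", fmt)
val shifted = ts.minusDays(10)

println(shifted.format(fmt)) // 2017-09-12 13:17:39.900
```

This is what the interval arithmetic in the accepted answer does column-wise inside Spark SQL.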

Recommended Answer

cast the data to timestamp and use expr to subtract an INTERVAL:

import org.apache.spark.sql.functions.expr
import spark.implicits._  // needed for .toDF and the $-column syntax

val df = Seq("2017-09-22 13:17:39.900").toDF("timestamp")

df.withColumn(
  "10_days_before",
  $"timestamp".cast("timestamp") - expr("INTERVAL 10 DAYS")).show(false)
+-----------------------+---------------------+
|timestamp              |10_days_before       |
+-----------------------+---------------------+
|2017-09-22 13:17:39.900|2017-09-12 13:17:39.9|
+-----------------------+---------------------+

If the data is already of TimestampType, you can skip the cast.

