
Subtracting column values between two rows based on dense rank

How to subtract column values between two rows based on dense rank

I have a DataFrame containing the following data:

srl_no      created_on              completed_on            prev_completed_on               time_from_last  Dense_Rank
XXXXXX1     2020-10-09T08:52:25     2020-10-09T08:57:45     null                            null            1
XXXXXX1     2020-10-09T09:04:32     2020-10-09T09:06:37     2020-10-09T08:57:45             407             2
XXXXXX1     2020-10-09T09:10:10     2020-10-09T09:12:17     2020-10-09T09:06:37             213             3
XXXXXX1     2020-10-09T09:10:10     2020-10-09T09:12:17     2020-10-09T09:12:17             -127            3

I want to subtract prev_completed_on from created_on to get time_from_last, but the last two rows have the same created_on and completed_on, which produces a negative time. In that case I need to subtract the value from the previous rank instead, i.e. subtract based on the dense_rank column.

So in the case above, for the fourth row I need to subtract the completed_on value of the second row (the previous rank) from that row's created_on value.
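To make the arithmetic concrete, here is a quick check with plain Python datetime (timestamps taken from the table above) showing why the naive lag subtraction goes negative on the duplicate row, and what subtracting the previous rank's completed_on gives instead:

```python
from datetime import datetime

FMT = "%Y-%m-%dT%H:%M:%S"

def parse(s):
    return datetime.strptime(s, FMT)

row4_created = parse("2020-10-09T09:10:10")
row4_lag = parse("2020-10-09T09:12:17")         # lag() picks row 3's completed_on (the duplicate)
rank2_completed = parse("2020-10-09T09:06:37")  # completed_on of the previous rank

# naive lag subtraction: negative, because the duplicate row's
# lagged completed_on is later than its own created_on
print(int((row4_created - row4_lag).total_seconds()))         # -127

# subtracting the previous rank's completed_on instead
print(int((row4_created - rank2_completed).total_seconds()))  # 213
```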

The code so far:

from pyspark.sql import functions as f
from pyspark.sql.window import Window
from pyspark.sql.types import TimestampType, LongType

df = spark.createDataFrame(
    [
        # create your data here, be consistent in the types
        ('XXXXXX1', '2020-10-09T08:52:25', '2020-10-09T08:57:45'),
        ('XXXXXX1', '2020-10-09T09:04:32', '2020-10-09T09:06:37'),
        ('XXXXXX1', '2020-10-09T09:10:10', '2020-10-09T09:12:17'),
        ('XXXXXX1', '2020-10-09T09:10:10', '2020-10-09T09:12:17'),  # duplicate row from the table above
    ],
    ['srl_no', 'created_on', 'completed_on'],  # add your column labels here
)

df = df.withColumn('created_on', f.col('created_on').cast(TimestampType()))
df = df.withColumn('completed_on', f.col('completed_on').cast(TimestampType()))

partition_cols = ["srl_no"]
window_clause = Window.partitionBy(partition_cols).orderBy(f.col('completed_on').asc())

# previous row's completed_on within each srl_no
df1 = df.withColumn('prev_completed_on',
                    f.lag(f.col("completed_on")).over(window_clause).cast(TimestampType()))
# dense rank over the same window
df1 = df1.withColumn('dense_rank', f.dense_rank().over(window_clause))
# seconds between this row's created_on and the previous completed_on
df1 = df1.withColumn("time_from_last",
                     f.col("created_on").cast(LongType()) - f.col("prev_completed_on").cast(LongType()))

Expected output:

srl_no      created_on              completed_on            prev_completed_on               time_from_last  Dense_Rank
XXXXXX1     2020-10-09T08:52:25     2020-10-09T08:57:45     null                            null            1
XXXXXX1     2020-10-09T09:04:32     2020-10-09T09:06:37     2020-10-09T08:57:45             407             2
XXXXXX1     2020-10-09T09:10:10     2020-10-09T09:12:17     2020-10-09T09:06:37             213             3
XXXXXX1     2020-10-09T09:10:10     2020-10-09T09:12:17     2020-10-09T09:12:17             213             3

Solution

The trick here is to use a groupBy to get the earliest prev_completed_on for each srl_no and dense_rank. Joining that back onto the prepared DataFrame gives the desired result.

from pyspark.sql import functions as F
from pyspark.sql import types as T
from pyspark.sql.window import Window

df = spark.createDataFrame(
    [
        # create your data here, be consistent in the types
        ('XXXXXX1', '2020-10-09T08:52:25', '2020-10-09T08:57:45'),
        ('XXXXXX1', '2020-10-09T09:04:32', '2020-10-09T09:06:37'),
        ('XXXXXX1', '2020-10-09T09:10:10', '2020-10-09T09:12:17'),
        ('XXXXXX1', '2020-10-09T09:10:10', '2020-10-09T09:12:17'),  # duplicate row from the table above
    ],
    ['srl_no', 'created_on', 'completed_on'],  # add your column labels here
)

df = df.withColumn('created_on', F.col('created_on').cast(T.TimestampType()))
df = df.withColumn('completed_on', F.col('completed_on').cast(T.TimestampType()))

partition_cols = ["srl_no"]
window_clause = Window.partitionBy(partition_cols).orderBy(F.col('completed_on').asc())

# previous completed_on and dense rank within each srl_no
df_with_rank = df.withColumn('prev_completed_on',
                             F.lag(F.col("completed_on")).over(window_clause).cast(T.TimestampType()))
df_with_rank = df_with_rank.withColumn('dense_rank', F.dense_rank().over(window_clause))

# earliest prev_completed_on per (srl_no, dense_rank)
dense_rank = df_with_rank.groupby("srl_no", "dense_rank") \
    .agg(F.min('prev_completed_on').alias('prev_completed_on'))

# replace prev_completed_on with the per-rank minimum
df_with_rank = df_with_rank.drop('prev_completed_on')
df_with_rank = df_with_rank.join(dense_rank, ["srl_no", "dense_rank"], 'left')

df_with_rank.show()

Output:

+-------+----------+-------------------+-------------------+-------------------+
| srl_no|dense_rank|         created_on|       completed_on|  prev_completed_on|
+-------+----------+-------------------+-------------------+-------------------+
|XXXXXX1|         1|2020-10-09 08:52:25|2020-10-09 08:57:45|               null|
|XXXXXX1|         2|2020-10-09 09:04:32|2020-10-09 09:06:37|2020-10-09 08:57:45|
|XXXXXX1|         3|2020-10-09 09:10:10|2020-10-09 09:12:17|2020-10-09 09:06:37|
|XXXXXX1|         3|2020-10-09 09:10:10|2020-10-09 09:12:17|2020-10-09 09:06:37|
+-------+----------+-------------------+-------------------+-------------------+
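The question still needs the time_from_last column, which the solution leaves as a final subtraction on the joined frame. As a sanity check of the whole approach, here is a minimal pure-Python sketch (no Spark; ranks and timestamps hard-coded from the tables above) of the same min-per-rank logic, ending with the subtraction the question asked for:

```python
from datetime import datetime

FMT = "%Y-%m-%dT%H:%M:%S"

def parse(s):
    return datetime.strptime(s, FMT)

# (dense_rank, created_on, prev_completed_on) as produced by lag() above
rows = [
    (1, parse("2020-10-09T08:52:25"), None),
    (2, parse("2020-10-09T09:04:32"), parse("2020-10-09T08:57:45")),
    (3, parse("2020-10-09T09:10:10"), parse("2020-10-09T09:06:37")),
    (3, parse("2020-10-09T09:10:10"), parse("2020-10-09T09:12:17")),
]

# groupBy + min: earliest prev_completed_on per dense_rank
min_prev = {}
for rank, _, prev in rows:
    if prev is not None and (rank not in min_prev or prev < min_prev[rank]):
        min_prev[rank] = prev

# left join back, then created_on minus the per-rank prev_completed_on
time_from_last = [
    int((created - min_prev[rank]).total_seconds()) if rank in min_prev else None
    for rank, created, _ in rows
]
print(time_from_last)  # [None, 407, 213, 213]
```

This reproduces the expected output: the duplicate rank-3 row now gets 213 instead of -127.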
