spark-user mailing list archives

From JF Chen <darou...@gmail.com>
Subject How to deal with context dependent computing?
Date Thu, 23 Aug 2018 02:52:27 GMT
For example, I have some timestamped data where each record is marked as category A or B, and the records are ordered by time. Now I want to calculate the duration of each interval from an A record to the following B record. In a normal program, I could keep a flag recording whether the previous record was A or B, and then compute the duration when the next B arrives. But how can I do this with a Spark DataFrame?
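
To make the idea concrete, below is a rough, untested sketch of the kind of logic I mean, translated into Spark SQL with lag() over a time-ordered window. The column names ts and category are just placeholders, and the unpartitioned window will shuffle everything into a single partition, so this is only an illustration of the intent, not a working solution:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

object AtoBDuration {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("AtoBDuration")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample data: (timestamp in seconds, category)
    val events = Seq(
      (1000L, "A"), (1005L, "B"), (1020L, "A"), (1032L, "B")
    ).toDF("ts", "category")

    // Look at the previous row's category and timestamp, ordered by time.
    // NOTE: an unpartitioned window moves all rows to one partition.
    val w = Window.orderBy("ts")
    val withPrev = events
      .withColumn("prev_category", lag($"category", 1).over(w))
      .withColumn("prev_ts", lag($"ts", 1).over(w))

    // Keep only B rows whose immediately preceding row was an A,
    // and compute the duration between the two timestamps.
    val durations = withPrev
      .filter($"category" === "B" && $"prev_category" === "A")
      .select(($"ts" - $"prev_ts").alias("duration_seconds"))

    durations.show()
    spark.stop()
  }
}

Is something along these lines the right way to express this, or is there a better pattern for context-dependent computation like this?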

Thanks!

Regards,
Junfeng Chen
