spark-user mailing list archives

From Romi Kuntsman <r...@totango.com>
Subject Re: Timestamp functions for sqlContext
Date Tue, 21 Jul 2015 16:15:24 GMT
Hi Tal,

I'm not sure there is currently a built-in function for it, but you can
easily define a UDF (user-defined function) by extending
org.apache.spark.sql.api.java.UDF1, registering it
(sqlContext.udf().register(...)), and then using it inside your query.
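
For instance, here is a minimal sketch in Scala (assuming Spark 1.x, a table
registered as tableName with a rowTimestamp column, and an arbitrary UDF name
date_diff_days); it registers a plain Scala function via sqlContext.udf.register
rather than implementing the Java UDF1 interface:

    import java.sql.Timestamp
    import java.util.concurrent.TimeUnit

    // Hypothetical UDF name; returns the number of whole days between
    // the row's timestamp and the current time.
    sqlContext.udf.register("date_diff_days", (ts: Timestamp) =>
      TimeUnit.MILLISECONDS.toDays(System.currentTimeMillis() - ts.getTime))

    // Then, in place of MySQL's DATEDIFF(curdate(), rowTimestamp):
    val date_diff_df = sqlContext.sql(
      "select date_diff_days(rowTimestamp) as date_diff from tableName")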

RK.



On Tue, Jul 21, 2015 at 7:04 PM Tal Rozen <tal@scaleka.com> wrote:

> Hi,
>
> I'm running a query with SQL context where one of the fields is of type
> java.sql.Timestamp. I'd like to use a function similar to DATEDIFF in
> MySQL, between the date given in each row and now. If I were able to use
> the same syntax as in MySQL, it would be:
>
> val date_diff_df = sqlContext.sql("select DATEDIFF(curdate(),
> rowTimestamp) date_diff from tableName")
>
> What are the relevant keywords to replace curdate() and DATEDIFF?
>
> Thanks
