spark-user mailing list archives

From Serega Sheypak <serega.shey...@gmail.com>
Subject Re: how "hour" function in Spark SQL is supposed to work?
Date Tue, 20 Mar 2018 21:43:06 GMT
Hi, any updates? This looks like an API inconsistency, or possibly a bug.

2018-03-17 13:09 GMT+01:00 Serega Sheypak <serega.sheypak@gmail.com>:

> > Not sure why you are dividing by 1000. from_unixtime expects a long type
> It expects seconds, but I have milliseconds.
>
>
>
> 2018-03-12 6:16 GMT+01:00 vermanurag <anurag.verma@fnmathlogic.com>:
>
>> Not sure why you are dividing by 1000. from_unixtime expects a long type
>> which is time in milliseconds from reference date.
>>
>> The following should work:
>>
>> val ds = dataset.withColumn("hour", hour(from_unixtime(dataset.col("ts"))))
>>
>
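For the record, the thread's resolution: Spark's `from_unixtime` interprets its argument as *seconds* since the Unix epoch, so a millisecond timestamp such as the `ts` column here must be divided by 1000 first, roughly `hour(from_unixtime(dataset.col("ts") / 1000))`. A minimal plain-Scala sketch of the same arithmetic (the sample timestamp is hypothetical, and `java.time` is used here to mirror what `from_unixtime` and `hour` compute, assuming a UTC time zone):

```scala
import java.time.{Instant, ZoneId}

// from_unixtime treats its argument as SECONDS since the Unix epoch.
// Feeding it milliseconds directly yields a date tens of thousands of
// years in the future; divide by 1000 first. In Spark this would be:
//   dataset.withColumn("hour", hour(from_unixtime(dataset.col("ts") / 1000)))

// Hypothetical sample: 2018-03-20 21:43:06 UTC as epoch milliseconds.
val tsMillis = 1521582186000L

// Plain-Scala equivalent of hour(from_unixtime(...)), fixed to UTC.
def hourOfEpochSeconds(seconds: Long): Int =
  Instant.ofEpochSecond(seconds).atZone(ZoneId.of("UTC")).getHour

val hourFromSeconds = hourOfEpochSeconds(tsMillis / 1000) // 21

println(hourFromSeconds)
```

Note that in Spark itself the result depends on the session time zone (`spark.sql.session.timeZone`), so the extracted hour may differ from the UTC value shown here.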
