spark-user mailing list archives

From Serega Sheypak <serega.shey...@gmail.com>
Subject Re: how "hour" function in Spark SQL is supposed to work?
Date Tue, 20 Mar 2018 22:14:28 GMT
Ok, this one works:

.withColumn("hour", hour(from_unixtime(typedDataset.col("ts") / 1000)))
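The division works because Spark's from_unixtime interprets its argument as seconds since the Unix epoch, while the ts column here holds milliseconds. A minimal plain-Python sketch of the same conversion (using datetime instead of Spark, with a made-up millisecond timestamp):

```python
from datetime import datetime, timezone

# Hypothetical millisecond timestamp: 2018-03-20 22:14:28 UTC.
ts_millis = 1521584068000

# Dividing by 1000 yields epoch seconds, which is what
# from_unixtime (and fromtimestamp here) expects. Passing the
# raw millisecond value would be off by a factor of 1000.
hour = datetime.fromtimestamp(ts_millis / 1000, tz=timezone.utc).hour
print(hour)  # → 22
```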



2018-03-20 22:43 GMT+01:00 Serega Sheypak <serega.sheypak@gmail.com>:

> Hi, any updates? Looks like an API inconsistency or a bug?
>
> 2018-03-17 13:09 GMT+01:00 Serega Sheypak <serega.sheypak@gmail.com>:
>
>> > Not sure why you are dividing by 1000. from_unixtime expects a long type
>> It expects seconds, I have milliseconds.
>>
>>
>>
>> 2018-03-12 6:16 GMT+01:00 vermanurag <anurag.verma@fnmathlogic.com>:
>>
>>> Not sure why you are dividing by 1000. from_unixtime expects a long type
>>> which is time in milliseconds from reference date.
>>>
>>> The following should work:
>>>
>>> val ds = dataset.withColumn("hour", hour(from_unixtime(dataset.col("ts"))))
