spark-user mailing list archives

From Rishabh Bhardwaj <rbnex...@gmail.com>
Subject Re: write and call UDF in spark dataframe
Date Wed, 20 Jul 2016 11:22:31 GMT
Hi Divya,

There is already a "from_unixtime" function in org.apache.spark.sql.functions;
Rabin has used it in the SQL query. If you want to use it in the DataFrame DSL,
you can try something like this:

val new_df = df.select(from_unixtime($"time").as("newtime"))
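If you do want to write the conversion yourself rather than use the builtin, the core logic is just epoch-seconds-to-formatted-string. Below is a minimal, self-contained sketch of that logic using java.time (the object and helper names `UnixTimeDemo` / `fromUnixtime` are illustrative, not part of any Spark API):

```scala
import java.time.{Instant, ZoneOffset}
import java.time.format.DateTimeFormatter

object UnixTimeDemo {
  // Format epoch seconds as "yyyy-MM-dd HH:mm:ss" in UTC,
  // mirroring what Spark's builtin from_unixtime does by default.
  private val formatter =
    DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss").withZone(ZoneOffset.UTC)

  def fromUnixtime(ts: Long): String =
    formatter.format(Instant.ofEpochSecond(ts))

  def main(args: Array[String]): Unit = {
    println(fromUnixtime(1469013751L)) // prints "2016-07-20 11:22:31"
  }
}
```

To call such a function on a DataFrame column you would still wrap it as a UDF, e.g. via `sqlContext.udf.register("myFromUnixtime", UnixTimeDemo.fromUnixtime _)` and then use it in SQL or the DSL, but for timestamp formatting the builtin is the better choice.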


Thanks,
Rishabh.

On Wed, Jul 20, 2016 at 4:21 PM, Rabin Banerjee <
dev.rabin.banerjee@gmail.com> wrote:

> Hi Divya ,
>
> Try,
>
> val df = sqlContext.sql("select from_unixtime(ts,'yyyy-MM-dd') as `ts` from mr")
>
> Regards,
> Rabin
>
> On Wed, Jul 20, 2016 at 12:44 PM, Divya Gehlot <divya.htconex@gmail.com>
> wrote:
>
>> Hi,
>> Could somebody share an example of writing and calling a UDF that converts
>> a Unix timestamp to a date-time?
>>
>>
>> Thanks,
>> Divya
>>
>
>
