spark-user mailing list archives

From Koert Kuipers <ko...@tresata.com>
Subject Re: Spark 2 and existing code with sqlContext
Date Sat, 13 Aug 2016 04:16:32 GMT
you can get it from the SparkSession for backwards compatibility:
val sqlContext = spark.sqlContext
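
For example, something like this should let existing sqlContext calls run unchanged (a sketch only; the appName and master values are placeholders for illustration, and in spark-shell the SparkSession is already available as `spark`):

```scala
import org.apache.spark.sql.SparkSession

// In a standalone app you build the session yourself;
// in spark-shell, `spark` is predefined and this step is unnecessary.
val spark = SparkSession.builder()
  .appName("sqlContextCompat")   // placeholder name
  .master("local[*]")            // local mode, for illustration only
  .getOrCreate()

// Backwards-compatible handle: SQLContext still exists in Spark 2.x
// as a thin wrapper around SparkSession, so legacy code compiles as-is.
val sqlContext = spark.sqlContext

sqlContext.sql(
  "SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy HH:mm:ss.ss')"
).collect.foreach(println)
```

Defining that one `val` near the top of the code (or importing it where needed) avoids rewriting every sqlContext call site.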

On Mon, Aug 8, 2016 at 9:11 AM, Mich Talebzadeh <mich.talebzadeh@gmail.com>
wrote:

> Hi,
>
> In Spark 1.6.1 this worked
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy
> HH:mm:ss.ss') ").collect.foreach(println)
> [08/08/2016 14:07:22.22]
>
> Shouldn't Spark 2 give the same result, for backward compatibility?
>
> But I get
>
> scala> sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy
> HH:mm:ss.ss') ").collect.foreach(println)
> <console>:24: error: not found: value sqlContext
>        sqlContext.sql("SELECT FROM_unixtime(unix_timestamp(), 'dd/MM/yyyy
> HH:mm:ss.ss') ").collect.foreach(println)
>
> We can change it to HiveContext and it works.
>
> However, what is the best solution, if any, given that we have loads of
> sqlContext calls in our code?
>
> Thanks
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
