spark-user mailing list archives

From Abhi Basu <9000r...@gmail.com>
Subject Re: sbt assembly with hive
Date Sat, 13 Dec 2014 03:18:19 GMT
I am getting the same message when trying to get a HiveContext in CDH 5.1
after enabling Spark. I think Spark should ship with Hive support enabled by
default, since the Hive metastore is a common way to share data, given the
popularity of Hive and of other SQL-over-Hadoop technologies such as Impala.
A minimal repro sketch is below.
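
For reference, this is roughly what I am running (the app name is a
placeholder of mine; the SHOW TABLES call is just a smoke test against the
metastore):

    from pyspark import SparkContext
    from pyspark.sql import HiveContext

    sc = SparkContext(appName="hive-metastore-test")  # placeholder app name
    hivectx = HiveContext(sc)
    # the JVM-side HiveContext is created lazily, so the error surfaces
    # on the first query rather than on construction
    hivectx.sql("SHOW TABLES").collect()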

Thanks,

Abhi

On Fri, Dec 12, 2014 at 6:40 PM, Stephen Boesch <javadba@gmail.com> wrote:
>
> What is the proper way to build with Hive from sbt? The SPARK_HIVE flag is
> deprecated. However, after running the following:
>
>    sbt -Pyarn -Phadoop-2.3 -Phive  assembly/assembly
>
> And then
>   bin/pyspark
>
>    hivectx = HiveContext(sc)
>
>    hivectx.hiveql("select * from my_table")
>
> Exception: ("You must build Spark with Hive. Export 'SPARK_HIVE=true' and
> run sbt/sbt assembly", Py4JError(u'Trying to call a package.',))
>
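
FWIW, once the assembly is actually built with -Phive, I would expect the
session above to go through. A hedged sketch of the full sequence (untested
on my end, and assuming sql() is the replacement for the deprecated hiveql(),
if I am reading the deprecation warnings right):

    # rebuild the assembly with Hive support, then start the shell:
    #   sbt -Pyarn -Phadoop-2.3 -Phive assembly/assembly
    #   bin/pyspark
    from pyspark.sql import HiveContext

    hivectx = HiveContext(sc)  # sc is predefined in the pyspark shell
    hivectx.sql("select * from my_table")  # sql() in place of hiveql()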


-- 
Abhi Basu
