spark-user mailing list archives

From Jacek Laskowski <ja...@japila.pl>
Subject Re: Why Spark 2.2.1 still bundles old Hive jars?
Date Mon, 11 Dec 2017 12:02:51 GMT
Hi,

https://issues.apache.org/jira/browse/SPARK-19076
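On the compatibility question: even though Spark executes queries with its
bundled Hive 1.2.1 fork, it can already connect to a newer Hive metastore.
A minimal sketch, assuming you only need metastore connectivity rather than
replacing the bundled execution jars; spark.sql.hive.metastore.version and
spark.sql.hive.metastore.jars are standard Spark SQL configs, while the
version string and app name below are just placeholders:

  import org.apache.spark.sql.SparkSession

  // Spark ships a forked Hive 1.2.1 for query execution, but it can talk
  // to a different metastore version when given matching client jars.
  val spark = SparkSession.builder()
    .appName("hive-metastore-demo") // hypothetical app name
    .enableHiveSupport()
    // Metastore version to connect to (assumed; must match your deployment).
    .config("spark.sql.hive.metastore.version", "2.1.1")
    // "maven" downloads matching Hive client jars at runtime; a classpath
    // of local Hive jars can be supplied here instead.
    .config("spark.sql.hive.metastore.jars", "maven")
    .getOrCreate()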

Regards,
Jacek Laskowski
----
https://about.me/JacekLaskowski
Spark Structured Streaming https://bit.ly/spark-structured-streaming
Mastering Apache Spark 2 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski

On Mon, Dec 11, 2017 at 7:43 AM, An Qin <aqin@qilinsoft.com> wrote:

> Hi, all,
>
> I want to include Sentry 2.0.0 in my Spark project, but it bundles
> Hive 2.3.2. I see that the newest Spark 2.2.1 still bundles old Hive jars,
> for example hive-exec-1.2.1.spark2.jar. Why hasn't it been upgraded to the
> newer Hive? Are they compatible?
>
> Regards,
>
> Qin An.
