spark-dev mailing list archives

From Michael Armbrust <mich...@databricks.com>
Subject Re: Is there any document to explain how to build the hive jars for spark?
Date Sun, 14 Dec 2014 19:36:26 GMT
The modified version of hive can be found here:
https://github.com/pwendell/hive
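
For context, the protobuf shading Patrick describes below is the kind of thing done with a maven-shade-plugin relocation in hive's own build. A minimal sketch, with the caveat that the shadedPattern, plugin wiring, and filter patterns here are illustrative assumptions rather than the exact values used in that repo:

```xml
<!-- Sketch only: relocate protobuf classes inside the published hive jars
     so they cannot conflict with a Hadoop-provided protobuf. The
     shadedPattern is an assumed example; check the pwendell/hive build
     for the real value. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <shadedPattern>org.spark-project.protobuf</shadedPattern>
          </relocation>
        </relocations>
        <!-- For difference (2): keep only hive's own packages in the
             hive-exec jar instead of bundling its transitive deps.
             The include pattern is an illustrative assumption. -->
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <includes>
              <include>org/apache/hadoop/hive/**</include>
            </includes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
```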

On Thu, Dec 11, 2014 at 5:47 PM, Yi Tian <tianyi.asiainfo@gmail.com> wrote:
>
> Hi, all
>
> We found some bugs in hive-0.12, but we cannot wait for the hive community
> to fix them.
>
> We want to fix these bugs in our lab and build a new release that spark
> can recognize.
>
> As we know, spark depends on a special release of hive, like:
>
> <dependency>
>   <groupId>org.spark-project.hive</groupId>
>   <artifactId>hive-metastore</artifactId>
>   <version>${hive.version}</version>
> </dependency>
>
> The difference between org.spark-project.hive and org.apache.hive was
> described by Patrick:
>
> There are two differences:
>
> 1. We publish hive with a shaded protobuf dependency to avoid
> conflicts with some Hadoop versions.
> 2. We publish a proper hive-exec jar that only includes hive packages.
> The upstream version of hive-exec bundles a bunch of other random
> dependencies in it which makes it really hard for third-party projects
> to use it.
>
> Is there any document to guide us on how to build the hive jars for spark?
>
> Any help would be greatly appreciated.
>
>
