spark-user mailing list archives

From Artemis User <arte...@dtechspace.com>
Subject Re: Spark hive build and connectivity
Date Thu, 22 Oct 2020 17:31:10 GMT
By default Spark builds with Hive 2.3.7, according to the Spark build
doc.  If you want to replace it with a different Hive jar, you need to
change the Maven pom.xml file.
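
For reference, the build command from the Spark build doc with the Hive
profiles looks like this (a sketch; add -Pyarn or a -Dhadoop.version
flag as your environment requires):

    ./build/mvn -Phive -Phive-thriftserver -DskipTests clean package

The Hive version Spark compiles against is the <hive.version> property
in pom.xml.  Note that Spark's Hive integration code targets the 2.3.x
line, so simply bumping that property to a 3.x version is not
guaranteed to build cleanly.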

-- ND

On 10/22/20 11:35 AM, Ravi Shankar wrote:
> Hello all,
> I am trying to understand how the Spark SQL integration with Hive 
> works. Whenever I build Spark with the -Phive -Phive-thriftserver 
> options, I see that it is packaged with hive-2.3.7*.jars and 
> spark-hive*.jars, and the documentation claims that Spark can talk to 
> different versions of Hive. If that is the case, what should I do if 
> I have Hive 3.2.1 running on my instance and I want my Spark 
> application to talk to that Hive cluster?
>
> Does this mean I have to build Spark with Hive version 3.2.1, or, as 
> the documentation states, is it enough to just add the metastore 
> jars to spark-defaults.conf?
>
> Should I add my Hive 3.2.1 lib to SPARK_DIST_CLASSPATH as well? 
> Will there be conflicts between the Hive 2.3.7 jars and the Hive 
> 3.2.1 jars I will have in this case?
>
>
> Thanks!
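
For the spark-defaults.conf route the question mentions: a minimal
sketch, assuming a Spark 3.x build, that /opt/hive/lib holds the
metastore client jars, and that the metastore version is one your
Spark release lists as supported (both the version and the path below
are placeholders):

    # spark-defaults.conf
    spark.sql.hive.metastore.version    3.1.2
    spark.sql.hive.metastore.jars       /opt/hive/lib/*

With these set, Spark keeps its built-in Hive 2.3.7 execution jars and
loads the external metastore client in an isolated classloader, which
is how it avoids mixing the two jar sets.  For the same reason you
generally should not put the whole Hive lib directory on
SPARK_DIST_CLASSPATH; putting both versions on the main classpath is
what creates conflicts.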

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

