spark-user mailing list archives

From Steve Loughran <ste...@hortonworks.com>
Subject Re: Spark's Guava pieces cause exceptions in non-trivial deployments
Date Mon, 18 May 2015 10:05:29 GMT

On 16 May 2015, at 04:39, Anton Brazhnyk <anton.brazhnyk@genesys.com> wrote:

For me it wouldn't help, I guess, because those newer classes would still be loaded by a
different classloader.
What did work for me with 1.3.1 was removing those classes from Spark's jar completely,
so they get loaded from the external Guava (the version I prefer) and by the classloader I expect.
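
(A quick way to verify which copy of Guava actually wins, before and after that kind of
jar surgery, is to ask the JVM where a Guava class came from. A minimal Scala sketch;
GuavaProbe is just an illustrative name, and it assumes Guava is on the classpath in a jar:)

    // Print which jar a Guava class was loaded from, and by which classloader.
    object GuavaProbe {
      def main(args: Array[String]): Unit = {
        val cls = Class.forName("com.google.common.base.Optional")
        // getCodeSource can be null for bootstrap-loaded classes;
        // for a jar-loaded Guava it gives the jar's path.
        println(cls.getProtectionDomain.getCodeSource.getLocation)
        println(cls.getClassLoader)
      }
    }

Run inside a job (e.g. via spark-submit), that shows whether the class resolved to Spark's
bundled copy or to your own.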


Note that Hadoop <= 2.6.0 won't work with Guava >= 17.0; see HADOOP-11032.

FWIW, Guava is a version nightmare across the Hadoop stack; almost as bad as protobuf.jar.
From Hadoop 2.7 onwards, Hadoop will run against later Guava versions; it'll just continue
to ship an older one to avoid breaking apps that expect it.
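
The usual way out on the application side is to shade Guava into a private namespace, so
your copy can't collide with whatever Spark and Hadoop ship. A minimal sketch with
sbt-assembly (the shaded.guava prefix is an arbitrary choice):

    // build.sbt -- relocate Guava classes inside the assembly jar
    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("com.google.common.**" -> "shaded.guava.@1").inAll
    )

Alternatively, Spark 1.3+ has the experimental spark.driver.userClassPathFirst /
spark.executor.userClassPathFirst settings to prefer the user's jars over Spark's, though
those can trip over other shared dependencies.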