spark-user mailing list archives

From Anton Brazhnyk <anton.brazh...@genesys.com>
Subject RE: Spark's Guava pieces cause exceptions in non-trivial deployments
Date Fri, 15 May 2015 02:38:16 GMT
The problem occurs with 1.3.1.
It still bundles the Function class (the one mentioned in the exception) in spark-network-common_2.10-1.3.1.jar.
Our current resolution is actually a downgrade to 1.2.2, which works fine.


From: Marcelo Vanzin <vanzin@cloudera.com>
Sent: Thursday, May 14, 2015 6:27 PM
To: Anton Brazhnyk
Cc: user@spark.apache.org
Subject: Re: Spark's Guava pieces cause exceptions in non-trivial deployments

What version of Spark are you using?
The bug you mention only concerns the Optional class (and a handful of others, but none of
the classes you're having problems with). All other Guava classes have been shaded since Spark
1.2, so you should be able to use your own version of Guava with no problems (aside from the
Optional classes).
Also, Spark 1.3 improved how the shading is done, so if you're on 1.2 I'd
recommend trying 1.3 before declaring defeat.
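
For reference, the shading Marcelo describes relocates Guava under a Spark-private package at build time. A user-side equivalent, if you ever need to shade your own Guava 16 away from Spark's copy, would be a maven-shade-plugin relocation along these lines (a hypothetical fragment for your application's pom, not Spark's actual build config; the shadedPattern is an arbitrary private package name):

    <!-- Hypothetical maven-shade-plugin relocation: rewrite our Guava's
         package so it cannot collide with any copy Spark bundles. -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>myapp.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </plugin>

After relocation your own code calls the renamed classes, so the JVM never sees two competing definitions of the same com.google.common type.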

On Thu, May 14, 2015 at 4:52 PM, Anton Brazhnyk <anton.brazhnyk@genesys.com>
wrote:
Greetings,

I have a relatively complex application in which Spark, Jetty and Guava (16) do not fit together.
An exception occurs when some components try to use a “mix” of Guava classes (including Spark’s
bundled pieces) that are loaded by different classloaders:
java.lang.LinkageError: loader constraint violation: when resolving method "com.google.common.collect.Iterables.transform(Ljava/lang/Iterable;Lcom/google/common/base/Function;)Ljava/lang/Iterable;"
the class loader (instance of org/eclipse/jetty/webapp/WebAppClassLoader) of the current class,
org/apache/cassandra/db/ColumnFamilyStore, and the class loader (instance of java/net/URLClassLoader)
for resolved class, com/google/common/collect/Iterables, have different Class objects for
the type com/google/common/base/Function used in the signature
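
A LinkageError like this means two classloaders each resolve their own copy of the same Guava class. A quick way to see how many copies are visible on a given classpath is to enumerate the class file as a resource (a diagnostic sketch; the class name DuplicateClassFinder is made up, and the Guava resource below is the one from the exception):

```java
import java.net.URL;
import java.util.Enumeration;

// Sketch: print every location from which a given class file is visible.
// More than one line of output for the same resource means duplicate
// copies are on the classpath. Prints nothing if the class is absent.
public class DuplicateClassFinder {
    public static void listCopies(String resource) throws Exception {
        Enumeration<URL> urls =
            Thread.currentThread().getContextClassLoader().getResources(resource);
        while (urls.hasMoreElements()) {
            System.out.println(resource + " -> " + urls.nextElement());
        }
    }

    public static void main(String[] args) throws Exception {
        String resource = args.length > 0
            ? args[0]
            : "com/google/common/base/Function.class";
        listCopies(resource);
    }
}
```

Running this inside the Jetty webapp and again with Spark's loader would show which jars each side resolves the Function class from.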

According to https://issues.apache.org/jira/browse/SPARK-4819 it’s not going to be fixed
until at least Spark 2.0, but maybe some workaround is possible?
Those classes are pretty simple and unlikely to change significantly in Guava,
so any “external” Guava could provide them.

So, could such problems be fixed if Spark’s pieces of Guava lived in a separate jar
that could be excluded from the mix (and substituted by an “external” Guava)?
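
One workaround sometimes tried for conflicts like this is to ask Spark to prefer the application's jars over its own when loading classes (hedged: these settings exist as of Spark 1.3, are marked experimental there, and do not help with the relocated Optional classes from SPARK-4819):

    spark.driver.userClassPathFirst    true
    spark.executor.userClassPathFirst  true

With these set in spark-defaults.conf (or via --conf), a Guava 16 shipped with the application can win over the copy on Spark's classpath, though it can just as easily introduce new conflicts, so it needs testing in the actual deployment.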


Thanks,
Anton



--
Marcelo