spark-user mailing list archives

From Daniel Duckworth <...@premise.com>
Subject Dependency Versions
Date Fri, 16 Aug 2013 02:02:37 GMT
*tl;dr How does Spark decide precedence when loading JARs? Is there a way
to force Spark executors to draw all dependencies from my fat JAR?*

Hello Spark users,

Recently, I've written a Spark task that uses *finagle* to make RPC calls.
The version of *finagle* I'm using (6.5.0) depends on *netty* 3.5.12.Final,
while *spark* 0.7.0 uses *netty* 3.5.3. I've adjusted my Maven POM to
prefer *netty* 3.5.12 and can successfully run the task with
`MASTER=local`, `MASTER=local[8]`, and `MASTER=spark://127.0.0.1:7077` (when
I start a Spark master and worker locally), but my attempts to run
remotely on EC2 against a standalone Spark cluster result in the
following *NoSuchMethodError* when my *finagle* client initializes:

INFO  [2013-08-16T00:58:13.572] [spark-akka.actor.default-dispatcher-3:12]
spark.scheduler.cluster.TaskSetManager: Loss was due to
java.lang.NoSuchMethodError:
org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(Ljava/util/concurrent/
Executor;I)V
        at
com.twitter.finagle.netty3.WorkerPool$.<init>(WorkerPool.scala:12)
        at com.twitter.finagle.netty3.WorkerPool$.<clinit>(WorkerPool.scala)
        at
com.twitter.finagle.netty3.Netty3Transporter$$anon$1.<init>(client.scala:225)
        at
com.twitter.finagle.netty3.Netty3Transporter$.<init>(client.scala:224)
        at
com.twitter.finagle.netty3.Netty3Transporter$.<clinit>(client.scala)

The remote Spark master and workers are using *netty* 3.5.3, so I suspect
that their copy of *netty* is shadowing the one bundled in my fat JAR.
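To confirm the shadowing, one quick diagnostic (a throwaway sketch; the class name `WhichJar` is just for illustration) is to ask the classloader where it actually resolved the offending class, and run that on an executor:

```java
// Diagnose JAR shadowing by asking the classloader which JAR a class
// was actually resolved from.
public class WhichJar {
    // Returns the URL of the .class resource, or null if the class is
    // not on the classpath at all.
    static java.net.URL locate(String className) {
        String resource = className.replace('.', '/') + ".class";
        return WhichJar.class.getClassLoader().getResource(resource);
    }

    public static void main(String[] args) {
        // Sanity check: always resolves (from the JDK itself).
        System.out.println("java.lang.String -> " + locate("java.lang.String"));
        // The class from the stack trace above: prints the URL of the
        // JAR that wins on the executor's classpath, or null if absent.
        System.out.println("NioWorkerPool -> "
            + locate("org.jboss.netty.channel.socket.nio.NioWorkerPool"));
    }
}
```

If the URL printed on a worker points at Spark's own lib directory rather than my fat JAR, that would pin down the precedence problem.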

Is there a way I can force Spark executors to draw dependencies from my fat
JAR, rather than what they already have?
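For what it's worth, one workaround I've been considering (not yet verified against the cluster) is relocating *netty* into a private namespace with the Maven Shade plugin, so that my fat JAR's copy can never collide with the executor's. The plugin version and pattern below are illustrative:

```xml
<!-- Sketch: relocate netty inside the fat JAR so Spark's own
     netty 3.5.3 cannot shadow it. Details here are illustrative. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.jboss.netty</pattern>
            <shadedPattern>shaded.org.jboss.netty</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

As I understand it, the shade plugin rewrites the bytecode of everything in the fat JAR (including *finagle*'s classes) to reference the relocated package, so both copies of *netty* can coexist.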

Thanks!

- Daniel Duckworth
