spark-user mailing list archives

From Tobias Pfeiffer <>
Subject Re:
Date Thu, 15 Jan 2015 01:39:29 GMT

On Thu, Jan 15, 2015 at 12:23 AM, Ted Yu <> wrote:
> On Wed, Jan 14, 2015 at 6:58 AM, Jianguo Li <>
> wrote:
>> I am using Spark-1.1.1. When I used "sbt test", I ran into the following exceptions.
>> Any idea how to solve it? Thanks! I think somebody posted this question before, but no
>> one seemed to have answered it. Could it be the version of "io.netty" I put in my
>> build.sbt? I included a dependency `libraryDependencies += "io.netty" % "netty" %
>> "3.6.6.Final"` in my build.sbt file.

From my personal experience, netty dependencies are very painful to get
right with Spark. I recommend looking at the dependency tree using <> and then
fine-tuning your sbt exclusions until it works. There are too many issues, depending
on what other packages you use, to give general advice, I'm afraid.
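As a rough sketch of what such an exclusion can look like in build.sbt (illustrative only; the Spark version and the organizations to exclude are assumptions here, and the right set depends on what your own dependency tree shows):

```scala
// build.sbt -- illustrative sketch, not a definitive recipe.
// Keep Spark's transitive Netty artifacts off the classpath so that
// the version declared below is the only one sbt resolves.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1" excludeAll(
  ExclusionRule(organization = "io.netty"),
  ExclusionRule(organization = "org.jboss.netty")
)

// The Netty version your other libraries actually need.
libraryDependencies += "io.netty" % "netty" % "3.6.6.Final"
```

After adjusting exclusions, re-inspect the dependency tree (for example with the sbt-dependency-graph plugin's `dependencyTree` task) to confirm only one Netty version remains.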

And once you have them right and use `sbt assembly` to build your
application jar, then want to run it on a cluster with spark-submit, you'll
find that the netty version bundled with Spark is put on the classpath
ahead of the version you want to use. It seems there are various Spark
configuration options to change this, and I think work to unify this
behavior is in progress.
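A hedged sketch of how those options can be passed at submit time (the `userClassPathFirst` settings were experimental and their names have changed across Spark releases, so check the docs for your version; the class name, master URL, and jar name below are placeholders):

```shell
# Illustrative only: ask Spark to prefer classes from your assembly jar
# over its own bundled versions. Verify these option names against the
# configuration documentation for the Spark release you actually run.
spark-submit \
  --class com.example.Main \
  --master spark://master:7077 \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  my-app-assembly.jar
```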

I'm also looking forward to this one, as I am stuck with an ancient version
of Finagle due to these Netty issues.

Good luck,
