spark-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re:
Date Wed, 14 Jan 2015 15:23:31 GMT
From pom.xml (master branch):
      <dependency>
        <groupId>io.netty</groupId>
        <artifactId>netty-all</artifactId>
        <version>4.0.23.Final</version>
      </dependency>

Please check which version of netty Spark 1.1.1 depends on.
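For context: the missing class io.netty.util.TimerTask belongs to the Netty 4 line (the "netty-all" artifact), while Netty 3.x ships its classes under org.jboss.netty.*, so a pin on "io.netty" % "netty" % "3.6.6.Final" cannot provide the Netty 4 classes Spark needs. A minimal build.sbt sketch of the fix (the 4.0.23.Final version is taken from the master pom quoted above and is an assumption here; verify what Spark 1.1.1 actually resolves to):

```scala
// build.sbt -- sketch, not a verified configuration for Spark 1.1.1.

libraryDependencies ++= Seq(
  // Spark itself; it pulls in its own netty-all transitively.
  "org.apache.spark" %% "spark-core" % "1.1.1",

  // If Netty is needed directly, depend on the 4.x "netty-all" artifact
  // (io.netty.* packages), not the 3.x "netty" artifact
  // (org.jboss.netty.* packages). The version below matches the master
  // pom above; align it with whatever Spark 1.1.1 brings in.
  "io.netty" % "netty-all" % "4.0.23.Final"
)
```

Running "sbt dependencyTree" (or "sbt evicted") afterwards shows which Netty version actually wins on the test classpath.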

Cheers

On Wed, Jan 14, 2015 at 6:58 AM, Jianguo Li <flyingfromchina@gmail.com>
wrote:

> I am using Spark 1.1.1. When I ran "sbt test", I hit the following exceptions.
> Any idea how to solve this? Thanks! I think somebody posted this question before, but no one
> seems to have answered it. Could it be the version of "io.netty" I put in my build.sbt? I
> included the dependency "libraryDependencies += "io.netty" % "netty" % "3.6.6.Final"" in my
> build.sbt file.
>
> java.lang.NoClassDefFoundError: io/netty/util/TimerTask
>       at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:72)
>       at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:168)
>       at org.apache.spark.SparkEnv$.create(SparkEnv.scala:230)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:204)
>       at spark.jobserver.util.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:34)
>       at spark.jobserver.JobManagerActor.createContextFromConfig(JobManagerActor.scala:255)
>       at spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:104)
>       at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
>       at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
>       at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25).......
>
>
