spark-user mailing list archives

From "S. Zhou" <myx...@yahoo.com.INVALID>
Subject Issue when upgrading from Spark 1.1.0 to 1.1.1: Exception of java.lang.NoClassDefFoundError: io/netty/util/TimerTask
Date Wed, 10 Dec 2014 18:45:04 GMT
Everything worked fine on Spark 1.1.0 until we upgraded to 1.1.1. Some of our unit tests
now fail with the following exception. Any idea how to solve it? Thanks!
java.lang.NoClassDefFoundError: io/netty/util/TimerTask
      at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:72)
      at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:168)
      at org.apache.spark.SparkEnv$.create(SparkEnv.scala:230)
      at org.apache.spark.SparkContext.<init>(SparkContext.scala:204)
      at spark.jobserver.util.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:34)
      at spark.jobserver.JobManagerActor.createContextFromConfig(JobManagerActor.scala:255)
      at spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:104)
      at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
      at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
      at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
      ...
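For context: a NoClassDefFoundError for io/netty/util/TimerTask means the Netty 4 classes (io.netty.*) are not on the test classpath when BlockManager initializes. One possible cause is that the build only pulls in the older Netty 3 artifact (org.jboss.netty). A hedged sketch of one workaround, assuming an sbt build for the job-server tests (the exact Netty version is an assumption here and should be taken from Spark 1.1.1's own pom.xml):

```scala
// Hypothetical build.sbt fragment: add the Netty 4 artifact explicitly so
// io.netty.util.TimerTask resolves on the test classpath.
// "4.0.23.Final" is an assumed version -- verify against the pom.xml of
// the Spark 1.1.1 distribution you actually run against.
libraryDependencies += "io.netty" % "netty-all" % "4.0.23.Final"
```

Inspecting the resolved dependency graph (e.g. `mvn dependency:tree` for a Maven build, or sbt's dependency-graph output) can confirm whether the io.netty artifact is missing entirely or being evicted by another dependency.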

