spark-user mailing list archives

From Aniket Bhatnagar <aniket.bhatna...@gmail.com>
Subject Re: ClosureCleaner should use ClassLoader created by SparkContext
Date Wed, 21 Jan 2015 12:26:43 GMT
Here is the stack trace for reference. Notice that the failure occurs when the
job spawns a new thread.

java.lang.ClassNotFoundException: com.myclass$$anonfun$8$$anonfun$9
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366) ~[na:1.7.0_71]
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355) ~[na:1.7.0_71]
        at java.security.AccessController.doPrivileged(Native Method) ~[na:1.7.0_71]
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354) ~[na:1.7.0_71]
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425) ~[na:1.7.0_71]
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) ~[na:1.7.0_71]
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358) ~[na:1.7.0_71]
        at java.lang.Class.forName0(Native Method) ~[na:1.7.0_71]
        at java.lang.Class.forName(Class.java:274) ~[na:1.7.0_71]
        at org.apache.spark.util.InnerClosureFinder$$anon$4.visitMethodInsn(ClosureCleaner.scala:260) ~[org.apache.spark.spark-core_2.11-1.2.0.jar:1.2.0]
        at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.accept(Unknown Source) ~[com.esotericsoftware.reflectasm.reflectasm-1.07-shaded.jar:na]
        at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.accept(Unknown Source) ~[com.esotericsoftware.reflectasm.reflectasm-1.07-shaded.jar:na]
        at org.apache.spark.util.ClosureCleaner$.getInnerClasses(ClosureCleaner.scala:87) ~[org.apache.spark.spark-core_2.11-1.2.0.jar:1.2.0]
        at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:107) ~[org.apache.spark.spark-core_2.11-1.2.0.jar:1.2.0]
        at org.apache.spark.SparkContext.clean(SparkContext.scala:1435) ~[org.apache.spark.spark-core_2.11-1.2.0.jar:1.2.0]
        at org.apache.spark.rdd.RDD.map(RDD.scala:271) ~[org.apache.spark.spark-core_2.11-1.2.0.jar:1.2.0]
        at com.myclass.com$myclass$$load(myclass.scala:375) ~[na:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) ~[org.scala-lang.scala-library-2.11.5.jar:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) ~[org.scala-lang.scala-library-2.11.5.jar:na]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_71]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_71]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_71]
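The bottom of the trace (ThreadPoolExecutor running a scala.concurrent Future) is the key detail: pool threads capture their context classloader when they are created, not when a task runs on them. A minimal, self-contained sketch of that mechanic, with a plain URLClassLoader standing in for the loader holding jars added via SparkContext.addJar (an assumption for illustration, not Spark API):

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class PoolLoaderDemo {
    public static void main(String[] args) throws Exception {
        // Single-threaded pool: its worker thread is created on the first
        // submit and inherits the submitter's context classloader at that
        // moment.
        ExecutorService pool = Executors.newSingleThreadExecutor();
        pool.submit(() -> { }).get();  // force worker-thread creation now

        // Later, the application installs a new loader (stand-in for the one
        // that knows about dynamically added jars).
        ClassLoader jarsLoader = new URLClassLoader(new URL[0]);
        Thread.currentThread().setContextClassLoader(jarsLoader);

        // The already-created pool thread still carries its original loader,
        // so a Class.forName lookup driven by
        // Thread.currentThread().getContextClassLoader() (as in
        // InnerClosureFinder) would miss the dynamically added classes.
        boolean sees = pool.submit(() ->
                Thread.currentThread().getContextClassLoader() == jarsLoader).get();
        System.out.println("pool thread sees jars loader: " + sees);
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```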


On Wed Jan 21 2015 at 17:34:34 Aniket Bhatnagar <aniket.bhatnagar@gmail.com>
wrote:

> While implementing a Spark server, I realized that the thread's context
> classloader must be set to the dynamically created classloader so that
> ClosureCleaner can do its thing. Shouldn't ClosureCleaner use the
> classloader created by SparkContext (which has all jars dynamically added
> via SparkContext.addJar) instead of Thread.currentThread.getContextClassLoader
> when looking up classes in InnerClosureFinder?
>
> Thanks,
> Aniket
>
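Until ClosureCleaner resolves classes differently, the workaround implied above is to set the context classloader on each thread that will invoke RDD operations. A hedged sketch of that pattern; the URLClassLoader again stands in for the loader covering addJar'd jars, and the rdd.map call site is only indicated in a comment:

```java
import java.net.URL;
import java.net.URLClassLoader;

public class ContextLoaderFix {
    public static void main(String[] args) throws Exception {
        // Stand-in for the classloader that can resolve dynamically added
        // classes (assumption for illustration).
        final ClassLoader sparkJarsLoader =
                new URLClassLoader(new URL[0], ContextLoaderFix.class.getClassLoader());

        Thread worker = new Thread(() -> {
            // Without this line, ClosureCleaner's lookup through the thread's
            // inherited context loader can fail with the
            // ClassNotFoundException shown in the trace above.
            Thread.currentThread().setContextClassLoader(sparkJarsLoader);

            ClassLoader seen = Thread.currentThread().getContextClassLoader();
            System.out.println("worker sees spark loader: " + (seen == sparkJarsLoader));
            // ... the job would call rdd.map(...) here, and the closure's
            // anonymous classes would now resolve via this loader.
        });
        worker.start();
        worker.join();
    }
}
```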
