spark-user mailing list archives

From Yana Kadiyska <yana.kadiy...@gmail.com>
Subject ClassNotFound exception from closure
Date Wed, 17 Jun 2015 01:07:08 GMT
Hi folks,

running into a pretty strange issue -- I have a ClassNotFound exception
from a closure?! My code looks like this:

 val jRdd1 = table.map { cassRow =>
      val lst = List(cassRow.get[Option[Any]](0), cassRow.get[Option[Any]](1))
      Row.fromSeq(lst)
    }
    println("This one worked ..." + jRdd1.first.toString())

    println("SILLY -----------------------------------")
    val sillyRDD = sc.parallelize(1 to 100)
    val jRdd2 = sillyRDD.map { value =>
      val cols = (0 to 2).map(i => "foo").toList // 3 foos per row
      println("Values " + cols.mkString("|"))
      Row.fromSeq(cols)
    }
    println("This one worked too " + jRdd2.first.toString())

and the exception I see goes:

This one worked ...[Some(1234),Some(1434123162)]
SILLY -----------------------------------
Exception in thread "main" java.lang.ClassNotFoundException:
HardSparkJob$$anonfun$3$$anonfun$4
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:270)
        at org.apache.spark.util.InnerClosureFinder$$anon$4.visitMethodInsn(ClosureCleaner.scala:455)
        at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.accept(Unknown Source)
        at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.accept(Unknown Source)
        at org.apache.spark.util.ClosureCleaner$.getInnerClosureClasses(ClosureCleaner.scala:101)
        at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:197)
        at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:132)
        at org.apache.spark.SparkContext.clean(SparkContext.scala:1891)
        at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:294)
        at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:293)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
        at org.apache.spark.rdd.RDD.map(RDD.scala:293)
        at HardSparkJob$.testUnionViaRDD(SparkTest.scala:127)
        at HardSparkJob$.main(SparkTest.scala:104)
        at HardSparkJob.main(SparkTest.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


I don't quite know what to make of this error. The stack trace points at
my own code, at sillyRDD.map (SparkTest.scala:127).
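If it helps, the missing class name has the shape scalac gives a closure nested inside another closure (the `(0 to 2).map(i => "foo")` inside `sillyRDD.map`). Here is a minimal sketch, outside Spark with illustrative names, of that same nesting:

```scala
// Minimal sketch: the same closure-inside-a-closure shape as
// sillyRDD.map { value => (0 to 2).map(i => "foo") ... }.
// On Scala 2.10/2.11, scalac compiles each closure to its own class
// (names like NestedClosureDemo$$anonfun$1, and
// NestedClosureDemo$$anonfun$1$$anonfun$2 for the nested one);
// Spark's ClosureCleaner loads those classes by name with Class.forName,
// which is where the ClassNotFoundException surfaces.
object NestedClosureDemo {
  def main(args: Array[String]): Unit = {
    val outer: Int => List[String] = value => {
      val inner: Int => String = i => "foo" // nested closure
      (0 to 2).map(inner).toList
    }
    println(outer(1).mkString("|"))  // foo|foo|foo
    println(outer.getClass.getName)  // compiler-generated closure class name
  }
}
```

So my reading is that the cleaner is trying to load the inner closure's class and not finding it on the driver's classpath, but I don't see why that would fail for such a trivial map.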

I'm running the Spark 1.4 CDH prebuilt distribution, submitting with

bin/spark-submit --class HardSparkJob --master mesos://$MESOS_MASTER ../MyJar.jar

Any insight much appreciated.
