spark-user mailing list archives

From patcharee <Patcharee.Thong...@uni.no>
Subject Re: Kryo serialization of classes in additional jars
Date Fri, 26 Jun 2015 15:35:45 GMT
Hi,

I am having this problem on Spark 1.4. Do you have any ideas how to 
solve it? I tried using spark.executor.extraClassPath, but it did not help.

BR,
Patcharee

On 04. mai 2015 23:47, Imran Rashid wrote:
> Oh, this seems like a real pain.  You should file a JIRA; I didn't see 
> an open issue -- if nothing else, just to document the problem.
>
> As you've noted, the problem is that the serializer is created 
> immediately in the executors, right when the SparkEnv is created, but 
> the other jars aren't downloaded until later.  I think you could work 
> around it with some combination of pushing the jars to the cluster 
> manually and then using spark.executor.extraClassPath.
>
> On Wed, Apr 29, 2015 at 6:42 PM, Akshat Aranya <aaranya@gmail.com 
> <mailto:aaranya@gmail.com>> wrote:
>
>     Hi,
>
>     Is it possible to register Kryo serialization for classes
>     contained in jars that are added with "spark.jars"?  In my
>     experiment it doesn't seem to work, likely because the class
>     registration happens before the jar is shipped to the executor and
>     added to the classloader.  Here's the general idea of what I want
>     to do:
>
>     val sparkConf = new SparkConf(true)
>       .set("spark.jars", "foo.jar")
>       .setAppName("foo")
>       .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>
>     // register classes contained in foo.jar
>     sparkConf.registerKryoClasses(Array(
>       classOf[com.foo.Foo],
>       classOf[com.foo.Bar]))
>
>

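A minimal sketch of the workaround Imran describes: pre-distribute the jar to every node yourself, then put it on the classpath at launch time so the classes are already loadable when the Kryo serializer is instantiated in SparkEnv. The path /opt/jars/foo.jar and the class com.foo.Main are hypothetical placeholders; substitute your own.

```shell
# Step 1: copy the jar to the same path on every worker node yourself,
# e.g. with scp/rsync or your cluster's deployment tooling (hypothetical hosts):
#   for h in worker1 worker2; do scp foo.jar "$h":/opt/jars/foo.jar; done

# Step 2: launch with the jar on both the executor and driver classpaths,
# instead of relying on spark.jars to ship it after SparkEnv is created:
spark-submit \
  --class com.foo.Main \
  --conf spark.executor.extraClassPath=/opt/jars/foo.jar \
  --conf spark.driver.extraClassPath=/opt/jars/foo.jar \
  app.jar
```

With the jar on the classpath up front, the registerKryoClasses call in the original snippet should resolve com.foo.Foo and com.foo.Bar without hitting the classloader race.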
