spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Task not serializable
Date Fri, 05 Sep 2014 14:10:04 GMT
You can bring those classes out of the library and make them serializable
(implements Serializable). It is not the right way of doing it, but it solved
a few of my similar problems.
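
For illustration, a rough sketch of that idea (the class and method names
below are made up, not taken from any real library):

// Hypothetical non-serializable class from an external library.
class LibraryCleaner {
  def clean(line: String): String = line.trim
}

// Option 1: pull it into your own code and mix in Serializable,
// as suggested above.
class SerializableCleaner extends LibraryCleaner with Serializable

// Option 2: keep the library object out of the closure entirely by
// wrapping it; the @transient lazy val is re-created on each executor,
// so the non-serializable instance is never shipped with the task.
class CleanerWrapper extends Serializable {
  @transient private lazy val cleaner = new LibraryCleaner
  def clean(line: String): String = cleaner.clean(line)
}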

Thanks
Best Regards


On Fri, Sep 5, 2014 at 7:36 PM, Sarath Chandra <
sarathchandra.josyam@algofusiontech.com> wrote:

> Hi,
>
> I'm trying to migrate a map-reduce program to work with Spark, and I have
> ported it from Java to Scala. The map-reduce program basically loads an
> HDFS file and, for each line in the file, applies several transformation
> functions available in various external libraries.
>
> When I execute this over Spark, it throws "Task not serializable"
> exceptions for each and every class used from these external libraries. I
> added serialization to a few classes that are within my scope, but there
> are several other classes which are out of my scope, like
> org.apache.hadoop.io.Text.
>
> How do I overcome these exceptions?
>
> ~Sarath.
>
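
For the org.apache.hadoop.io.Text case mentioned in the quoted message, a
common workaround (sketched here with a hypothetical input path and an
existing SparkContext named sc) is to convert the Hadoop Writables to plain
Scala types right after reading, so nothing from org.apache.hadoop.io is
ever captured in a task closure:

import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapred.TextInputFormat

// sc.textFile already hands back plain Strings, one per line:
val lines = sc.textFile("hdfs:///path/to/input")

// If a Hadoop InputFormat must be used directly, drop the Writables
// immediately so they never travel inside a serialized task:
val linesFromWritables = sc
  .hadoopFile[LongWritable, Text, TextInputFormat]("hdfs:///path/to/input")
  .map { case (_, text) => text.toString }

// From here on, only plain Strings are shipped between tasks, and the
// external transformation functions can be applied per line as usual.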
