spark-user mailing list archives

From Alok Kumar <alok...@gmail.com>
Subject Re: Task not serializable
Date Fri, 05 Sep 2014 19:56:11 GMT
Hi,

See if this link helps:
http://stackoverflow.com/questions/22592811/scala-spark-task-not-serializable-java-io-notserializableexceptionon-when

Also, if you cannot get the source locally, try extending the class and
making your new child class Serializable.
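As a rough sketch of that child-class trick (LegacyEvaluator here is a hypothetical stand-in for a third-party class you cannot modify; the round-trip helper just exercises the same Java serialization mechanism Spark uses for closures):

```scala
import java.io._

// Hypothetical stand-in for a third-party class that is not Serializable
// and whose source you cannot modify (e.g. something from jeval).
class LegacyEvaluator {
  def evaluate(expr: String): Int = expr.split("\\+").map(_.trim.toInt).sum
}

// The suggested workaround: extend it and mix in Serializable.
// This only works when the parent class has an accessible no-arg
// constructor, because Java deserialization will call it.
class SerializableEvaluator extends LegacyEvaluator with Serializable

object SerializationDemo {
  // Write the object out and read it back, the way Spark ships closures.
  def roundTrip[T <: Serializable](obj: T): T = {
    val buf = new ByteArrayOutputStream()
    new ObjectOutputStream(buf).writeObject(obj)
    val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
    in.readObject().asInstanceOf[T]
  }

  def main(args: Array[String]): Unit = {
    val ev = roundTrip(new SerializableEvaluator)
    println(ev.evaluate("1 + 2")) // prints 3
  }
}
```

Note the caveat in the comment: if the parent class carries state or lacks a no-arg constructor, this trick silently loses data or fails at deserialization time.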

Thanks
Alok

On Fri, Sep 5, 2014 at 7:51 PM, Akhil Das <akhil@sigmoidanalytics.com>
wrote:

> Get the class source locally and make it Serializable:
> http://grepcode.com/file_/repository.cloudera.com/content/repositories/releases/com.cloudera.hadoop/hadoop-core/0.20.2-737/
> *org/apache/hadoop/io/Text.java*/?v=source
>
> PS: Some classes may require additional dependent classes to be serialized
> as well. Hopefully there is some other way of doing it.
>
>
> Thanks
> Best Regards
>
>
> On Fri, Sep 5, 2014 at 7:45 PM, Sarath Chandra <
> sarathchandra.josyam@algofusiontech.com> wrote:
>
>> Hi Akhil,
>>
>> I've done this for the classes which are in my scope. But what should I do
>> with classes that are out of my scope, for example org.apache.hadoop.io.Text?
>>
>> Also, I'm using several third-party libraries like "jeval".
>>
>> ~Sarath
>>
>>
>> On Fri, Sep 5, 2014 at 7:40 PM, Akhil Das <akhil@sigmoidanalytics.com>
>> wrote:
>>
>>> You can bring those classes out of the library and make them Serializable
>>> (implement Serializable). It is not the right way of doing it, though it
>>> solved a few of my similar problems.
>>>
>>> Thanks
>>> Best Regards
>>>
>>>
>>> On Fri, Sep 5, 2014 at 7:36 PM, Sarath Chandra <
>>> sarathchandra.josyam@algofusiontech.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> I'm trying to migrate a map-reduce program to work with Spark. I
>>>> migrated the program from Java to Scala. The map-reduce program basically
>>>> loads an HDFS file and, for each line in the file, applies several
>>>> transformation functions available in various external libraries.
>>>>
>>>> When I execute this over Spark, it throws "Task not serializable"
>>>> exceptions for each and every class being used from these external
>>>> libraries. I added serialization to a few classes which are in my scope,
>>>> but there are several other classes which are out of my scope, like
>>>> org.apache.hadoop.io.Text.
>>>>
>>>> How can I overcome these exceptions?
>>>>
>>>> ~Sarath.
>>>>
>>>
>>>
>>
>
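For third-party helpers like jeval's, the problem can often be sidestepped without touching the library at all: keep the non-serializable object out of the closure by constructing it per partition with `mapPartitions`. A hedged sketch (Helper is a hypothetical stand-in; the Spark call is shown in a comment, and the same body is applied to a plain Iterator so it can run standalone):

```scala
object PartitionDemo {
  // Hypothetical stand-in for a non-serializable third-party helper,
  // e.g. an expression evaluator from jeval.
  class Helper { def transform(s: String): String = s.toUpperCase }

  // Spark would try, and fail, to serialize a Helper captured in a closure.
  // Creating it inside mapPartitions means it is constructed on the
  // executor and never shipped over the wire:
  //
  //   rdd.mapPartitions { lines =>
  //     val h = new Helper          // one per partition, never serialized
  //     lines.map(h.transform)
  //   }
  //
  // The same function body, applied to a plain Iterator for illustration:
  def perPartition(lines: Iterator[String]): Iterator[String] = {
    val h = new Helper
    lines.map(h.transform)
  }
}
```

For org.apache.hadoop.io.Text specifically, the cheapest fix is usually to call `.toString` on each record as soon as it is read, so that only plain Strings, which are serializable, flow through the rest of the job.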


-- 
Alok Kumar
Email : alokawi@gmail.com
http://sharepointorange.blogspot.in/
http://www.linkedin.com/in/alokawi
