See if this link helps - 

Also, if you cannot get the source locally, try extending the class and making your new child class Serializable!
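A sketch of that subclassing approach, using a hypothetical non-serializable library class `LegacyText` as a stand-in for something like org.apache.hadoop.io.Text (I don't assume the Hadoop jar here). One caveat: fields declared in a non-serializable superclass are not serialized automatically, so the child class has to write and read them explicitly:

```java
import java.io.*;

// Hypothetical stand-in for a third-party class that does not
// implement Serializable (e.g. org.apache.hadoop.io.Text).
class LegacyText {
    private String value = "";
    public void set(String v) { value = v; }
    public String get() { return value; }
}

// Child class that adds Serializable, so Spark can ship instances
// of it to executors instead of the original class.
class SerializableText extends LegacyText implements Serializable {
    public SerializableText() {}
    public SerializableText(String v) { set(v); }

    // Fields of the non-serializable superclass are NOT serialized
    // by default, so carry the payload across explicitly.
    private void writeObject(ObjectOutputStream out) throws IOException {
        out.defaultWriteObject();
        out.writeUTF(get());
    }

    private void readObject(ObjectInputStream in)
            throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        set(in.readUTF());
    }
}

public class SubclassDemo {
    public static void main(String[] args) throws Exception {
        SerializableText t = new SerializableText("hello");

        // Round-trip through Java serialization to prove it works.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new ObjectOutputStream(bos).writeObject(t);
        ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()));
        SerializableText copy = (SerializableText) in.readObject();
        System.out.println(copy.get());  // prints "hello"
    }
}
```

Note that on deserialization Java invokes the no-argument constructor of the first non-serializable superclass, so this only works if the library class has an accessible no-arg constructor.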


On Fri, Sep 5, 2014 at 7:51 PM, Akhil Das <akhil@sigmoidanalytics.com> wrote:
Get the class source locally and make it Serializable.  http://grepcode.com/file_/repository.cloudera.com/content/repositories/releases/com.cloudera.hadoop/hadoop-core/0.20.2-737/org/apache/hadoop/io/Text.java/?v=source


PS: Some classes may require additional dependent classes to be serialized as well. Hopefully there is some other way of doing it.

Best Regards

On Fri, Sep 5, 2014 at 7:45 PM, Sarath Chandra <sarathchandra.josyam@algofusiontech.com> wrote:
Hi Akhil,

I've done this for the classes that are within my scope. But what do I do with classes that are outside my scope?
For example org.apache.hadoop.io.Text

Also, I'm using several third-party libraries like "jeval".


On Fri, Sep 5, 2014 at 7:40 PM, Akhil Das <akhil@sigmoidanalytics.com> wrote:
You can copy those classes out of the library and make them Serializable (implements Serializable). It is not the right way of doing it, though it solved a few of my similar problems.
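A related pattern that avoids copying library source at all is a small Serializable wrapper: store only a serializable representation of the object, and rebuild the library object lazily on the other side. A sketch, again using a hypothetical `LegacyText` as a stand-in for the non-serializable library class:

```java
import java.io.*;

// Hypothetical stand-in for a non-serializable library class.
class LegacyText {
    private String value = "";
    public void set(String v) { value = v; }
    public String get() { return value; }
}

// Serializable wrapper: only the String payload crosses the wire;
// the library object is rebuilt on demand after deserialization.
class TextWrapper implements Serializable {
    private final String payload;
    private transient LegacyText cached;  // never serialized

    TextWrapper(LegacyText t) { this.payload = t.get(); }

    LegacyText unwrap() {
        if (cached == null) {
            cached = new LegacyText();
            cached.set(payload);
        }
        return cached;
    }
}

public class WrapperDemo {
    public static void main(String[] args) throws Exception {
        LegacyText original = new LegacyText();
        original.set("spark");

        // Round-trip the wrapper, then recover the library object.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new ObjectOutputStream(bos).writeObject(new TextWrapper(original));
        TextWrapper back = (TextWrapper) new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray())).readObject();
        System.out.println(back.unwrap().get());  // prints "spark"
    }
}
```

This keeps the third-party jar untouched, at the cost of converting to and from the wrapper at the closure boundary.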

Best Regards

On Fri, Sep 5, 2014 at 7:36 PM, Sarath Chandra <sarathchandra.josyam@algofusiontech.com> wrote:

I'm trying to migrate a map-reduce program to work with Spark. I migrated the program from Java to Scala. The map-reduce program basically loads an HDFS file and, for each line in the file, applies several transformation functions available in various external libraries.

When I execute this on Spark, it throws "Task not serializable" exceptions for each and every class used from these external libraries. I added serialization to a few classes that are within my scope, but there are several other classes that are out of my scope, like org.apache.hadoop.io.Text.

How can I overcome these exceptions?
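For Text specifically, the usual fix is not to serialize it at all: convert each record to a plain serializable type (String) immediately after loading, before any closure captures it. The sketch below illustrates the principle in plain Java without Spark dependencies, using a hypothetical `LegacyText` stand-in and a `ship` helper that simulates sending an object to an executor:

```java
import java.io.*;

// Hypothetical stand-in for org.apache.hadoop.io.Text.
class LegacyText {
    private String value = "";
    public void set(String v) { value = v; }
    @Override public String toString() { return value; }
}

public class ConvertEarlyDemo {
    // Simulates shipping an object to an executor: it must serialize.
    static byte[] ship(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        new ObjectOutputStream(bos).writeObject(o);
        return bos.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        LegacyText t = new LegacyText();
        t.set("line1");

        // Shipping the raw library object fails: not Serializable.
        try {
            ship(t);
            System.out.println("unexpected");
        } catch (NotSerializableException e) {
            System.out.println("cannot ship LegacyText");
        }

        // Converting to String first (the analogue of mapping each
        // record with toString right after loading the file) works,
        // because String is Serializable.
        String s = t.toString();
        System.out.println(ship(s).length > 0 ? "shipped String" : "?");
    }
}
```

In the actual Spark job, the analogous move is to map each Text record to a String in the very first transformation after the load, so that only serializable values ever appear inside the closures that follow.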


Alok Kumar
Email : alokawi@gmail.com