spark-user mailing list archives

From Sean Owen <so...@cloudera.com>
Subject Re: Getting exception on JavaSchemaRDD; org.apache.spark.SparkException: Task not serializable
Date Sat, 22 Nov 2014 18:11:07 GMT
You are declaring an anonymous inner class here. It holds a reference to the
containing class even if you don't use that reference. If the closure cleaner
can't determine that the reference is unused, it will cause everything in the
outer class to be serialized. Try rewriting this as a named static nested class.
On Nov 22, 2014 5:23 PM, "vdiwakar.malladi" <vdiwakar.malladi@gmail.com>
wrote:

> Thanks for your prompt response.
>
> I'm not using anything in my map function; please see the code below. For
> sample purposes, I would like to use 'select * from
> '.
>
> This code worked for me in standalone mode, but when I integrated it with my
> web application, it started throwing the exception above.
>
> List<String> sdo = sdoData.map(new Function<Row, String>() {
>     public String call(Row row) {
>         //return row.getString(0);
>         return null;
>     }
> }).collect();
>
> Thanks in advance.
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Getting-exception-while-calling-map-method-on-JavaSchemaRDD-org-apache-spark-SparkException-Task-note-tp19558p19564.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>
