spark-user mailing list archives

From Martin Weindel <martin.wein...@gmail.com>
Subject Re: java.lang.AbstractMethodError
Date Thu, 03 Oct 2013 20:38:49 GMT
Hi Eduardo,

if you are using Spark 0.7.3, I remember that I additionally had to replace 
the class file at
spark-0.7.3/core/target/scala-2.9.3/classes/spark/api/java/function/
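
Just to illustrate what depends on that class file, here is a rough sketch of 
user code compiled against the Java API (the class name and the split logic 
are made up; depending on the Spark version, FlatMapFunction may live under 
spark.api.java.function instead, and in later releases it becomes an 
interface, so extends would turn into implements):

import java.util.Arrays;
import org.apache.spark.api.java.function.FlatMapFunction;

// Hypothetical user function: it must be compiled against the very same
// FlatMapFunction class file that the master and all workers load,
// otherwise deserialization fails on the workers.
public class SplitWords extends FlatMapFunction<String, String> {
    @Override
    public Iterable<String> call(String line) {
        return Arrays.asList(line.split(" "));
    }
}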

Martin

On 03.10.2013 22:35, Eduardo Berrocal wrote:
> Hi Martin,
>
> Yes, that is what it seems. However, it is unlikely that this is the case, 
> because I have all Spark classes in my home directory, which is mounted via 
> NFS on all nodes. Unless there is something else I am missing...
>
> Edu
>
>
> On Thu, Oct 3, 2013 at 3:29 PM, Martin Weindel 
> <martin.weindel@gmail.com> wrote:
>
>     Hi Eduardo,
>
>     it seems to me that your second problem is caused by inconsistent,
>     i.e. different, classes in the master and worker JVMs.
>     Are you sure that you have replaced the changed FlatMapFunction
>     on all worker nodes and also on the master?
>
>     Regards,
>     Martin
>
>>     13/10/03 13:27:44 INFO cluster.ClusterTaskSetManager: Lost TID 0 (task
>>     1.0:0)
>>     13/10/03 13:27:44 INFO cluster.ClusterTaskSetManager: Loss was due to
>>     java.io.InvalidClassException
>>     java.io.InvalidClassException:
>>     org.apache.spark.api.java.function.FlatMapFunction; local class
>>     incompatible: stream classdesc serialVersionUID = -1748278142466443391,
>>     local class serialVersionUID = 2220150375729402137
>
>
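
P.S. regarding the InvalidClassException in the quoted log: Java serialization 
derives a serialVersionUID from the class structure unless one is declared 
explicitly, so two differently built copies of FlatMapFunction get the two 
different numbers you see, and the worker rejects the deserialization. A quick 
way to check which build each JVM actually loads is to print the UID on both 
sides. This is only a rough diagnostic sketch (the class name below is taken 
from your stack trace, everything else is mine); run it once with the driver's 
classpath and once with a worker's classpath and compare the output:

import java.io.ObjectStreamClass;

// Prints the serialVersionUID that the local classpath yields for the class
// named on the command line, e.g.
//   java CheckUid org.apache.spark.api.java.function.FlatMapFunction
public class CheckUid {
    public static void main(String[] args) throws Exception {
        Class<?> c = Class.forName(args[0]);
        ObjectStreamClass desc = ObjectStreamClass.lookup(c);
        if (desc == null) {
            System.out.println("class is not serializable on this classpath");
        } else {
            System.out.println(desc.getSerialVersionUID());
        }
    }
}

If the two runs print different numbers, the master and the workers are not 
loading the same build of the class.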

