spark-dev mailing list archives

From Gerard Maas <gerard.m...@gmail.com>
Subject Re: Modifying RDD.scala
Date Thu, 28 Nov 2013 11:29:58 GMT
Hi Zuhair,

Judging from the exception, you have two different versions of the class
somewhere. Do you get the same behavior when you run on a single node?
Maybe the Spark veterans here have more specific tips for you.

kr, Gerard.
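The mismatch discussed in this thread comes from Java's default serialization behavior: when a class does not declare a `serialVersionUID`, the JVM derives one from the class's structure, so adding even a dummy method to `RDD.scala` changes the computed UID and makes old streams unreadable. A minimal sketch of the fix, using hypothetical classes `Widget` and `WidgetWithExtraMethod` (not part of Spark), shows that pinning the UID keeps structurally different versions of a class stream-compatible:

```java
import java.io.ObjectStreamClass;
import java.io.Serializable;

// Two structurally different classes that pin the same serialVersionUID.
// Without the explicit field, the JVM derives the UID from the class shape
// (fields, methods, interfaces), so adding a method changes it and breaks
// deserialization of streams written by the old version.
class Widget implements Serializable {
    private static final long serialVersionUID = 1L;
    int x;
}

class WidgetWithExtraMethod implements Serializable {
    private static final long serialVersionUID = 1L;
    int x;
    int doubled() { return 2 * x; }  // new method; pinned UID is unchanged
}

public class SerialUidDemo {
    public static void main(String[] args) {
        // ObjectStreamClass reports the UID serialization will actually use.
        long a = ObjectStreamClass.lookup(Widget.class)
                                  .getSerialVersionUID();
        long b = ObjectStreamClass.lookup(WidgetWithExtraMethod.class)
                                  .getSerialVersionUID();
        System.out.println(a == b);  // prints true
    }
}
```

Of course, as Gerard notes below, the other half of the remedy is operational: the recompiled jars must reach every executor, since a pinned UID only papers over a mismatch when the old and new class layouts remain compatible.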


On Wed, Nov 27, 2013 at 5:00 PM, Zuhair Khayyat <zuhair.khayyat@gmail.com> wrote:

> Dear Gerard,
>
> All servers share the Spark binaries through NFS, so it is unlikely that
> other servers contain the old class. I will test later with one server and
> see if I get the same problem.
>
> Regards,
> Zuhair Khayyat
>
> On Nov 27, 2013, at 6:29 PM, Gerard Maas <gerard.maas@gmail.com> wrote:
>
> > From the looks of your exception, you modified your local class but
> > forgot to deploy those changes to the cluster. This error message:
> > classdesc serialVersionUID = 5151096093324583655, local class
> > serialVersionUID = 9012954318378784201
> >
> > indicates that the class being deserialized differs from the local
> > version. Make sure you deploy your changes across the whole Spark cluster.
> >
> > -kr, Gerard.
> >
> >
> > On Wed, Nov 27, 2013 at 4:22 PM, Zuhair Khayyat <zuhair.khayyat@gmail.com> wrote:
> >
> >> Dear Spark members,
> >>
> >> I am trying to start developing on the Spark source code. I have added a
> >> new dummy function in RDD.scala to test whether it compiles and runs. The
> >> modified Spark compiled correctly, but when I execute my code I get the
> >> following error:
> >>
> >> java.io.InvalidClassException: spark.RDD; local class incompatible: stream
> >> classdesc serialVersionUID = 5151096093324583655, local class
> >> serialVersionUID = 9012954318378784201
> >>        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
> >>        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
> >>        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1515)
> >>        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
> >>        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1515)
> >>        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1769)
> >>        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
> >>        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> >>        at spark.JavaDeserializationStream.readObject(JavaSerializer.scala:23)
> >>        at spark.scheduler.ShuffleMapTask$.deserializeInfo(ShuffleMapTask.scala:54)
> >>        at spark.scheduler.ShuffleMapTask.readExternal(ShuffleMapTask.scala:111)
> >>        at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1835)
> >>        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1794)
> >>        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
> >>        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> >>        at spark.JavaDeserializationStream.readObject(JavaSerializer.scala:23)
> >>        at spark.JavaSerializerInstance.deserialize(JavaSerializer.scala:45)
> >>        at spark.executor.Executor$TaskRunner.run(Executor.scala:96)
> >>        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >>        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >>        at java.lang.Thread.run(Thread.java:724)
> >> 13/11/27 17:47:43 ERROR executor.StandaloneExecutorBackend: Driver or
> >> worker disconnected! Shutting down.
> >>
> >> Can you please help me find out what went wrong? Thank you.
> >>
> >> Zuhair Khayyat
> >>
>
>
