spark-user mailing list archives

From Gerard Maas <>
Subject Re: ClassNotFoundException with Spark/Mesos (spark-shell works fine)
Date Wed, 21 May 2014 19:30:09 GMT
Hi Tobias,

On Wed, May 21, 2014 at 5:45 PM, Tobias Pfeiffer <> wrote:
>first, thanks for your explanations regarding the jar files!
No prob :-)

> On Thu, May 22, 2014 at 12:32 AM, Gerard Maas <>
> wrote:
> > I was discussing it with my fellow Sparkers here and I totally overlooked
> > the fact that you need the class files to de-serialize the closures (or
> > whatever) on the workers, so you always need the jar file delivered to
> the
> > workers in order for it to work.
> So the closure as a function is serialized, sent across the wire,
> deserialized there, and *still* you need the class files? (I am not
> sure I understand what is actually sent over the network then. Does
> that serialization only contain the values that I close over?)

I also had that mental lapse. Serialization converts object (not class) state
(the current field values) into a byte stream, and de-serialization restores
the bytes from the wire into a seemingly identical object at the receiving
side (except for transient variables). For that, the receiver requires the
class definition of that object to know what it needs to instantiate. So yes,
the compiled classes need to be given to the Spark driver, and it will take
care of dispatching them to the workers (much better than in the old RMI days
;-)
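A minimal sketch of what I mean (plain Java serialization, not Spark itself;
the `Captured` class and its fields are hypothetical): the bytes on the wire
carry only the object's field values, and `readObject()` looks the class up by
name on the receiving side — if the class file isn't on the classpath there,
you get exactly the ClassNotFoundException from the subject line.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerDemo {
    // Hypothetical closure-like class: only captured state is serialized.
    static class Captured implements Serializable {
        private static final long serialVersionUID = 1L;
        final int threshold;            // serialized: object state
        transient String scratch = "x"; // transient: not serialized
        Captured(int threshold) { this.threshold = threshold; }
    }

    public static void main(String[] args) throws Exception {
        // "Driver" side: write the object's state into a byte stream.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new Captured(42));
        }

        // "Worker" side: readObject() resolves Captured by name. If
        // Captured.class were missing from the classpath here, this call
        // would throw ClassNotFoundException -- which is why the jar must
        // reach the workers even though the closure itself was serialized.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            Captured c = (Captured) in.readObject();
            System.out.println(c.threshold);       // state restored
            System.out.println(c.scratch == null); // transient field lost
        }
    }
}
```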

> If I understand correctly what you are saying, then the documentation
> at
> (list item 8) needs to be extended quite a bit, right?

The mesos docs have been recently updated here:
Don't know where the latest version from master is built/available.

-kr, Gerard.
