spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Spark 1.5.0 Not able to submit jobs using cluster URL
Date Mon, 28 Sep 2015 09:26:29 GMT
Well, for some reason your test is picking up the older jar then. The best
way to sort this out is to create a build file for your project and declare
the dependencies there, rather than placing the jars manually.
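
For example, a minimal sketch of such a build file, assuming an sbt-managed
Scala job (the project name is hypothetical; if you stay with ant, the
equivalent Ivy dependency would be org.apache.spark#spark-core_2.10;1.5.0):

    // build.sbt -- minimal sketch; the project name is illustrative
    name := "my-spark-job"
    // Spark 1.5.0 is built against Scala 2.10 by default
    scalaVersion := "2.10.4"
    // "provided" keeps Spark out of your assembly jar, so the cluster's
    // own 1.5.0 jars are the ones used at runtime -- no stale copies
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

With a build file like this the dependency version lives in exactly one
place, so an upgrade cannot leave an old jar behind.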

Thanks
Best Regards

On Mon, Sep 28, 2015 at 2:44 PM, Lokesh Kumar Padhnavis <lokesh@dataken.net>
wrote:

> Thanks Akhil for the reply.
>
> I am using ant, and I placed the latest 1.5.0 jar in my code. I am
> actually testing this on my laptop, so there are only two places to
> change the version: once in Spark itself and once in my code. And I did
> that.
>
> On Mon, Sep 28, 2015 at 2:30 PM Akhil Das <akhil@sigmoidanalytics.com>
> wrote:
>
>> Update the dependency version in your job's build file. Also make sure
>> you have updated the Spark version to 1.5.0 everywhere (in the cluster
>> and in your code).
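>>
>> As a quick sanity check (a hypothetical minimal sketch, assuming a Scala
>> job; the object name and master URL are placeholders), you can print the
>> Spark version the driver classpath actually resolves while registering
>> with the master:
>>
>>     // VersionCheck.scala -- hypothetical sanity check
>>     import org.apache.spark.{SparkConf, SparkContext}
>>
>>     object VersionCheck {
>>       def main(args: Array[String]): Unit = {
>>         val conf = new SparkConf()
>>           .setAppName("version-check")
>>           .setMaster("spark://ip:7077") // replace with your cluster URL
>>         // registration with the master happens here; a driver/master
>>         // version mismatch fails at this point with the
>>         // InvalidClassException quoted below
>>         val sc = new SparkContext(conf)
>>         println("Driver is running Spark " + sc.version) // expect 1.5.0
>>         sc.stop()
>>       }
>>     }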
>>
>> Thanks
>> Best Regards
>>
>> On Mon, Sep 28, 2015 at 11:29 AM, lokeshkumar <lokesh@dataken.net> wrote:
>>
>>> Hi forum,
>>>
>>> I have just upgraded Spark from 1.4.0 to 1.5.0 and am running my old
>>> (1.4.0) jobs on 1.5.0 using the 'spark://ip:7077' cluster URL. But the
>>> job does not seem to start and errors out on the server with the
>>> incompatible class exception below:
>>>
>>> 15/09/28 11:20:07 INFO Master: 10.0.0.195:34702 got disassociated, removing it.
>>> 15/09/28 11:20:07 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkDriver@10.0.0.195:34702] has failed, address is now gated for [5000] ms. Reason: [org.apache.spark.deploy.DeployMessages$RegisterApplication; local class incompatible: stream classdesc serialVersionUID = 352674063933172066, local class serialVersionUID = -5495080032843259921]
>>> 15/09/28 11:20:27 ERROR ErrorMonitor: org.apache.spark.deploy.DeployMessages$RegisterApplication; local class incompatible: stream classdesc serialVersionUID = 352674063933172066, local class serialVersionUID = -5495080032843259921
>>> java.io.InvalidClassException: org.apache.spark.deploy.DeployMessages$RegisterApplication; local class incompatible: stream classdesc serialVersionUID = 352674063933172066, local class serialVersionUID = -5495080032843259921
>>>         at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:621)
>>>         at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1623)
>>>         at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
>>>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
>>>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>>>         at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
>>>         at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
>>>         at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
>>>         at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
>>>         at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
>>>         at scala.util.Try$.apply(Try.scala:161)
>>>         at akka.serialization.Serialization.deserialize(Serialization.scala:98)
>>>         at akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:63)
>>>         at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
>>>         at scala.util.Try$.apply(Try.scala:161)
>>>         at akka.serialization.Serialization.deserialize(Serialization.scala:98)
>>>         at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)
>>>         at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58)
>>>         at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58)
>>>         at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76)
>>>         at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:935)
>>>         at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
>>>         at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:411)
>>>         at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
>>>         at akka.actor.ActorCell.invoke(ActorCell.scala:487)
>>>         at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
>>>         at akka.dispatch.Mailbox.run(Mailbox.scala:220)
>>>         at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
>>>         at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>>>         at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>>>         at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>>>         at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>>>
>>>
>>> The error at the client is: 'Association with remote system [akka.tcp://sparkMaster@lokesh-lt:7077] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].'
>>>
>>> Please let me know if I am doing anything wrong.
>>>
>>>
>>>
