spark-user mailing list archives

From "Mihai Iacob" <mia...@ca.ibm.com>
Subject spark 2.2.1
Date Thu, 01 Feb 2018 16:30:36 GMT
<div class="socmaildefaultfont" dir="ltr" style="font-family:Arial, Helvetica, sans-serif;font-size:10.5pt"
><div dir="ltr" ><div class="p1" >I am setting up a spark 2.2.1 cluster, however,
when I bring up the master and workers (both on spark 2.2.1) I get this error. I tried spark
2.2.0 and get the same error. It works fine on spark 2.0.2. Have you seen this before, any
idea what's wrong?</div>
<div class="p1" >&nbsp;</div>
<div class="p1" >I found this, but it's in a different situation:&nbsp;<a href="https://github.com/apache/spark/pull/19802"
>https://github.com/apache/spark/pull/19802</a></div>
<div class="p1" >&nbsp;</div>
<p class="p1" ><span class="s1" >18/02/01 05:07:22 ERROR Utils: Exception encountered</span></p>
<p class="p1" ><span class="s1" >java.io.InvalidClassException: org.apache.spark.rpc.RpcEndpointRef;
local class incompatible: stream classdesc serialVersionUID = -1223633663228316618, local
class serialVersionUID = 1835832137613908542</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:687)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1885)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1885)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:563)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.deploy.master.WorkerInfo$$anonfun$readObject$1.apply$mcV$sp(WorkerInfo.scala:52)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.deploy.master.WorkerInfo$$anonfun$readObject$1.apply(WorkerInfo.scala:51)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.deploy.master.WorkerInfo$$anonfun$readObject$1.apply(WorkerInfo.scala:51)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1303)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.deploy.master.WorkerInfo.readObject(WorkerInfo.scala:51)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.lang.reflect.Method.invoke(Method.java:498)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1158)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2178)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at java.io.ObjectInputStream.readObject(ObjectInputStream.java:433)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.deploy.master.FileSystemPersistenceEngine.org$apache$spark$deploy$master$FileSystemPersistenceEngine$$deserializeFromFile(FileSystemPersistenceEngine.scala:80)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.deploy.master.FileSystemPersistenceEngine$$anonfun$read$1.apply(FileSystemPersistenceEngine.scala:56)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.deploy.master.FileSystemPersistenceEngine$$anonfun$read$1.apply(FileSystemPersistenceEngine.scala:56)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.deploy.master.FileSystemPersistenceEngine.read(FileSystemPersistenceEngine.scala:56)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.deploy.master.PersistenceEngine$$anonfun$readPersistedData$1.apply(PersistenceEngine.scala:87)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.deploy.master.PersistenceEngine$$anonfun$readPersistedData$1.apply(PersistenceEngine.scala:86)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp; &nbsp; &nbsp;
&nbsp; </span>at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:316)</span></p>
<p class="p1" ><span class="s1" ><span>&nbsp;&nbsp; &nbsp; &nbsp;
</span>packet_write_wait: Connection to 9.30.118.193 port 22: Broken pipeData(PersistenceEngine.scala:86)</span></p>​​​​​​​</div>
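One way to narrow down which side is stale is to print the serialVersionUID that the Spark jars on a given node compute for RpcEndpointRef and compare it with the two values in the exception. A minimal sketch (assuming the Spark jars are on the classpath; the object name CheckSerialVersionUID is just for illustration):

    import java.io.ObjectStreamClass

    object CheckSerialVersionUID {
      def main(args: Array[String]): Unit = {
        // RpcEndpointRef is private[spark], so load it reflectively rather
        // than referencing the class directly.
        val cls = Class.forName("org.apache.spark.rpc.RpcEndpointRef")
        // ObjectStreamClass reports the UID the local classpath would use
        // when deserializing; compare it against the "stream classdesc" and
        // "local class" values in the InvalidClassException above.
        val desc = ObjectStreamClass.lookup(cls)
        println(s"local serialVersionUID for ${cls.getName}: ${desc.getSerialVersionUID}")
      }
    }

Running this with the 2.2.1 jars on the master and on a worker would show whether the nodes themselves disagree, or whether the incompatible value is only in the persisted state being read back by FileSystemPersistenceEngine.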
<div dir="ltr" >&nbsp;</div>
<div dir="ltr" ><div class="socmaildefaultfont" dir="ltr" style="font-family:Arial,
Helvetica, sans-serif;font-size:10.5pt" ><div class="socmaildefaultfont" dir="ltr" style="font-family:Arial;font-size:10.5pt"
><div class="socmaildefaultfont" dir="ltr" style="font-family:Arial;font-size:10.5pt"
><div dir="ltr" id="tpl_normal_20150824133417" style="font-family:arial,helvetica,sans-serif;font-size:9pt;"
>&nbsp;
<table border="0" cellpadding="0" cellspacing="0" style="border:0;table-layout:auto;font-family:
arial,helvetica,sans-serif;white-space:normal;width:650px;font-size:7pt" width="680px" >
       <tbody>                <tr>                        <td style="vertical-align:
bottom;" >                        <div style="font-size:9pt" ><span style="font-size:1.143em;"
><span style="font-family:Times New Roman,Times,serif;" >Regards,</span></span></div>
                       &nbsp;

                        <div class="vcard" style="margin:8px 0 8px 0" ><span style="font-size:1.143em;"
><span style="font-family:Times New Roman,Times,serif;" ><b class="fn n" style="color:#888888;font-size:12pt"
><span class="given-name" >Mihai</span> <span class="family-name" >Iacob</span></b><br>
                       <a href="https://datascience.ibm.com/local" >DSX&nbsp;Local</a>
- Security,&nbsp;IBM Analytics</span></span></div>                 
      </td>                </tr>        </tbody></table></div></div></div></div></div></div><BR>


---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

