spark-user mailing list archives

From Matt Cheah <mch...@palantir.com>
Subject Serializable incompatible with Externalizable error
Date Tue, 03 Dec 2013 03:15:09 GMT
Hi everyone,

I'm running into a case where I'm creating a Java RDD of an Externalizable class, and getting
this stack trace:

java.io.InvalidClassException: com.palantir.finance.datatable.server.spark.WritableDataRow;
Serializable incompatible with Externalizable
	at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:634)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
	at java.io.ObjectInputStream.readClass(ObjectInputStream.java:1483)
	<some other Java stuff>
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:39)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:61)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:153)
I'm running on a Spark cluster launched by the EC2 scripts. This doesn't happen when I run
with local[N]. Any ideas?
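For context, this exception comes from Java serialization when the class descriptor in the
byte stream and the class loaded on the receiving JVM disagree about which interface the
class implements (one side sees Externalizable, the other plain Serializable) — which is why
it only shows up when executors deserialize with a different copy of the class than the
driver wrote. Below is a minimal Externalizable sketch round-tripped through plain Java
serialization, the same path as the stack trace above. The Row class and its fields are
hypothetical stand-ins for WritableDataRow, not the actual class:

```java
import java.io.*;

// Hypothetical stand-in for a row class like WritableDataRow.
// Externalizable classes MUST expose a public no-arg constructor,
// or deserialization fails at runtime.
public class Row implements Externalizable {
    private long id;
    private String name;

    public Row() {}  // required by Externalizable

    public Row(long id, String name) {
        this.id = id;
        this.name = name;
    }

    @Override
    public void writeExternal(ObjectOutput out) throws IOException {
        out.writeLong(id);
        out.writeUTF(name);
    }

    @Override
    public void readExternal(ObjectInput in) throws IOException {
        // Must read fields in the exact order writeExternal wrote them.
        id = in.readLong();
        name = in.readUTF();
    }

    public long getId() { return id; }
    public String getName() { return name; }

    // Serialize and deserialize within one JVM; on a cluster the read side
    // happens in an executor JVM with its own copy of the class, which is
    // where a mismatched (Serializable-only) classfile would blow up.
    public static Row roundTrip(Row r) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(r);
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            return (Row) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Row copy = roundTrip(new Row(7L, "seven"));
        System.out.println(copy.getId() + ":" + copy.getName()); // prints 7:seven
    }
}
```

If a class like this round-trips fine locally, the error on a cluster usually points at the
jars deployed to the workers rather than at the class itself.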
Thanks,
-Matt Cheah
