spark-issues mailing list archives

From "Richard Cross (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-3603) InvalidClassException on a Linux VM - probably problem with serialization
Date Wed, 01 Oct 2014 09:28:33 GMT

    [ https://issues.apache.org/jira/browse/SPARK-3603?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14154600#comment-14154600 ]

Richard Cross commented on SPARK-3603:
--------------------------------------

Hi, I'm working with Tomasz on the same project.  We have 3 ostensibly identical Linux servers
(same OS/kernel, same Java, same version of Spark).

Each one is a completely independent testing environment, running Spark 1.0.0 in standalone
mode with one Spark master and one worker.

The problem is that our application works on 1 machine and fails with the above error on
the other 2, and we cannot find what is different about those 2 machines that would
cause this error.  We think we have ruled out endianness, as all three machines are little-endian.
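One way to narrow down which machine disagrees would be to print the serialVersionUID that each local JVM computes for the class named in the exception, and compare the values across the three servers. The sketch below is only a diagnostic idea, not part of the original report; the class and method names are hypothetical, and a JDK class stands in for scala.reflect.ClassTag$$anon$1 so the snippet runs without Scala on the classpath:

```java
import java.io.ObjectStreamClass;
import java.io.Serializable;
import java.security.CodeSource;

public class SerialUidCheck {

    // Returns the serialVersionUID the local JVM sees for a class: the
    // declared value if present, otherwise the default computed from the
    // exact bytecode of the class (so different builds give different UIDs).
    static long uidOf(Class<? extends Serializable> c) {
        return ObjectStreamClass.lookup(c).getSerialVersionUID();
    }

    // Reports which jar (if any) a class was loaded from; bootstrap-loaded
    // classes have no CodeSource.  Differing paths across machines would
    // point to mismatched Spark/Scala jars on the classpath.
    static String sourceOf(Class<?> c) {
        CodeSource cs = c.getProtectionDomain().getCodeSource();
        return cs == null ? "(bootstrap)" : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        // On the cluster you would look up the class from the stack trace
        // (scala.reflect.ClassTag$$anon$1); java.util.ArrayList stands in
        // here purely so the example is self-contained.
        Class<? extends Serializable> probe = java.util.ArrayList.class;
        System.out.println(probe.getName() + " uid=" + uidOf(probe)
                + " from=" + sourceOf(probe));
    }
}
```

If the printed UID for the class in the exception differs between the working machine and the failing ones, the classpaths contain different builds of that class (for example, an app compiled against a different Scala patch level than the cluster is running).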

> InvalidClassException on a Linux VM - probably problem with serialization
> -------------------------------------------------------------------------
>
>                 Key: SPARK-3603
>                 URL: https://issues.apache.org/jira/browse/SPARK-3603
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.0.0, 1.1.0
>         Environment: Linux version 2.6.32-358.32.3.el6.x86_64 (mockbuild@x86-029.build.eng.bos.redhat.com)
(gcc version 4.4.7 20120313 (Red Hat 4.4.7-3) (GCC) ) #1 SMP Fri Jan 17 08:42:31 EST 2014
> java version "1.7.0_25"
> OpenJDK Runtime Environment (rhel-2.3.10.4.el6_4-x86_64)
> OpenJDK 64-Bit Server VM (build 23.7-b01, mixed mode)
> Spark (either 1.0.0 or 1.1.0)
>            Reporter: Tomasz Dudziak
>            Priority: Critical
>              Labels: scala, serialization, spark
>
> I have a Scala app connecting to a standalone Spark cluster. It works fine on Windows
or on a Linux VM; however, when I try to run the app and the Spark cluster on another Linux
VM (the same Linux kernel, Java and Spark - tested for versions 1.0.0 and 1.1.0) I get the
below exception. This looks kind of similar to the Big-Endian (IBM Power7) Spark Serialization
issue (SPARK-2018), but... my system is definitely little-endian and I understand the big-endian
issue should already be fixed in Spark 1.1.0 anyway. I'd appreciate your help.
> 01:34:53.251 WARN  [Result resolver thread-0][TaskSetManager] Lost TID 2 (task 1.0:2)
> 01:34:53.278 WARN  [Result resolver thread-0][TaskSetManager] Loss was due to java.io.InvalidClassException
> java.io.InvalidClassException: scala.reflect.ClassTag$$anon$1; local class incompatible:
stream classdesc serialVersionUID = -4937928798201944954, local class serialVersionUID = -8102093212602380348
>         at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
>         at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
>         at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1515)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1769)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1891)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1891)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1704)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1342)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1891)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1989)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1913)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
>         at org.apache.spark.scheduler.ShuffleMapTask$.deserializeInfo(ShuffleMapTask.scala:63)
>         at org.apache.spark.scheduler.ShuffleMapTask.readExternal(ShuffleMapTask.scala:135)
>         at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1835)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1794)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1348)
>         at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
>         at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:85)
>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:169)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:724)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
