[ https://issues.apache.org/jira/browse/SPARK-4133?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14189014#comment-14189014 ]
Josh Rosen commented on SPARK-4133:
-----------------------------------
Also, could you enable debug logging and share the executor logs? If you're able to reliably
reproduce this bug, please email me at joshrosen@databricks.com and I'd be glad to hop on
Skype to help you configure logging, etc.
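For reference, executor debug logging can usually be enabled by raising the root log level in conf/log4j.properties on each worker node. The sketch below is based on the log4j.properties.template shipped with Spark (the "console" appender name comes from that template; adjust it to match your actual configuration):

{code}
# conf/log4j.properties -- set the root category to DEBUG
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
{code}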
> PARSING_ERROR(2) when upgrading from 1.0.2 to 1.1.0
> ---------------------------------------------------
>
> Key: SPARK-4133
> URL: https://issues.apache.org/jira/browse/SPARK-4133
> Project: Spark
> Issue Type: Bug
> Components: Streaming
> Affects Versions: 1.1.0
> Reporter: Antonio Jesus Navarro
> Priority: Blocker
>
> Snappy-related problems found when trying to upgrade an existing Spark Streaming app from
1.0.2 to 1.1.0.
> We cannot run an existing 1.0.2 Spark app once it is upgraded to 1.1.0:
> > An IOException is thrown by Snappy (PARSING_ERROR(2))
> > Only the Spark version changed
> As far as we have checked, Snappy will throw this error when dealing with zero-length byte
arrays.
> We have tried:
> > Changing the compression codec from Snappy to LZF,
> > Setting broadcast compression to false,
> > Changing from TorrentBroadcast to HTTPBroadcast,
> but with no luck so far.
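> As a sketch, the three workarounds above correspond to the following Spark 1.1.0 properties (the exact property names here are our reading of the description, not verbatim from the reporter):
> {code}
> # spark-defaults.conf
> spark.io.compression.codec   org.apache.spark.io.LZFCompressionCodec
> spark.broadcast.compress     false
> spark.broadcast.factory      org.apache.spark.broadcast.HttpBroadcastFactory
> {code}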
> {code}
> [ERROR] 2014-10-29 11:23:26,396 [Executor task launch worker-0] org.apache.spark.executor.Executor
logError - Exception in task 0.0 in stage 0.0 (TID 0)
> java.io.IOException: PARSING_ERROR(2)
> at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:78)
> at org.xerial.snappy.SnappyNative.uncompressedLength(Native Method)
> at org.xerial.snappy.Snappy.uncompressedLength(Snappy.java:545)
> at org.xerial.snappy.SnappyInputStream.readFully(SnappyInputStream.java:125)
> at org.xerial.snappy.SnappyInputStream.readHeader(SnappyInputStream.java:88)
> at org.xerial.snappy.SnappyInputStream.<init>(SnappyInputStream.java:58)
> at org.apache.spark.io.SnappyCompressionCodec.compressedInputStream(CompressionCodec.scala:128)
> at org.apache.spark.broadcast.TorrentBroadcast$.unBlockifyObject(TorrentBroadcast.scala:232)
> at org.apache.spark.broadcast.TorrentBroadcast.readObject(TorrentBroadcast.scala:169)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
> at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:159)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
> {code}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)