spark-user mailing list archives

From Andrew Or <andrewo...@gmail.com>
Subject Re: FileNotFoundException on distinct()?
Date Tue, 21 Jan 2014 05:09:18 GMT
Hi Jiacheng,

What change did you make to your code? In particular, did you directly
create an ExternalAppendOnlyMap, or did you use it through an RDD operation?

The error that you got simply means that your code calls next() on
ExternalAppendOnlyMap's iterator when there are no more elements to be read
(i.e. hasNext is false). This should not happen if you use
ExternalAppendOnlyMap through the standard RDD operations such as
reduceByKey or cogroup.
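
For reference, here is a minimal sketch of the safe pattern (the RDD and its
data are hypothetical; reduceByKey exercises ExternalAppendOnlyMap internally
and checks hasNext before every next()):

    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._   // implicit PairRDDFunctions

    val sc = new SparkContext("local", "eaom-example")
    val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3)))

    // Standard path: reduceByKey builds an ExternalAppendOnlyMap per
    // partition and consumes its iterator correctly.
    val counts = pairs.reduceByKey(_ + _).collect()

    // The iterator contract the stack trace points to:
    val it = counts.iterator
    while (it.hasNext) {        // guard every next() with hasNext
      val (k, v) = it.next()
      println(k + " -> " + v)
    }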

Andrew


On Mon, Jan 20, 2014 at 4:22 AM, guojc <guojc03@gmail.com> wrote:

Hi,
  I'm trying out the latest master branch of Spark for the exciting external
hashmap feature. I have code that runs correctly on Spark 0.8.1, and I only
made a change so that it spills to disk more easily. However, I encounter a
few task failures with
java.util.NoSuchElementException (java.util.NoSuchElementException)
org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.next(ExternalAppendOnlyMap.scala:277)
org.apache.spark.util.collection.ExternalAppendOnlyMap$ExternalIterator.next(ExternalAppendOnlyMap.scala:212)
org.apache.spark.InterruptibleIterator.next(InterruptibleIterator.scala:29)
The job also seems unable to recover from these failures.
Can anyone give some suggestions on how to investigate the issue?
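
For reference, spilling can be encouraged with settings along these lines (a
sketch only; it assumes the spark.shuffle.spill and
spark.shuffle.memoryFraction properties on the current master, and the
fraction value is illustrative):

    import org.apache.spark.{SparkConf, SparkContext}

    // Lower the shuffle memory fraction so ExternalAppendOnlyMap spills
    // to disk sooner than it would with the defaults.
    val conf = new SparkConf()
      .setMaster("local")
      .setAppName("spill-repro")                    // hypothetical app name
      .set("spark.shuffle.spill", "true")           // external spilling on
      .set("spark.shuffle.memoryFraction", "0.05")  // illustrative value
    val sc = new SparkContext(conf)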
Thanks,
Jiacheng Guo
