spark-issues mailing list archives

From "Jakub Dubovsky (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-16599) java.util.NoSuchElementException: None.get at at org.apache.spark.storage.BlockInfoManager.releaseAllLocksForTask(BlockInfoManager.scala:343)
Date Thu, 02 Feb 2017 09:48:52 GMT

    [ https://issues.apache.org/jira/browse/SPARK-16599?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15849732#comment-15849732 ]

Jakub Dubovsky commented on SPARK-16599:
----------------------------------------

[~yetsun] Good. Do you use the same notebook?

The Spark developers above said that this NoSuchElementException should not be happening. So while this
looks like an issue in sparkNB, it may also indicate something wrong in Spark itself. An exception from
BlockInfoManager is not the expected way for a missing class definition (or a similar problem) to
manifest itself.
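
For context, the exception in the stack trace comes from calling .get on a scala.Option that is None.
The sketch below is only an illustration of that failure mode, not the actual BlockInfoManager source;
the registry name, lock values, and helper methods are made up. It shows how removing an entry for an
unregistered task attempt id and calling .get on the result throws
java.util.NoSuchElementException: None.get, and how a defensive getOrElse avoids it.

    import scala.collection.mutable

    // Hypothetical lock registry keyed by task attempt id -- an illustration only,
    // not the real BlockInfoManager code.
    object LockRegistryDemo {
      private val locksByTask = mutable.Map[Long, Seq[String]]()

      // Unsafe: Option.get on a missing key throws
      // java.util.NoSuchElementException: None.get (the failure in the stack trace).
      def releaseAllLocksUnsafe(taskAttemptId: Long): Seq[String] =
        locksByTask.remove(taskAttemptId).get

      // Defensive: treat an unregistered task as holding no locks.
      def releaseAllLocksSafe(taskAttemptId: Long): Seq[String] =
        locksByTask.remove(taskAttemptId).getOrElse(Seq.empty)

      def main(args: Array[String]): Unit = {
        locksByTask(1L) = Seq("rdd_0_0")
        println(releaseAllLocksSafe(1L))    // List(rdd_0_0)
        println(releaseAllLocksSafe(2L))    // List()
        println(releaseAllLocksUnsafe(2L))  // throws NoSuchElementException: None.get
      }
    }

Either way, the interesting question is why the task attempt id was never registered (or was already
cleaned up) by the time the executor hit this path; the bare None.get hides that root cause.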

> java.util.NoSuchElementException: None.get  at at org.apache.spark.storage.BlockInfoManager.releaseAllLocksForTask(BlockInfoManager.scala:343)
> ----------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16599
>                 URL: https://issues.apache.org/jira/browse/SPARK-16599
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.0
>         Environment: CentOS 6.7, Spark 2.0
>            Reporter: binde
>
> Running a Spark job on Spark 2.0 fails with the following error message:
> Job aborted due to stage failure: Task 0 in stage 821.0 failed 4 times, most recent failure:
> Lost task 0.3 in stage 821.0 (TID 1480, e103): java.util.NoSuchElementException: None.get
> 	at scala.None$.get(Option.scala:347)
> 	at scala.None$.get(Option.scala:345)
> 	at org.apache.spark.storage.BlockInfoManager.releaseAllLocksForTask(BlockInfoManager.scala:343)
> 	at org.apache.spark.storage.BlockManager.releaseAllLocksForTask(BlockManager.scala:644)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:281)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

