spark-issues mailing list archives

From "Matei Zaharia (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-2670) FetchFailedException should be thrown when local fetch has failed
Date Fri, 01 Aug 2014 07:03:38 GMT

     [ https://issues.apache.org/jira/browse/SPARK-2670?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Matei Zaharia updated SPARK-2670:
---------------------------------

    Priority: Major  (was: Critical)

> FetchFailedException should be thrown when local fetch has failed
> -----------------------------------------------------------------
>
>                 Key: SPARK-2670
>                 URL: https://issues.apache.org/jira/browse/SPARK-2670
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Kousuke Saruta
>            Assignee: Kousuke Saruta
>             Fix For: 1.1.0
>
>
> In BasicBlockFetchIterator, when a remote fetch fails, a FetchResult with size -1 is put into results:
> {code}
>         case None => {
>           logError("Could not get block(s) from " + cmId)
>           // A size of -1 marks every block of the failed request as a failed fetch
>           for ((blockId, size) <- req.blocks) {
>             results.put(new FetchResult(blockId, -1, null))
>           }
>         }
> {code}
> A size of -1 indicates a failed fetch, and BlockStoreShuffleFetcher#unpackBlock throws a FetchFailedException so that the fetch can be retried.
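>
> For reference, a minimal self-contained sketch of that mechanism (a simplified model, not the actual Spark classes) might look like this:
> {code}
> case class FetchResult(blockId: String, size: Long, deserialize: () => Iterator[Any])
> class FetchFailedException(msg: String) extends Exception(msg)
>
> // Simplified stand-in for BlockStoreShuffleFetcher#unpackBlock: a size of -1
> // marks a failed fetch and is turned into a FetchFailedException, which lets
> // the scheduler regenerate the missing map output and retry the fetch.
> def unpackBlock(result: FetchResult): Iterator[Any] = {
>   if (result.size == -1) {
>     throw new FetchFailedException("Could not fetch block " + result.blockId)
>   }
>   result.deserialize()
> }
> {code}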
> However, when a local fetch fails, no failed FetchResult is set, so the fetch cannot be retried.
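>
> A sketch of the kind of handling the local path could use, mirroring the remote case above (fetchLocalBlock and localBlocksToFetch are placeholder names, not the actual API):
> {code}
> for (blockId <- localBlocksToFetch) {
>   try {
>     // fetchLocalBlock stands in for the actual local read from the block manager
>     val iter = fetchLocalBlock(blockId)
>     results.put(new FetchResult(blockId, 0, () => iter))
>   } catch {
>     case e: Exception =>
>       logError("Could not get block " + blockId + " from local machine", e)
>       // A size of -1 marks the fetch as failed, so unpackBlock throws
>       // FetchFailedException and the reduce task can be retried
>       results.put(new FetchResult(blockId, -1, null))
>   }
> }
> {code}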



--
This message was sent by Atlassian JIRA
(v6.2#6252)
