spark-user mailing list archives

From Mark Hamstra <m...@clearstorydata.com>
Subject Re: Receiving intermediary results of a Spark operation
Date Mon, 04 Nov 2013 21:16:20 GMT
Sorry, meant to include the link:
http://spark.incubator.apache.org/docs/latest/scala-programming-guide.html#accumulators
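A minimal sketch of the accumulator approach (hedged: this assumes the 0.8-era `sc.accumulator` API, and the background thread, names, and polling interval are illustrative, not prescribed by Spark):

```scala
import org.apache.spark.SparkContext

object PartialProgress {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[4]", "PartialProgress")

    // Accumulator: tasks may only add to it; only the driver reads its value.
    // A task's updates are merged in when that task finishes, so the driver
    // observes progress at per-task (not per-record) granularity.
    val pairsDone = sc.accumulator(0)

    // Run the job in a background thread so the driver thread stays free
    // to poll the accumulator while tasks are still completing.
    val job = new Thread(new Runnable {
      def run() {
        sc.parallelize(1 to 100000, 16).map { x =>
          pairsDone += 1
          (x % 10, x)  // hypothetical (key, value) computation
        }.count()
      }
    })
    job.start()

    while (job.isAlive) {
      // Lower bound on pairs computed so far, updated as tasks finish.
      println("pairs computed so far: " + pairsDone.value)
      Thread.sleep(500)
    }
    job.join()
    sc.stop()
  }
}
```

Note that this gives coarse progress counts rather than the intermediate (key, value) pairs themselves; accumulators are write-only from the workers' side.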



On Mon, Nov 4, 2013 at 1:15 PM, Mark Hamstra <mark@clearstorydata.com> wrote:

> You probably want to be looking at accumulators.
>
>
> On Mon, Nov 4, 2013 at 12:22 PM, Markus Losoi <markus.losoi@gmail.com> wrote:
>
>> Hi
>>
>> Is it possible for a driver program to receive intermediary results of a
>> Spark operation? If, e.g., a long map() operation is in progress, can the
>> driver become aware of some of the (key, value) pairs before all of them
>> are computed?
>>
>> There seems to be a SparkListener interface that has an onTaskEnd() event
>> [1]. However, the documentation is somewhat sparse on what kind of
>> information is included in a SparkListenerTaskEnd object [2].
>>
>> [1]
>>
>> http://spark.incubator.apache.org/docs/0.8.0/api/core/org/apache/spark/scheduler/SparkListener.html
>> [2]
>>
>> http://spark.incubator.apache.org/docs/0.8.0/api/core/org/apache/spark/scheduler/SparkListenerTaskEnd.html
>>
>> Best regards,
>> Markus Losoi (markus.losoi@gmail.com)
>>
>>
>
