spark-user mailing list archives

From "Jahagirdar, Madhu" <madhu.jahagir...@philips.com>
Subject RE: Dstream Transformations
Date Mon, 06 Oct 2014 07:59:47 GMT
Given that I have multiple worker nodes, when Spark reschedules the job on the worker nodes that are still alive, does it store the data in Elasticsearch and then Flume again, or does it only run the function that stores to Flume?

Regards,
Madhu Jahagirdar

________________________________
From: Akhil Das [akhil@sigmoidanalytics.com]
Sent: Monday, October 06, 2014 1:20 PM
To: Jahagirdar, Madhu
Cc: user
Subject: Re: Dstream Transformations

AFAIK Spark doesn't restart worker nodes itself. You can have multiple worker nodes, and in that case, if one worker node goes down, Spark will try to recompute the lost RDDs on the workers that are still alive.
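
One concrete consequence (my reading of the fault-tolerance semantics, not something guaranteed above): recomputing a lost partition re-runs every function applied to it, including the output actions, so writes issued from foreachRDD can execute more than once. A common mitigation is to make the writes idempotent, e.g. derive a deterministic document ID from each record so a re-run overwrites rather than duplicates. A sketch, reusing the stream from the example in the quoted message below; upsertToElasticsearch is a hypothetical helper:

// Idempotent-write sketch: a stable per-record ID means a recomputed
// partition overwrites the same documents instead of appending duplicates.
stream.foreachRDD { rdd =>
  rdd.foreachPartition { records =>
    records.foreach { case (key, value) =>
      val docId = s"$key-${value.hashCode}" // assumption: (key, value) is stable across recomputations
      upsertToElasticsearch(docId, value)   // hypothetical helper, e.g. an ES index request with an explicit ID
    }
  }
}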

Thanks
Best Regards

On Sun, Oct 5, 2014 at 5:19 AM, Jahagirdar, Madhu <madhu.jahagirdar@philips.com> wrote:
In my Spark Streaming program I use KafkaUtils to receive data and store it in Elasticsearch and in Flume; both storing functions are applied to the same DStream. My question: what is the behavior of Spark if a worker node dies after storing the data in Elasticsearch but before storing it in Flume? Does it restart the worker and then store the data in Elasticsearch and then Flume again, or does it only run the function that stores to Flume?

Regards
Madhu Jahagirdar




