spark-dev mailing list archives

From Jun Feng Liu <liuj...@cn.ibm.com>
Subject Re: HA support for Spark
Date Wed, 10 Dec 2014 13:30:24 GMT
Well, it should not be mission impossible, considering how many HA
solutions exist today. I would be interested to know if there are any
specific difficulties.
 
Best Regards
 
Jun Feng Liu
IBM China Systems & Technology Laboratory in Beijing



Phone: 86-10-82452683 
E-mail: liujunf@cn.ibm.com


BLD 28,ZGC Software Park 
No.8 Rd.Dong Bei Wang West, Dist.Haidian Beijing 100193 
China 
 

 



Reynold Xin <rxin@databricks.com>
2014/12/10 16:30

To: Jun Feng Liu/China/IBM@IBMCN
Cc: "dev@spark.apache.org" <dev@spark.apache.org>
Subject: Re: HA support for Spark

This would be plausible for specific purposes such as Spark Streaming or
Spark SQL, but I don't think it is doable for the general Spark driver,
since it is just a normal JVM process with arbitrary program state.

On Wed, Dec 10, 2014 at 12:25 AM, Jun Feng Liu <liujunf@cn.ibm.com> wrote:

> Do we have any high-availability support at the Spark driver level? For
> example, we would like the Spark driver to be able to move to another node
> and continue execution when a failure happens. I can see that RDD
> checkpointing can help serialize the state of an RDD, and I can imagine
> loading the checkpoint from another node when an error happens, but it
> seems that would lose track of all the task status, and even the executor
> information, maintained in the SparkContext. I am not sure if there is any
> existing facility I can leverage for this. Thanks for any suggestions.
>
> Best Regards
>
>
> *Jun Feng Liu*
> IBM China Systems & Technology Laboratory in Beijing
>
>   ------------------------------
> *Phone:* 86-10-82452683
> *E-mail:* liujunf@cn.ibm.com
>
> BLD 28,ZGC Software Park
> No.8 Rd.Dong Bei Wang West, Dist.Haidian Beijing 100193
> China
>
>
>
>
>

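Reynold's point that driver recovery is plausible for Spark Streaming can be illustrated with the driver-recovery pattern Streaming supports: checkpoint the StreamingContext to a fault-tolerant store and rebuild it from that checkpoint on restart via `StreamingContext.getOrCreate`. The sketch below assumes a hypothetical checkpoint path and application name; the actual DStream logic is elided.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object DriverRecoverySketch {
  // Hypothetical checkpoint location; any fault-tolerant store (e.g. HDFS) works.
  val checkpointDir = "hdfs:///spark/checkpoints/my-app"

  // Builds a fresh context and defines the streaming computation.
  // Called only when no checkpoint exists yet.
  def createContext(): StreamingContext = {
    val conf = new SparkConf().setAppName("ha-sketch")
    val ssc = new StreamingContext(conf, Seconds(10))
    ssc.checkpoint(checkpointDir)
    // ... define DStream sources and transformations here ...
    ssc
  }

  def main(args: Array[String]): Unit = {
    // On (re)start, recover the streaming driver state from the checkpoint
    // if one exists; otherwise create a new context from scratch.
    val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
    ssc.start()
    ssc.awaitTermination()
  }
}
```

Combined with an external restart mechanism (for example, `spark-submit --supervise` in standalone cluster mode) this gives a limited form of driver HA for streaming jobs, but, as noted above, it does not generalize to arbitrary driver programs with arbitrary JVM state.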
