spark-user mailing list archives

From Marcelo Vanzin <van...@cloudera.com>
Subject Re: Does the delegator map task of SparkLauncher need to stay alive until Spark job finishes ?
Date Wed, 16 Nov 2016 02:20:01 GMT
On Tue, Nov 15, 2016 at 5:57 PM, Elkhan Dadashov <elkhan8502@gmail.com> wrote:
> This is confusing in the sense that, the client needs to stay alive for
> Spark Job to finish successfully.
>
> Actually the client can die or finish (in yarn-cluster mode), and the Spark
> job will still finish successfully.

That's an internal class, and you're looking at an internal javadoc
that describes how the app handle works. For the app handle to be
updated, the "client" (i.e. the sub process) needs to stay alive. So
the javadoc is correct. It has nothing to do with whether the
application succeeds or not.
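For context, the app-handle pattern being discussed looks roughly like this (a hedged sketch, not taken from the thread; the jar path, main class, and master are placeholders, while the `SparkLauncher` and `SparkAppHandle` names are from the public `org.apache.spark.launcher` API):

```java
// Illustrative sketch: monitoring a Spark app via SparkLauncher's handle.
// The handle is updated through a connection to the launcher's child
// process, so this JVM must stay alive to keep receiving updates -- but
// in yarn-cluster mode the submitted application itself keeps running
// even if this JVM exits.
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LauncherExample {
  public static void main(String[] args) throws Exception {
    SparkAppHandle handle = new SparkLauncher()
        .setAppResource("/path/to/app.jar")   // placeholder
        .setMainClass("com.example.MyApp")    // placeholder
        .setMaster("yarn")
        .setDeployMode("cluster")
        .startApplication();

    // Poll until the application reaches a terminal state. If this
    // process dies first, the handle simply stops being updated; the
    // application's success or failure is unaffected.
    while (!handle.getState().isFinal()) {
      Thread.sleep(1000);
    }
    System.out.println("Final state: " + handle.getState());
  }
}
```

This requires a Spark installation and the `spark-launcher` artifact on the classpath, so it is a sketch of the pattern rather than something runnable standalone.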


-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

