spark-user mailing list archives

From Shuxin Yang <shuxinyang....@gmail.com>
Subject how to kill application
Date Tue, 27 Mar 2018 05:17:49 GMT
Hi,

    I apologize if this question was asked before. I tried to find the
answer, but in vain.

    I'm running PySpark on Google Cloud Platform with Spark 2.2.0 and the
YARN resource manager.

    Let S1 be the set of application ids collected via curl
'http://127.0.0.1:18080/api/v1/applications?status=running' (the Spark
History Server REST API), and let S2 be the set of application ids
collected via 'yarn application -list'.
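
    For reference, this is roughly how I collect the two sets. It is only
a sketch: it assumes the History Server REST API is reachable at
127.0.0.1:18080 and that the yarn CLI is on PATH.

import json
import subprocess
import urllib.request

# S1: running application ids according to the Spark History Server.
url = "http://127.0.0.1:18080/api/v1/applications?status=running"
with urllib.request.urlopen(url) as resp:
    s1 = {app["id"] for app in json.load(resp)}

# S2: application ids according to YARN. 'yarn application -list' prints
# header lines first; real rows start with "application_".
out = subprocess.run(["yarn", "application", "-list"],
                     capture_output=True, text=True, check=True).stdout
s2 = {line.split()[0] for line in out.splitlines()
      if line.startswith("application_")}

print("S1 - S2:", s1 - s2)   # alive Spark app, dead YARN app
print("S2 - S1:", s2 - s1)   # alive YARN app, dead Spark app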

    Sometimes I find S1 != S2. How can this happen?

    For the applications in S2 - S1 (i.e., alive as a YARN app but dead
as a Spark app), I can kill them with 'yarn application -kill <app-id>'.
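
    Scripted, that cleanup step looks roughly like this, continuing the
sketch above:

# Kill every application that YARN still reports but Spark no longer does.
for app_id in s2 - s1:
    subprocess.run(["yarn", "application", "-kill", app_id], check=True)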

    How can I kill the applications in S1 - S2 (i.e., alive as a Spark
app but dead as a YARN app)? It looks like not closing the SparkContext
could cause this problem. However, I'm not always able to close the
context, for example when my program crashes prematurely.
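
    The only driver-side guard I know of is wrapping the job in
try/finally so sc.stop() always runs; that covers ordinary exceptions,
but not the case where the driver process itself dies:

from pyspark import SparkConf, SparkContext

sc = SparkContext(conf=SparkConf().setAppName("my-job"))  # name is just an example
try:
    # ... actual job logic goes here ...
    pass
finally:
    # Stop the context even if the job raises, so the History Server
    # does not keep listing the application as running.
    sc.stop()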

    Tons of thanks in advance!

Shuxin Yang



---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

