spark-user mailing list archives

From Silvio Fiorito <>
Subject Re: Stop Cluster Mode Running App
Date Fri, 08 May 2015 00:12:07 GMT
Hi James,

If you’re on Spark 1.3 you can use the --kill option of spark-submit to shut it down. You’ll
need the driver ID, which you can get from the Spark UI or from the output printed when you submitted the app.

spark-submit --master spark://master:7077 --kill <driver-id>
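If the app keeps coming back after a cluster restart, it was most likely submitted with --supervise, which tells the standalone master to relaunch the driver; --kill is the clean way to stop it. As a sketch (the driver ID below is hypothetical, and the command is echoed rather than run so it is safe without a live cluster):

```shell
#!/bin/sh
# Hypothetical driver ID; the real one comes from the Spark UI
# ("Running Drivers" section) or from the submission output.
DRIVER_ID="driver-20150508001207-0000"
MASTER="spark://master:7077"

# Echo the kill command instead of executing it, so this sketch
# can be tried without a running standalone master:
echo spark-submit --master "$MASTER" --kill "$DRIVER_ID"
```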


From: James King
Date: Wednesday, May 6, 2015 at 12:02 PM
To: user
Subject: Stop Cluster Mode Running App

I submitted a Spark application in cluster mode, and now every time I stop and restart the
cluster the job resumes execution.

I even killed a daemon called DriverWrapper; that stops the app, but it resumes again.

How can I stop this application from running?