spark-user mailing list archives

From Park Kyeong Hee <kh1979.p...@samsung.com>
Subject RE: Stop Spark Streaming Jobs
Date Wed, 03 Aug 2016 02:24:05 GMT
Hi, Pradeep


Did you mean how to kill the job?
If yes, you need to kill the driver; follow the steps for your deploy mode
below (a short shell sketch follows each list).

on yarn-client
1. find the driver pid - "ps -ef | grep <your_jobs_main_class>"
2. kill it - "kill -9 <pid>"
3. verify the executors went down - "yarn application -list"
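For example, the whole yarn-client sequence might look like the sketch below;
the class name and pid are placeholders, not real values:

    # find the pid of the driver JVM running the streaming job
    ps -ef | grep com.example.MyStreamingJob
    # kill the driver process; assume the pid printed above was 12345
    kill -9 12345
    # confirm the YARN application and its executors are gone
    yarn application -list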

on yarn-cluster
1. find the driver's application ID - "yarn application -list"
2. stop it - "yarn application -kill <app_ID>"
3. verify the driver and executors went down - "yarn application -list"
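
On yarn-cluster the driver runs inside YARN, so everything goes through the
yarn CLI; a sketch, with a made-up application ID:

    # list running applications and note the job's application ID
    yarn application -list
    # kill the whole application, driver and executors together
    yarn application -kill application_1470000000000_0001
    # confirm it no longer shows as RUNNING
    yarn application -list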


Thanks.

-----Original Message-----
From: Pradeep [mailto:pradeep.misra@mail.com] 
Sent: Wednesday, August 03, 2016 10:48 AM
To: user@spark.apache.org
Subject: Stop Spark Streaming Jobs

Hi All,

My streaming job reads data from Kafka. The job is launched and pushed to the
background with nohup.
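
The launch looks roughly like this (class and jar names are placeholders):

    # start the streaming job detached from the terminal
    nohup spark-submit --master yarn --deploy-mode client \
      --class com.example.MyStreamingJob my-streaming-job.jar \
      > streaming.log 2>&1 &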

What are the recommended ways to stop the job in either yarn-client or
yarn-cluster mode?

Thanks,
Pradeep

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

