spark-user mailing list archives

From Mark Hamstra <m...@clearstorydata.com>
Subject Re: How to track batch jobs in spark ?
Date Wed, 05 Dec 2018 23:11:59 GMT
That will kill an entire Spark application, not a batch Job.
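
For tracking or killing an individual job inside a running application
(rather than the whole application), Spark's job-group API on SparkContext
is the usual batch-side counterpart to StreamingQuery.id(). A minimal Scala
sketch, assuming a live SparkContext sc; the group id "nightly-batch" and
the input path are placeholders chosen for illustration:

    // Tag every job spawned from this thread with a group id of your choosing.
    sc.setJobGroup("nightly-batch", "nightly aggregation", interruptOnCancel = true)
    val count = sc.textFile("hdfs:///data/events").count()

    // From any other thread, inspect or cancel everything in that group.
    val tracker = sc.statusTracker
    for (jobId <- tracker.getJobIdsForGroup("nightly-batch");
         info  <- tracker.getJobInfo(jobId))
      println(s"job $jobId: ${info.status}")
    sc.cancelJobGroup("nightly-batch")

Because setJobGroup tags all jobs started from the calling thread, a single
cancelJobGroup call can stop a multi-stage batch without tearing down the
SparkContext itself.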

On Wed, Dec 5, 2018 at 3:07 PM Priya Matpadi <pmatpadi@gmail.com> wrote:

> If you are deploying your Spark application on a YARN cluster:
> 1. ssh into the master node.
> 2. List the currently running applications and retrieve the application_id:
>         yarn application -list
> 3. Kill the application using the application_id (of the form
>    application_xxxxx_xxxx) from the output of the list command:
>         yarn application -kill <application_id>
>    (A programmatic equivalent is sketched just after this message.)
>
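
The same kill can also be issued programmatically via Hadoop's YarnClient; a
minimal sketch, assuming Hadoop 2.8+ (for ApplicationId.fromString) and a
placeholder application id:

    import org.apache.hadoop.yarn.api.records.ApplicationId
    import org.apache.hadoop.yarn.client.api.YarnClient
    import org.apache.hadoop.yarn.conf.YarnConfiguration

    val yarn = YarnClient.createYarnClient()
    yarn.init(new YarnConfiguration()) // picks up yarn-site.xml from the classpath
    yarn.start()
    // Placeholder id; substitute the one reported by `yarn application -list`.
    yarn.killApplication(ApplicationId.fromString("application_1544000000000_0001"))
    yarn.stop()

Note that this still terminates the entire application; as Mark points out
above, cancelling a single batch job needs the job-group API sketched earlier.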
> On Wed, Dec 5, 2018 at 1:42 PM kant kodali <kanth909@gmail.com> wrote:
>
>> Hi All,
>>
>> How do I track batch jobs in Spark? For example, is there some id or
>> token I can get after I spawn a batch job that I can use to track its
>> progress or to kill the batch job itself?
>>
>> For Streaming, we have StreamingQuery.id()
>>
>> Thanks!
>>
>
