mahout-user mailing list archives

From Sean Owen <sro...@gmail.com>
Subject Re: Cancel running distributed RecommenderJob
Date Mon, 02 Apr 2012 18:05:27 GMT
You can use the Hadoop interface itself (e.g., the command-line hadoop
tool) to kill a job by its ID. If you kill one MapReduce job, the
entire process should halt after that.
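For example, something along these lines should work (the job ID below
is just a placeholder; list the running jobs first to find the real one):

  $ hadoop job -list
  $ hadoop job -kill job_201204021530_0001

Since you're launching RecommenderJob from Java anyway, you could also
kill the job programmatically through the old mapred API. A minimal
sketch (class name and argument handling are illustrative):

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.mapred.JobClient;
  import org.apache.hadoop.mapred.JobConf;
  import org.apache.hadoop.mapred.JobID;
  import org.apache.hadoop.mapred.RunningJob;

  // Kills one MapReduce job by ID; the remaining steps of the pipeline
  // should then fail and stop, as described above.
  public class KillHadoopJob {
    public static void main(String[] args) throws Exception {
      JobClient client = new JobClient(new JobConf(new Configuration()));
      RunningJob job = client.getJob(JobID.forName(args[0]));
      if (job != null) {
        job.killJob();
      }
    }
  }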

On Mon, Apr 2, 2012 at 6:44 PM, Sören Brunk <soren.brunk@deri.org> wrote:
> Hi,
>
> I'm using the distributed RecommenderJob from within a Java program.
> For that, in a separate thread, I create a RecommenderJob object, call
> setConf() with the Hadoop configuration, and then run() with the job
> parameters (a sketch of this setup appears after this message).
> This works fine for me, but now I would like to be able to stop a
> running job.
> I'm not sure whether that's possible at all, since RecommenderJob
> encapsulates several Hadoop jobs (or even other Mahout jobs that call
> Hadoop in turn) and runs them in a blocking way.
>
> This would be interesting for other Mahout jobs as well.
> Any ideas?
>
> Thanks,
>
> --
> Sören Brunk
> Research Assistant
> Data Intensive Infrastructures Unit (DI2)
> Digital Enterprise Research Institute
> National University of Ireland Galway
>
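For reference, a minimal sketch of the setup Sören describes, with
RecommenderJob driven from a separate thread (the input/output paths
and the similarity class are placeholders, not his actual settings):

  import org.apache.hadoop.conf.Configuration;
  import org.apache.mahout.cf.taste.hadoop.item.RecommenderJob;

  // Runs the distributed RecommenderJob on its own thread so the caller
  // is not blocked while the chained Hadoop jobs execute.
  public class RecommenderRunner {
    public static void main(String[] args) {
      final Configuration conf = new Configuration();
      Thread worker = new Thread(new Runnable() {
        @Override
        public void run() {
          try {
            RecommenderJob job = new RecommenderJob();
            job.setConf(conf);
            job.run(new String[] {
                "--input", "/user/me/prefs.csv",
                "--output", "/user/me/recommendations",
                "--similarityClassname", "SIMILARITY_COOCCURRENCE"
            });
          } catch (Exception e) {
            e.printStackTrace();
          }
        }
      });
      worker.start();
      // Note: interrupting this thread will not stop the underlying
      // Hadoop jobs; they have to be killed through Hadoop itself,
      // as Sean's reply above describes.
    }
  }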
