spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Starting a spark streaming app in init.d
Date Sat, 24 Jan 2015 07:30:13 GMT
I'd do the same, but add an extra condition that checks whether the job has
started successfully by probing the application UI (checking that port 4040
is listening would do; if you want something more thorough, write a parser
for the UI page) after putting the main script to sleep for a while (say
2 minutes).
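
Something along these lines, for example (an untested sketch; it assumes nc
is available on the box and that the driver UI comes up on the default
port 4040):

DAEMON="spark-submit --class Streamer --executor-memory 500M --total-executor-cores 4 /path/to/assembly.jar"

start() {
        $DAEMON &
        pid=$!
        sleep 120                        # give the driver time to start (say 2 minutes)
        if nc -z localhost 4040; then    # application UI is listening
                echo "OK"
                return 0
        else
                kill "$pid" 2>/dev/null  # clean up the half-started job
                echo "FAILED"
                return 1
        fi
}

One caveat: if port 4040 is already taken, Spark binds the UI to the next
free port (4041, 4042, ...), so on a box running several drivers you would
want the parser approach rather than a bare port check.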

Thanks
Best Regards

On Sat, Jan 24, 2015 at 1:57 AM, Ashic Mahtab <ashic@live.com> wrote:

> Hello,
> I'm trying to kick off a Spark Streaming job to a standalone master using
> spark-submit inside of init.d. This is what I have:
>
>
> DAEMON="spark-submit --class Streamer --executor-memory 500M
> --total-executor-cores 4 /path/to/assembly.jar"
>
> start() {
>         $DAEMON -p /var/run/my_assembly.pid &
>         echo "OK" &&
>         return 0
> }
>
> However, this will return 0 even if spark-submit fails. Is there a way to run
> spark-submit in the background and return 0 only if it successfully starts
> up? Or better yet, is there something in spark-submit that will allow me to
> do this, perhaps via a command line argument?
>
> Thanks,
> Ashic.
>
