spark-user mailing list archives

From Ashic Mahtab <as...@live.com>
Subject RE: Starting a spark streaming app in init.d
Date Sat, 24 Jan 2015 10:31:05 GMT
Cool. I was thinking of waiting a second and doing ps aux | grep java | grep jarname.jar, and
I guess checking 4040 would work as well. Thanks for the input.
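
A minimal sketch of that check, folded into the start() function (the jar name, the one-second
sleep, and the FAILED message are illustrative; the "[a]ssembly" pattern keeps grep from
matching its own command line):

start() {
        $DAEMON &
        sleep 1
        # succeed only if a java process running the jar survived startup
        if ps aux | grep java | grep -q "[a]ssembly.jar"; then
                echo "OK"
                return 0
        fi
        echo "FAILED"
        return 1
}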
Regards,
Ashic.

Date: Sat, 24 Jan 2015 13:00:13 +0530
Subject: Re: Starting a spark streaming app in init.d
From: akhil@sigmoidanalytics.com
To: ashic@live.com
CC: user@spark.apache.org

I'd do the same, but add an extra condition to check whether the job has actually started,
by probing the application UI (checking that port 4040 is in use would do; if you want
something more thorough, write a parser for the UI page) after putting the main script to
sleep for some time (say 2 minutes).

Thanks
Best Regards
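
A minimal sketch of that probe, to go at the end of start() after launching the job
(assumptions: the driver UI comes up on the default port 4040 and nc is installed; the
2-minute sleep is illustrative):

        sleep 120                       # give the driver time to bind the application UI
        if nc -z localhost 4040; then   # nc -z exits 0 if something is listening
                echo "OK"
                return 0
        fi
        echo "FAILED"
        return 1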

On Sat, Jan 24, 2015 at 1:57 AM, Ashic Mahtab <ashic@live.com> wrote:

Hello,
I'm trying to kick off a Spark Streaming job against a standalone master using spark-submit
inside an init.d script. This is what I have:


DAEMON="spark-submit --class Streamer --executor-memory 500M --total-executor-cores 4 /path/to/assembly.jar"

start() {
        $DAEMON -p /var/run/my_assembly.pid &
        echo "OK" &&
        return 0
}

However, this will return 0 even if spark-submit fails. Is there a way to run spark-submit
in the background and return 0 only if it successfully starts up? Or better yet, is there
something in spark-submit itself that would let me do this, perhaps via a command line
argument?

Thanks,
Ashic.