spark-dev mailing list archives

From Devl Devel <devl.developm...@gmail.com>
Subject Stop Master and Slaves without SSH
Date Wed, 03 Jun 2015 09:32:34 GMT
Hey All,

start-slaves.sh and stop-slaves.sh use SSH to connect to the remote worker
machines. Is there an alternative way to do this that does not rely on SSH?

For example using:

./bin/spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT

works fine, but then there is no way to stop the Worker other than running
stop-slave(s).sh, or finding the process with ps -ef and killing it by hand.
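To illustrate, the manual kill I mean looks roughly like this (a sketch only; it assumes a standalone Worker launched via spark-class, and the `[o]` in the grep pattern just keeps grep from matching its own process):

```shell
# Locate a running standalone Worker by its main class and stop it.
# The [o] in the pattern prevents grep from matching the grep process itself.
worker_pid() {
  ps -ef | grep '[o]rg.apache.spark.deploy.worker.Worker' | awk '{print $2}'
}

pid=$(worker_pid)
if [ -n "$pid" ]; then
  kill "$pid"   # send SIGTERM so the Worker can shut down cleanly
fi
```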

Are there alternatives available such as Hadoop's: hadoop-daemon.sh
start|stop xyz?

I noticed that spark-daemon.sh exists, but perhaps the documentation around
it should be expanded. For instance:

 Usage: spark-daemon.sh [--config <conf-dir>] (start|stop|status)
<spark-command> <spark-instance-number> <args>

What are the valid spark-commands? Can this be used to start and stop
workers on the current node?
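If the <spark-command> is a fully qualified class name, I would guess the invocation looks something like the following (an untested sketch; IP, PORT, and the instance number 1 are placeholders, not values from the docs):

```shell
# Start one Worker instance on this node, pointing it at the master:
./sbin/spark-daemon.sh start org.apache.spark.deploy.worker.Worker 1 spark://IP:PORT

# Later, stop that same instance on this node, without SSH:
./sbin/spark-daemon.sh stop org.apache.spark.deploy.worker.Worker 1
```

If that is how it works, it would be good to have it spelled out in the docs.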

Many thanks
Devl
