spark-dev mailing list archives

From Devl Devel <>
Subject Stop Master and Slaves without SSH
Date Wed, 03 Jun 2015 09:32:34 GMT
Hey All, the standalone start/stop scripts make use of SSH to connect to the
remote nodes. Are there alternative methods to do this without SSH?

For example using:

./bin/spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT

is fine, but there is no way to kill the Worker without using
stop-slave(s).sh, or finding it with ps -ef and then kill.
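As a workaround for the local node, something like the following may avoid grepping ps by hand. It is only a sketch based on my reading of sbin/spark-daemon.sh: I am assuming the script writes the daemon's PID into SPARK_PID_DIR (default /tmp) as spark-&lt;user&gt;-&lt;command&gt;-&lt;instance&gt;.pid, which may differ across Spark versions. The stop_spark_daemon helper name is mine, not Spark's.

```shell
# Sketch, not an official Spark API: stop a locally started daemon by
# reading the pid file that spark-daemon.sh appears to write.
# Assumed pid file pattern: $SPARK_PID_DIR/spark-$USER-<command>-<instance>.pid
stop_spark_daemon() {
  pid_file="${SPARK_PID_DIR:-/tmp}/spark-$USER-$1-$2.pid"
  if [ -f "$pid_file" ]; then
    # Terminate the daemon and clean up its pid file
    kill "$(cat "$pid_file")" && rm -f "$pid_file"
  fi
}

# Example (hypothetical): stop the first Worker instance on this node
# stop_spark_daemon org.apache.spark.deploy.worker.Worker 1
```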

Are there alternatives available, such as Hadoop's
start|stop commands?

I noticed sbin/spark-daemon.sh exists, but maybe we need to improve the
documentation around it; for instance:

 Usage: spark-daemon.sh [--config <conf-dir>] (start|stop|status)
<spark-command> <spark-instance-number> <args>

What are the valid values for <spark-command>? Can this be used to start and
stop workers on the current node?
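For reference, here is my guess at an invocation, treating the fully qualified daemon class name as the <spark-command>, which is how the sbin wrapper scripts appear to call it. This is unverified; spark://IP:PORT is a placeholder for the real master URL.

```shell
# Assumed usage of sbin/spark-daemon.sh, mirroring what start-slave.sh
# seems to do internally (unverified against the docs):
sbin/spark-daemon.sh start  org.apache.spark.deploy.worker.Worker 1 spark://IP:PORT
sbin/spark-daemon.sh status org.apache.spark.deploy.worker.Worker 1
sbin/spark-daemon.sh stop   org.apache.spark.deploy.worker.Worker 1
```

If that is indeed supported, documenting it would cover the no-SSH single-node case directly.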

Many thanks
