spark-user mailing list archives

From Kenichi Maehashi <>
Subject Slave Node Management in Standalone Cluster
Date Tue, 18 Nov 2014 08:27:13 GMT

I'm operating Spark in a standalone cluster configuration (3 slaves) and
have a couple of questions.

1. How can I stop a slave on a specific node?
   Under the `sbin/` directory there are `start-{all,master,slave,slaves}`
and `stop-{all,master,slaves}` scripts, but no `stop-slave`. Is there a
way to stop a specific slave (e.g., the 2nd one) from the command line?
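For context, this is the kind of invocation I would expect to work, based on how `start-slave.sh` delegates to `spark-daemon.sh` — I'm not sure whether calling it directly like this is actually supported:

```shell
# Run on the node whose worker should be stopped (assumption: the
# spark-daemon.sh helper that start-slave.sh uses also accepts "stop").
# The trailing "1" is the worker instance number on that node
# (relevant when SPARK_WORKER_INSTANCES > 1).
"$SPARK_HOME/sbin/spark-daemon.sh" stop org.apache.spark.deploy.worker.Worker 1
```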

2. How can I check the cluster status from the command line?
   Is there a way to confirm that the Master and all Workers are up and
running without using the Web UI?
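The closest thing I've found so far is the Master Web UI's JSON view. If the master exposes its status at `/json/` (an assumption on my part; the payload shape below is also assumed), something like this could count the live workers without opening a browser:

```python
import json
from urllib.request import urlopen  # only needed for the live-cluster case


def alive_workers(master_json):
    """Count workers reporting state ALIVE in the master's status payload."""
    return sum(1 for w in master_json.get("workers", [])
               if w.get("state") == "ALIVE")


# Against a live cluster (hypothetical host name, assumed endpoint):
# status = json.load(urlopen("http://spark-master:8080/json/"))
# print(alive_workers(status), "of", len(status["workers"]), "workers alive")

# Offline check with a sample payload shaped like the assumed response:
sample = {"workers": [{"state": "ALIVE"},
                      {"state": "ALIVE"},
                      {"state": "DEAD"}]}
print(alive_workers(sample))  # → 2
```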

Thanks in advance!

Kenichi Maehashi

