spark-user mailing list archives

From Soumya Simanta <soumya.sima...@gmail.com>
Subject Does start-slave.sh use the values in conf/slaves to launch a worker in Spark standalone cluster mode
Date Tue, 21 Oct 2014 04:55:12 GMT
I'm working on a cluster where I need to start the workers separately and
connect them to a master.

I'm following the instructions here and using branch-1.1
http://spark.apache.org/docs/latest/spark-standalone.html#starting-a-cluster-manually

and I can start the master using
./sbin/start-master.sh

When I try to start the slave/worker using
./sbin/start-slave.sh it doesn't work. The logs say that it needs the
master.
When I provide
./sbin/start-slave.sh spark://<master-ip>:7077 it still doesn't work.

I can start the worker using the following command (as described in the
documentation).

./bin/spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT
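For reference, here is a minimal sketch of how I invoke the manual launch; the
host and port below are placeholders, not my real master address:

```shell
#!/bin/sh
# Build the spark://HOST:PORT URL that the Worker class expects.
# MASTER_HOST is a hypothetical value -- substitute your master's address.
MASTER_HOST="192.168.1.10"
MASTER_PORT=7077              # default Spark standalone master port
MASTER_URL="spark://${MASTER_HOST}:${MASTER_PORT}"
echo "$MASTER_URL"

# Then, on each worker machine, from the Spark install directory:
# ./bin/spark-class org.apache.spark.deploy.worker.Worker "$MASTER_URL"
```

This works fine for me; it's only start-slave.sh that fails.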

I was wondering why start-slave.sh is not working.

Thanks
-Soumya
