spark-user mailing list archives

From Soumya Simanta <>
Subject Does use the values in conf/slaves to launch a worker in Spark standalone cluster mode
Date Tue, 21 Oct 2014 04:55:12 GMT
I'm working on a cluster where I need to start the workers separately and
connect them to a master.

I'm following the instructions here, using branch-1.1,

and I can start the master using

When I try to start the slave/worker using
./sbin/ it doesn't work. The logs say that it needs the
master URL, but when I provide
./sbin/ spark://<master-ip>:7077 it still doesn't work.

I can start the worker using the following command (as described in the
documentation):
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT
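For readers hitting the same issue, a minimal sketch of the working approach: build the master URL explicitly and pass it to the Worker class. The host and port below are hypothetical placeholders, not values from the message; 7077 is the default standalone master port.

```shell
# Hypothetical master address -- substitute your own master's host.
MASTER_HOST=192.168.1.10
MASTER_PORT=7077   # default Spark standalone master port
MASTER_URL="spark://${MASTER_HOST}:${MASTER_PORT}"
echo "$MASTER_URL"

# Launch the worker directly against the master (the command that
# works in the message above); run from the Spark install directory:
# ./bin/spark-class org.apache.spark.deploy.worker.Worker "$MASTER_URL"
```

The sbin start scripts are wrappers around this same Worker class, so if the direct invocation works, the likely difference is how the wrapper script resolves the master URL and the host list in conf/slaves.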

I was wondering why it is not working?

