spark-user mailing list archives

From Ashok Kumar <ashok34...@yahoo.com.INVALID>
Subject Setting up spark to run on two nodes
Date Fri, 18 Mar 2016 22:31:06 GMT
Experts.
Please share your valued advice.

I have Spark 1.5.2 set up as standalone for now, and I have started the master as below:
start-master.sh
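
For context, a minimal sketch of how I understand the master can be made reachable from other hosts (assuming the standard Spark 1.x standalone layout; SPARK_MASTER_IP in conf/spark-env.sh is the setting I mean, and the hostnames are the ones from this thread):

# conf/spark-env.sh on the master host
# SPARK_MASTER_IP sets the address the master binds to, so workers on
# other machines can reach it; "localhost" only resolves locally, so a
# real hostname or IP is needed for workerhost to connect
export SPARK_MASTER_IP=localhost
export SPARK_MASTER_PORT=7077

# then start the master as above
sbin/start-master.sh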

I have also modified the conf/slaves file to contain:
# A Spark Worker will be started on each of the machines listed below.
localhost
workerhost
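
For reference, my understanding (a sketch, assuming the standard sbin scripts shipped with Spark 1.5) is that start-slaves.sh reads conf/slaves and launches one worker per listed host over SSH:

# run on the master host; reads conf/slaves and, for each host listed,
# logs in over ssh and starts a worker pointed at this master
# (requires passwordless ssh from the master host to every worker host)
sbin/start-slaves.sh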

On localhost I start the slave as follows:
start-slave.sh spark://localhost:7077
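
Spelled out for both machines (a sketch; the spark:// scheme is what start-slave.sh expects, and <master-hostname> is a placeholder for a name workerhost can resolve):

# on localhost
sbin/start-slave.sh spark://localhost:7077

# on workerhost, pointing back at the master running on localhost;
# "localhost" would resolve to workerhost itself here, hence the placeholder
sbin/start-slave.sh spark://<master-hostname>:7077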

Questions:

If I want a worker process to be started not only on localhost but also on workerhost:

1) Do I just need to run start-slave.sh on localhost, and it will start the worker process on the other node, workerhost?
2) Do I have to run start-slave.sh spark://workerhost:7077 locally on workerhost as well?
3) On the GUI at http://localhost:4040/environment/ I do not see any reference to a worker process running on workerhost (see the note after these questions on which UI lists workers).
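
(A note on question 3, as a sketch assuming default ports: http://localhost:4040 is the per-application UI, while registered workers are listed on the master web UI, normally port 8080.)

# the master web UI (default port 8080) lists every registered worker;
# port 4040 is the UI of one running application, not the cluster
curl http://localhost:8080 | grep -i worker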
I would appreciate any help on how to go about starting the master on localhost and starting two workers, one on localhost and the other on workerhost.

Thanking you
