spark-dev mailing list archives

From Niranda Perera <niranda.per...@gmail.com>
Subject Connecting a worker to the master after a spark context is made
Date Fri, 20 Mar 2015 09:13:29 GMT
Hi,

Please consider the following scenario.

I've started the Spark master by invoking the
org.apache.spark.deploy.master.Master.startSystemAndActor method from Java code
and connected a worker to it using the
org.apache.spark.deploy.worker.Worker.startSystemAndActor method. I have then
successfully created a Java SparkContext and SQLContext and performed SQL
queries.
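
Roughly, the working sequence looks like the sketch below (Scala, simplified:
the host, ports, core/memory figures and work directory are placeholder values,
and the exact parameter lists of these internal startSystemAndActor methods may
differ between Spark versions):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.deploy.master.Master
    import org.apache.spark.deploy.worker.Worker
    import org.apache.spark.sql.SQLContext

    object EmbeddedClusterExample extends App {
      val masterUrl = "spark://localhost:7077"

      // 1. Start the standalone master inside this JVM.
      Master.startSystemAndActor("localhost", 7077, 8080, new SparkConf())

      // 2. Start a worker in the same JVM and register it with the master.
      Worker.startSystemAndActor("localhost", 0, 8081,
        2 /* cores */, 1024 /* memory in MB */, Array(masterUrl), "work")

      // 3. Create the contexts against that master and run SQL queries.
      val sc = new SparkContext(
        new SparkConf().setMaster(masterUrl).setAppName("embedded-test"))
      val sqlContext = new SQLContext(sc)
      // ... register tables / DataFrames and run sqlContext.sql(...) here
    }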

My question is: can I change this order? Can I start the master first, then
create a SparkContext, and only later connect a worker to the master?
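
In other words, something like the following order (again only a sketch, with
the same placeholder values and assumed method signatures as above):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.deploy.master.Master
    import org.apache.spark.deploy.worker.Worker

    object ReorderedExample extends App {
      val masterUrl = "spark://localhost:7077"

      // 1. Master first.
      Master.startSystemAndActor("localhost", 7077, 8080, new SparkConf())

      // 2. Then the SparkContext, before any worker has registered.
      val sc = new SparkContext(
        new SparkConf().setMaster(masterUrl).setAppName("embedded-test"))

      // 3. Only later, attach a worker to the master.
      Worker.startSystemAndActor("localhost", 0, 8081,
        2 /* cores */, 1024 /* memory in MB */, Array(masterUrl), "work")
    }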

While trying out this scenario, I have successfully started the master.
Please see the screenshot here.

But when I create a SparkContext, it terminates automatically. Is this because
no worker is connected to the master?

cheers


-- 
Niranda
