Hi, 

Please consider the following scenario. 

I've started the Spark master by invoking the org.apache.spark.deploy.master.Master.startSystemAndActor method from Java code, and connected a worker to it using the org.apache.spark.deploy.worker.Worker.startSystemAndActor method. I then successfully created a Java SparkContext and an SQLContext and performed SQL queries.
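
For reference, this is roughly what my code does. Note that startSystemAndActor is an internal deploy API, so the parameter lists below follow the Spark 1.2.x sources and may differ in other versions; the hostnames, ports, and resource numbers are just illustrative:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.deploy.master.Master;
import org.apache.spark.deploy.worker.Worker;
import org.apache.spark.sql.SQLContext;

public class EmbeddedSparkTest {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf();

        // 1. start the standalone master inside this JVM
        Master.startSystemAndActor("localhost", 7077, 8080, conf);

        // 2. start a worker in the same JVM and point it at the master
        Worker.startSystemAndActor("localhost", 7078, 8081,
                2 /* cores */, 1024 /* MB of memory */,
                new String[] {"spark://localhost:7077"},
                "/tmp/spark-work", scala.Option.<Object>empty());

        // 3. only then create the contexts and run SQL queries
        JavaSparkContext jsc = new JavaSparkContext(
                new SparkConf().setMaster("spark://localhost:7077")
                               .setAppName("embedded-test"));
        SQLContext sqlContext = new SQLContext(jsc.sc());
        // ... run SQL queries against sqlContext here ...
    }
}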

My question is: can I change this order?
Can I start the master first, then create a SparkContext, and only later connect a worker to the master?

While trying out this scenario, I was able to start the master successfully. Please see the screenshot here.

But when I create a SparkContext, it terminates automatically. Is this because the master does not have any worker connected to it?
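
For completeness, in this second scenario I create the context in the usual way, pointing at the already-running master (hostname and port are placeholders for my actual setup):

// master is already up, but no worker has been attached yet
SparkConf conf = new SparkConf()
        .setMaster("spark://localhost:7077")
        .setAppName("embedded-test");
JavaSparkContext jsc = new JavaSparkContext(conf);  // <- this is the step that terminates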

cheers


--
Niranda