If you want to submit applications to a remote cluster whose port 7077 is open to the public, you also need to set spark.driver.host (to your laptop's public IP) and, optionally, spark.driver.port (you only need to pin this if there's a firewall between your laptop and the cluster, since the master and executors have to connect back to the driver). That said, keeping 7077 open to the public is a bad idea; you can read more here: https://www.sigmoid.com/securing-apache-spark-cluster/
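
For example, you would launch with something along these lines (the IP is a placeholder, and 51000 is just an arbitrary example port you'd open in your laptop's firewall):

    MASTER=spark://<master-ip>:7077 pyspark \
      --conf spark.driver.host=<your-laptop-public-ip> \
      --conf spark.driver.port=51000

With spark.driver.port pinned like this, you open exactly one known inbound port on your side instead of relying on a random ephemeral port.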

Thanks
Best Regards

On Mon, Jun 1, 2015 at 11:48 PM, AlexG <swiftset@gmail.com> wrote:
I've followed the instructions for setting up a standalone Spark cluster (on
EC2):
- install Spark on all the machines
- enable passwordless ssh
- set up the conf/slaves file
- start the master and slaves with the provided scripts (sketch below)
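
For reference, that setup amounts to roughly the following (hostnames are placeholders; script names as in the standard Spark sbin/ layout):

    # conf/slaves on the master -- one worker hostname per line
    ec2-worker-1.example.com
    ec2-worker-2.example.com

    # then, on the master:
    ./sbin/start-master.sh   # standalone master; web UI on port 8080
    ./sbin/start-slaves.sh   # ssh into each host in conf/slaves and start a worker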

The status page on port 8080 of the master shows that the master and
workers are all running. I can successfully use pyspark from the master.

However, if I try to launch pyspark remotely from my laptop with
MASTER=spark://<ip>:7077 pyspark,
I get these errors:
15/06/01 10:02:14 INFO AppClient$ClientActor: Connecting to master
akka.tcp://sparkMaster@<IP>:7077/user/Master...
15/06/01 10:02:34 INFO AppClient$ClientActor: Connecting to master
akka.tcp://sparkMaster@<IP>:7077/user/Master...
15/06/01 10:02:54 INFO AppClient$ClientActor: Connecting to master
akka.tcp://sparkMaster@<IP>:7077/user/Master...
15/06/01 10:03:14 ERROR SparkDeploySchedulerBackend: Application has been
killed. Reason: All masters are unresponsive! Giving up.
15/06/01 10:03:14 ERROR TaskSchedulerImpl: Exiting due to error from cluster
scheduler: All masters are unresponsive! Giving up.

Any idea what's going on here? I set port 7077 to be publicly accessible...
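
(For reference: one quick way to sanity-check that 7077 really is reachable from the laptop, assuming netcat is installed:

    nc -vz <master-ip> 7077   # should report that the connection succeeded

If that connects but registration still times out, the problem is usually the return path from the cluster back to the driver, per the reply above.)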




