spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Submitting Spark Applications - Do I need to leave ports open?
Date Mon, 02 Nov 2015 08:38:20 GMT
Yes, you need to open up a few ports for that to happen. Have a look at
http://spark.apache.org/docs/latest/configuration.html#networking where you
can see the *.port configurations, which bind to random ports by default.
Fix those ports to specific numbers, open them in your firewall, and it
should work.
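
For example, on the machine that runs spark-submit you could pin the
driver-side ports in conf/spark-defaults.conf (the port numbers below are
arbitrary illustrations, not Spark defaults; check the networking section
of the configuration page for the exact properties in your release):

```properties
# conf/spark-defaults.conf on the machine running spark-submit.
# Port values are examples only; pick free ports you are able to open.
spark.driver.port          7001
spark.blockManager.port    7003
```

With the ports fixed, open exactly those ports in the local firewall
(the ufw commands below are just one way to do it, assuming a Ubuntu-style
setup):

```shell
sudo ufw allow 7001/tcp
sudo ufw allow 7003/tcp
```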

Thanks
Best Regards

On Tue, Oct 27, 2015 at 2:05 AM, markluk <mark@juicero.com> wrote:

> I want to submit interactive applications to a remote Spark cluster running
> in standalone mode.
>
> I understand I need to connect to the master's port 7077. It also seems
> like the master node needs to open connections back to my local machine,
> and the ports it needs are different every time.
>
> If I have a firewall enabled on my local machine, spark-submit fails,
> since the ports it needs to reach on my local machine are unreachable,
> so spark-submit cannot connect to the master.
>
> I was able to get it to work if I disable the firewall on my local
> machine, but that's not a real solution.
>
> Is there some config that I'm not aware of that solves this problem?
>
