spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Submitting job from local to EC2 cluster
Date Wed, 26 Nov 2014 07:04:50 GMT
Yes, it is possible to submit jobs to a remote Spark cluster. Just make
sure you follow the steps below.

1. Set spark.driver.host to your local IP (where you run your code; it
must be reachable from the cluster).

2. Make sure no firewall/router configuration is blocking or filtering the
connection between your Windows machine and the cluster. The best way to
test is to ping the Windows machine's public IP from the cluster. (If the
ping works, then make sure you are port-forwarding the required ports.)

3. Also set spark.driver.port if you don't want to open up all the ports on
your Windows machine (the default is a random port, so pin it to one).
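The three settings above can be combined into a single spark-submit
invocation. A minimal sketch (the master URL, IP address, port, class name,
and jar name here are placeholders, not values from this thread):

```shell
# Submit from a local driver machine to a remote standalone cluster.
# spark.driver.host: the local machine's public IP, reachable from the cluster.
# spark.driver.port: pinned to one port so only that port needs forwarding
# (the default is a random port on each run).
spark-submit \
  --master spark://ec2-master.example.com:7077 \
  --conf spark.driver.host=203.0.113.10 \
  --conf spark.driver.port=51000 \
  --class com.example.MyJob \
  my-job.jar
```

With the driver port pinned, only that one inbound port (plus the block
manager port, if you pin it as well) has to be opened or forwarded on the
local machine.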


Thanks
Best Regards

On Wed, Nov 26, 2014 at 5:49 AM, Yingkai Hu <yingkaihu@gmail.com> wrote:

> Hi All,
>
> I have Spark deployed to an EC2 cluster and was able to run jobs
> successfully when the driver resides within the cluster. However, the job
> was killed when I tried to submit it from my local machine. My guess is the
> Spark cluster can’t open a connection back to the driver since it is on my
> machine.
>
> I’m wondering if Spark actually supports submitting jobs from a local
> machine? If so, would you please advise?
>
> Many thanks in advance!
>
> YK
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>
