spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Spark Standalone on EC2
Date Thu, 09 Oct 2014 07:41:44 GMT
Those hostnames must be present in your /etc/hosts file. If you cannot reach the workers by hostname, then I believe you won't be able to reach them by IP address either.
What are you trying to do here? Running Eclipse locally and connecting to your EC2 cluster?
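For what it's worth, one approach worth trying when you want Spark to bind to and advertise IP addresses rather than hostnames is to set the networking variables in conf/spark-env.sh on each node. This is a sketch, not a verified fix for this setup; the IP addresses below are placeholders you would replace with your own:

```shell
# conf/spark-env.sh (sketch; placeholder addresses)

# Bind this node's Spark daemons to its reachable IP instead of its hostname
export SPARK_LOCAL_IP=203.0.113.10

# Address advertised in the web UI for this node
export SPARK_PUBLIC_DNS=203.0.113.10

# On workers: the master's reachable IP, so they register by IP
export SPARK_MASTER_IP=203.0.113.5
```

On the driver side (e.g. when submitting from Eclipse), the spark.driver.host property can similarly be set to an address the cluster can reach back to.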

Thanks
Best Regards

On Tue, Oct 7, 2014 at 3:36 AM, Ankur Srivastava <ankur.srivastava@gmail.com
> wrote:

> Hi,
>
> I have started a Spark Cluster on EC2 using Spark Standalone cluster
> manager but spark is trying to identify the worker threads using the
> hostnames which are not accessible publicly.
>
> So when I try to submit jobs from eclipse it is failing, is there some way
> spark can use IP address instead of hostnames?
>
> I have used IP address in the slaves file.
>
> Thanks
> Ankur
>
