spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Spark Standalone on EC2
Date Thu, 09 Oct 2014 17:11:41 GMT
Another workaround would be to add the hostname-to-IP mappings to the
/etc/hosts file on all machines.
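
A minimal sketch of that mapping (the addresses and hostnames below are placeholders, not values from this cluster — substitute your own, and keep the entries identical on the driver and every worker):

```
# /etc/hosts -- same entries on every machine in the cluster
# (example addresses/hostnames; replace with your cluster's own)
10.0.1.10   ip-10-0-1-10.ec2.internal   spark-master
10.0.1.11   ip-10-0-1-11.ec2.internal   spark-worker-1
10.0.1.12   ip-10-0-1-12.ec2.internal   spark-worker-2
```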

Thanks
Best Regards

On Thu, Oct 9, 2014 at 8:49 PM, Ankur Srivastava <ankur.srivastava@gmail.com
> wrote:

> Thank you Akhil will try this out.
>
> We are able to access the machines using the public IP and even the
> private as they are on our subnet.
>
> Thanks
> Ankur
> On Oct 9, 2014 12:41 AM, "Akhil Das" <akhil@sigmoidanalytics.com> wrote:
>
>> You must have those hostnames in your /etc/hosts file. If you are not
>> able to access the machines using the hostnames, then you won't be able
>> to access them with the IP addresses either, I believe.
>> What are you trying to do here? Running Eclipse locally and
>> connecting to your EC2 cluster?
>>
>> Thanks
>> Best Regards
>>
>> On Tue, Oct 7, 2014 at 3:36 AM, Ankur Srivastava <
>> ankur.srivastava@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I have started a Spark cluster on EC2 using the Spark Standalone cluster
>>> manager, but Spark is trying to identify the workers using
>>> hostnames which are not accessible publicly.
>>>
>>> So when I try to submit jobs from Eclipse, it fails. Is there some
>>> way Spark can use IP addresses instead of hostnames?
>>>
>>> I have used IP address in the slaves file.
>>>
>>> Thanks
>>> Ankur
>>>
>>
>>
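
On the original question of making Spark advertise IP addresses rather than hostnames: the standalone mode documentation describes `SPARK_LOCAL_IP` (the address a node binds to) and `SPARK_PUBLIC_DNS` (the address it advertises), set in `conf/spark-env.sh` on each node. A sketch, with placeholder addresses — whether this fully resolves the Eclipse-driver case depends on the network setup:

```
# conf/spark-env.sh on each node (the address below is a placeholder)
# Bind Spark services to this IP instead of the resolved hostname:
export SPARK_LOCAL_IP=10.0.1.11
# Address this node advertises to drivers and the rest of the cluster:
export SPARK_PUBLIC_DNS=10.0.1.11
```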
