spark-user mailing list archives

From Aida Tefera <aida1.tef...@gmail.com>
Subject Re: Installing Spark on Mac
Date Tue, 15 Mar 2016 12:41:26 GMT
Hi Jakob, sorry for my late reply

I tried to run the command below; it came back with "netstat: lunt: unknown or uninstrumented protocol".
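
I wonder whether those flags just aren't supported by the netstat that ships with
OS X. Would something like the below be the right way to list listening TCP ports
on a Mac instead? (Only a guess on my part from the lsof man page, so please
correct me if I've got it wrong.)

    # list processes listening on TCP ports (rough macOS stand-in for netstat -plunt)
    sudo lsof -nP -iTCP -sTCP:LISTEN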

I also tried uninstalling version 1.6.0 and installing version 1.5.2 with Java 7 and
Scala version 2.10.6; I got the same error messages.

Do you think it would be worth me trying to change the IP address in SPARK_MASTER_IP to the
IP address of the master node? If so, how would I go about doing that? 
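
My guess from the standalone-mode docs is that it would go in conf/spark-env.sh,
something along these lines (the address below is just a placeholder for whatever
the master's address would actually be), but please correct me if that's wrong:

    # conf/spark-env.sh (created by copying conf/spark-env.sh.template)
    export SPARK_MASTER_IP=192.168.0.10   # placeholder master address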

Thanks, 

Aida

Sent from my iPhone

> On 11 Mar 2016, at 08:37, Jakob Odersky <jakob@odersky.com> wrote:
> 
> regarding my previous message, I forgot to mention to run netstat as
> root (sudo netstat -plunt)
> sorry for the noise
> 
>> On Fri, Mar 11, 2016 at 12:29 AM, Jakob Odersky <jakob@odersky.com> wrote:
>> Some more diagnostics/suggestions:
>> 
>> 1) are other services listening to ports in the 4000 range (run
>> "netstat -plunt")? Maybe there is an issue with the error message
>> itself.
>> 
>> 2) are you sure the correct java version is used? java -version
>> 
>> 3) can you revert all installation attempts you have done so far, including
>> files installed by brew/macports or maven, and try again? (see the sketch
>> after this list)
>> 
>> 4) are there any special firewall rules in place, forbidding
>> connections on localhost?
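>> 
>> For 3), something along these lines might cover it (assuming Spark was installed
>> with Homebrew; adjust accordingly for MacPorts), but double-check paths before
>> deleting anything:
>> 
>>    # sketch only: remove a brew-installed Spark and clear leftover env vars
>>    brew uninstall apache-spark
>>    unset SPARK_HOME
>>    # check shell startup files for exported SPARK_* variables
>>    grep -n SPARK ~/.bash_profile ~/.bashrc
>>    # if a maven build was attempted, leftover artifacts live under ~/.m2/repository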
>> 
>> This is very weird behavior you're seeing. Spark is supposed to work
>> out-of-the-box with ZERO configuration necessary for running a local
>> shell. Again, my prime suspect is a previous, failed Spark
>> installation messing up your config.
>> 
>>> On Thu, Mar 10, 2016 at 12:24 PM, Tristan Nixon <stuff@memeticlabs.org> wrote:
>>> If you type ‘whoami’ in the terminal, and it responds with ‘root’ then you’re the superuser.
>>> However, as mentioned below, I don’t think it’s a relevant factor.
>>> 
>>>> On Mar 10, 2016, at 12:02 PM, Aida Tefera <aida1.tefera@gmail.com> wrote:
>>>> 
>>>> Hi Tristan,
>>>> 
>>>> I'm afraid I wouldn't know whether I'm running it as super user.
>>>> 
>>>> I have Java version 1.8.0_73 and Scala version 2.11.7
>>>> 
>>>> Sent from my iPhone
>>>> 
>>>>> On 9 Mar 2016, at 21:58, Tristan Nixon <stuff@memeticlabs.org> wrote:
>>>>> 
>>>>> That’s very strange. I just un-set my SPARK_HOME env param, downloaded a fresh 1.6.0 tarball,
>>>>> unzipped it to a local dir (~/Downloads), and it ran just fine - the driver port is some randomly generated large number.
>>>>> So SPARK_HOME is definitely not needed to run this.
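>>>>> 
>>>>> For reference, this is roughly what I ran (assuming the hadoop2.6 pre-built
>>>>> package; the exact file name depends on which build you grab from the
>>>>> downloads page):
>>>>> 
>>>>>    unset SPARK_HOME
>>>>>    cd ~/Downloads
>>>>>    tar -xzf spark-1.6.0-bin-hadoop2.6.tgz
>>>>>    cd spark-1.6.0-bin-hadoop2.6
>>>>>    ./bin/spark-shell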
>>>>> 
>>>>> Aida, you are not running this as the super-user, are you? What versions of Java & Scala do you have installed?
>>>>> 
>>>>>> On Mar 9, 2016, at 3:53 PM, Aida Tefera <aida1.tefera@gmail.com> wrote:
>>>>>> 
>>>>>> Hi Jakob,
>>>>>> 
>>>>>> Tried running the command env|grep SPARK; nothing comes back
>>>>>> 
>>>>>> Tried env | grep Spark; Spark is the directory I created once I downloaded the tgz file; it comes back with PWD=/Users/aidatefera/Spark
>>>>>> 
>>>>>> Tried running ./bin/spark-shell; it comes back with the same error as below, i.e. could not bind to port 0 etc.
>>>>>> 
>>>>>> Sent from my iPhone
>>>>>> 
>>>>>>> On 9 Mar 2016, at 21:42, Jakob Odersky <jakob@odersky.com> wrote:
>>>>>>> 
>>>>>>> As Tristan mentioned, it looks as though Spark is trying to bind on
>>>>>>> port 0 and then 1 (which is not allowed). Could it be that some
>>>>>>> environment variables from your previous installation attempts are
>>>>>>> polluting your configuration?
>>>>>>> What does running "env | grep SPARK" show you?
>>>>>>> 
>>>>>>> Also, try running just "./bin/spark-shell" (without the --master
>>>>>>> argument), maybe your shell is doing some funky stuff with the
>>>>>>> brackets.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

