spark-user mailing list archives

From Tristan Nixon <st...@memeticlabs.org>
Subject Re: Installing Spark on Mac
Date Wed, 09 Mar 2016 21:35:56 GMT
No, those look like the right directions… It *should* work, but clearly it isn’t. Hrrrrmmm…

You can check whether SPARK_HOME is set with:
echo $SPARK_HOME
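
If that prints a path, it’s also worth a quick check that it points at the directory you actually unpacked (plain shell, nothing Spark-specific):

ls "$SPARK_HOME/conf"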

but that doesn’t seem to be the issue.

> On Mar 9, 2016, at 2:58 PM, Aida Tefera <aida1.tefera@gmail.com> wrote:
> 
> Hi Tristan, my apologies, I meant to write Spark and not SCALA 
> 
> I feel a bit lost at the moment...
> 
> Perhaps I have missed steps that are implicit to more experienced people
> 
> Apart from downloading spark and then following Jakob's steps:
> 
> 1. curl -O http://apache.arvixe.com/spark/spark-1.6.0/spark-1.6.0-bin-hadoop2.6.tgz
> 
> 2. tar -xzf spark-1.6.0-bin-hadoop2.6.tgz
> 
> 3. cd spark-1.6.0-bin-hadoop2.6
> 
> 4. ./bin/spark-shell --master local[2]
> 
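> For what it's worth, after step 3 a quick listing should show the conf templates that ship with the pre-built tarball (assuming the download is intact):
> 
> ls conf/
> # expect spark-defaults.conf.template, spark-env.sh.template, etc.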
> 
> Was I supposed to do something additional to this? 
> 
> How would I be able to determine whether SPARK_HOME is pointing at a different directory?
> 
> 
> Sent from my iPhone
> 
>> On 9 Mar 2016, at 20:39, Tristan Nixon <stuff@memeticlabs.org> wrote:
>> 
>> SPARK_HOME and SCALA_HOME are different. I was just wondering whether spark is looking
>> in a different dir for the config files than where you’re running it. If you have not set
>> SPARK_HOME, it should look in the current directory for the /conf dir.
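>> 
>> A quick way to see everything Spark-related in your environment (it also honors
>> SPARK_CONF_DIR, if that happens to be set) is just:
>> 
>> env | grep SPARK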
>> 
>> The defaults should be relatively safe, I’ve been using them with local mode on
>> my Mac for a long while without any need to change them.
>> 
>>> On Mar 9, 2016, at 2:20 PM, Aida Tefera <aida1.tefera@gmail.com> wrote:
>>> 
>>> I don't think I set the SCALA_HOME environment variable
>>> 
>>> Also, I'm unsure whether or not launching the scripts defaults to a single machine (localhost)
>>> 
>>> Sent from my iPhone
>>> 
>>>> On 9 Mar 2016, at 19:59, Tristan Nixon <stuff@memeticlabs.org> wrote:
>>>> 
>>>> Also, do you have the SPARK_HOME environment variable set in your shell, and if
>>>> so what is it set to?
>>>> 
>>>>> On Mar 9, 2016, at 1:53 PM, Tristan Nixon <stuff@memeticlabs.org> wrote:
>>>>> 
>>>>> There should be a /conf sub-directory wherever you installed spark, which
>>>>> contains several configuration files.
>>>>> I believe that the two that you should look at are
>>>>> spark-defaults.conf
>>>>> spark-env.sh
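>>>>> 
>>>>> Out of the box only the .template versions ship; if you want to override anything,
>>>>> you'd copy them first, e.g.:
>>>>> 
>>>>> cd spark-1.6.0-bin-hadoop2.6    # or wherever you unpacked it
>>>>> cp conf/spark-env.sh.template conf/spark-env.sh
>>>>> cp conf/spark-defaults.conf.template conf/spark-defaults.conf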
>>>>> 
>>>>> 
>>>>>> On Mar 9, 2016, at 1:45 PM, Aida Tefera <aida1.tefera@gmail.com> wrote:
>>>>>> 
>>>>>> Hi Tristan, thanks for your message
>>>>>> 
>>>>>> When I look at spark-defaults.conf.template, it shows an example master URL
>>>>>> (spark://master:7077) where the port is 7077
>>>>>> 
>>>>>> When you say look to the conf scripts, how do you mean?
>>>>>> 
>>>>>> Sent from my iPhone
>>>>>> 
>>>>>>> On 9 Mar 2016, at 19:32, Tristan Nixon <stuff@memeticlabs.org> wrote:
>>>>>>> 
>>>>>>> Yeah, according to the standalone documentation
>>>>>>> http://spark.apache.org/docs/latest/spark-standalone.html
>>>>>>> 
>>>>>>> the default port should be 7077, which means that something must be
>>>>>>> overriding this on your installation - look to the conf scripts!
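>>>>>>> 
>>>>>>> An override would look something like this in conf/spark-env.sh (values are
>>>>>>> hypothetical, not from your install):
>>>>>>> 
>>>>>>> export SPARK_MASTER_PORT=7077         # standalone master port
>>>>>>> export SPARK_MASTER_WEBUI_PORT=8080   # master web UI port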
>>>>>>> 
>>>>>>>> On Mar 9, 2016, at 1:26 PM, Tristan Nixon <stuff@memeticlabs.org> wrote:
>>>>>>>> 
>>>>>>>> Looks like it’s trying to bind on port 0, then 1.
>>>>>>>> Often the low-numbered ports are restricted to system processes and
>>>>>>>> “established” servers (web, ssh, etc.), and so user programs are prevented
>>>>>>>> from binding on them. The default should be to run on a high-numbered port
>>>>>>>> like 8080 or such.
>>>>>>>> 
>>>>>>>> What do you have in your spark-env.sh?
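>>>>>>>> 
>>>>>>>> If it's still the pristine template, every line in it should be commented out.
>>>>>>>> Were something set, the bind-related bit would look roughly like this (example
>>>>>>>> value only):
>>>>>>>> 
>>>>>>>> # conf/spark-env.sh
>>>>>>>> export SPARK_LOCAL_IP=127.0.0.1   # the IP address Spark binds to on this node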
>>>>>>>> 
>>>>>>>>> On Mar 9, 2016, at 12:35 PM, Aida <Aida1.Tefera@gmail.com> wrote:
>>>>>>>>> 
>>>>>>>>> Hi everyone, thanks for all your support
>>>>>>>>> 
>>>>>>>>> I went with your suggestion Cody/Jakob and downloaded a pre-built version
>>>>>>>>> with Hadoop this time and I think I am finally making some progress :)
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> ukdrfs01:spark-1.6.0-bin-hadoop2.6 aidatefera$ ./bin/spark-shell --master local[2]
>>>>>>>>> log4j:WARN No appenders could be found for logger
>>>>>>>>> (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
>>>>>>>>> log4j:WARN Please initialize the log4j system properly.
>>>>>>>>> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
>>>>>>>>> Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
>>>>>>>>> To adjust logging level use sc.setLogLevel("INFO")
>>>>>>>>> Welcome to
>>>>>>>>>       ____              __
>>>>>>>>>      / __/__  ___ _____/ /__
>>>>>>>>>     _\ \/ _ \/ _ `/ __/  '_/
>>>>>>>>>    /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
>>>>>>>>>       /_/
>>>>>>>>> 
>>>>>>>>> Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_73)
>>>>>>>>> Type in expressions to have them evaluated.
>>>>>>>>> Type :help for more information.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
>>>>>>>>> 16/03/09 18:26:57 ERROR SparkContext: Error initializing SparkContext.
>>>>>>>>> java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
>>>>>>>>> at sun.nio.ch.Net.bind0(Native Method)
>>>>>>>>> at sun.nio.ch.Net.bind(Net.java:433)
>>>>>>>>> at sun.nio.ch.Net.bind(Net.java:425)
>>>>>>>>> at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
>>>>>>>>> at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
>>>>>>>>> at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
>>>>>>>>> at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
>>>>>>>>> at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
>>>>>>>>> at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
>>>>>>>>> at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
>>>>>>>>> at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
>>>>>>>>> at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
>>>>>>>>> at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
>>>>>>>>> at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
>>>>>>>>> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>>>>>>>>> at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>>>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>>> java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
>>>>>>>>> at sun.nio.ch.Net.bind0(Native Method)
>>>>>>>>> at sun.nio.ch.Net.bind(Net.java:433)
>>>>>>>>> at sun.nio.ch.Net.bind(Net.java:425)
>>>>>>>>> at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
>>>>>>>>> at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
>>>>>>>>> at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
>>>>>>>>> at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
>>>>>>>>> at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
>>>>>>>>> at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
>>>>>>>>> at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
>>>>>>>>> at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
>>>>>>>>> at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
>>>>>>>>> at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
>>>>>>>>> at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
>>>>>>>>> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
>>>>>>>>> at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>>>>>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>>>>>> 
>>>>>>>>> java.lang.NullPointerException
>>>>>>>>> at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
>>>>>>>>> at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
>>>>>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>>>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>>>>>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>>>>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>>>>>>>> at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
>>>>>>>>> at $iwC$$iwC.<init>(<console>:15)
>>>>>>>>> at $iwC.<init>(<console>:24)
>>>>>>>>> at <init>(<console>:26)
>>>>>>>>> at .<init>(<console>:30)
>>>>>>>>> at .<clinit>(<console>)
>>>>>>>>> at .<init>(<console>:7)
>>>>>>>>> at .<clinit>(<console>)
>>>>>>>>> at $print(<console>)
>>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:497)
>>>>>>>>> at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>>>>>>>>> at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>>>>>>>>> at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>>>>>>>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>>>>>>>>> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>>>>>>>>> at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>>>>>>>>> at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>>>>>>>>> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>>>>>>>>> at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
>>>>>>>>> at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
>>>>>>>>> at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>>>>>>>>> at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
>>>>>>>>> at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
>>>>>>>>> at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>>>>>>>>> at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
>>>>>>>>> at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>>>>>>>>> at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
>>>>>>>>> at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
>>>>>>>>> at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>>>>>>>>> at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>>>>>>> at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>>>>>>> at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>>>>>>>> at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>>>>>>>>> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>>>>>>>>> at org.apache.spark.repl.Main$.main(Main.scala:31)
>>>>>>>>> at org.apache.spark.repl.Main.main(Main.scala)
>>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>>>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>>>>>> at java.lang.reflect.Method.invoke(Method.java:497)
>>>>>>>>> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>>>>>>>>> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>>>>>>>>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>>>>>>>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>>>>>>>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>>>>>>> 
>>>>>>>>> <console>:16: error: not found: value sqlContext
>>>>>>>>> import sqlContext.implicits._
>>>>>>>>>        ^
>>>>>>>>> <console>:16: error: not found: value sqlContext
>>>>>>>>> import sqlContext.sql
>>>>>>>>>        ^
>>>>>>>>> 
>>>>>>>>> scala> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>> 

