mahout-user mailing list archives

From Dmitriy Lyubimov <dlie...@gmail.com>
Subject Re: Problem Starting Spark Shell
Date Thu, 09 Jul 2015 22:22:51 GMT
I don't know. It seems like something is already sitting on the port. The
`lsof` utility may help to figure out what it is.
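
For example, a quick check along these lines (the port 7077 here is only an
assumption, taken from the spark-env.sh in your earlier message) shows who
is listening:

    # List the process listening on TCP port 7077; -n and -P skip DNS
    # and port-name lookups so the raw addresses are shown.
    lsof -iTCP:7077 -sTCP:LISTEN -n -P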

On Wed, Jul 8, 2015 at 8:18 AM, Parimi Rohit <rohit.parimi@gmail.com> wrote:

> Hi Dmitriy,
>
> Please find my answers inline.
>
>
> On Tue, Jul 7, 2015 at 7:48 PM, Dmitriy Lyubimov <dlieu.7@gmail.com>
> wrote:
>
> > These settings are for Spark. The Spark shell only needs the master
> > (local by default), set via the `MASTER` variable.
> >
> > That said, your error indicates that it does try to connect somewhere.
> > Are you able to run the regular Spark shell?
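> >
> > For instance, a quick sanity check (assuming a standalone master on the
> > default port, per your config below) would be:
> >
> > # Start the plain Spark shell against the standalone master; if this
> > # fails too, the problem is in Spark itself rather than Mahout.
> > MASTER=spark://localhost:7077 $SPARK_HOME/bin/spark-shell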
> >
>
> Yes, I was able to start the Spark shell and everything looks fine. I ran
> the lineCount example using the shell and it gave me the correct results.
>
>
> > In the head of the 0.10.x branch you can specify additional Spark
> > properties in MAHOUT_OPTS via -D<spark-property>=<value> if you need to
> > tweak something.
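> >
> > For example (the property here is only an illustration of the pattern,
> > not something you necessarily need):
> >
> > # Any Spark property can be passed through MAHOUT_OPTS; it ends up
> > # on the shell's SparkConf.
> > export MAHOUT_OPTS="$MAHOUT_OPTS -Dspark.executor.memory=2g"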
> >
>
> I followed the instructions at
> http://mahout.apache.org/users/sparkbindings/play-with-shell.html and
> cloned Mahout using the following command:
>
> git clone https://github.com/apache/mahout mahout
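>
> (The page then has you build the checkout; as far as I can tell that is
> the standard Maven invocation, something like the following.)
>
> # Build Mahout from source, skipping the tests.
> cd mahout && mvn -DskipTests clean install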
>
> Also, as you suggested, I tried setting MAHOUT_OPTS with different
> key-value pairs:
>
> MAHOUT_OPTS="$MAHOUT_OPTS -Dmaster=spark://localhost:7077"
>
> and
>
> MAHOUT_OPTS="$MAHOUT_OPTS -DMASTER=http://localhost:8080/spark"
>
> and
>
> MAHOUT_OPTS="$MAHOUT_OPTS -DSPARK_MASTER_IP=localhost"
>
> But I still get the error:
>
> java.net.BindException: Failed to bind to:
> rohitp-mac00.mot.com/100.64.159.30:0: Service 'sparkDriver' failed
> after 16 retries!
>   at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
>   at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:391)
>   at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:388)
>   at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
>   at scala.util.Try$.apply(Try.scala:161)
>   at scala.util.Success.map(Try.scala:206)
>   at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
>   at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
>   at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
>   at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
>   at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
>   at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
>   at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
>   at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
>   at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
>   at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:42)
>   at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
>   at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>   at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>   at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>   at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>
> I am not sure where it is getting the IP address from. Is there anything
> else that I can try?
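>
> One more thing I plan to try (just a guess on my part, not something from
> the docs) is forcing the driver to bind to localhost before starting the
> shell:
>
> # Force the driver to bind to localhost instead of whatever the
> # machine's hostname resolves to; spark.driver.host is the Spark
> # property the driver reads for this.
> export SPARK_LOCAL_IP=127.0.0.1
> export MAHOUT_OPTS="$MAHOUT_OPTS -Dspark.driver.host=localhost"
> bin/mahout spark-shell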
>
> Thanks,
> Rohit
>
>
> > Also, if you are running on a Mac, we don't test on it specifically;
> > Linux only.
> >
> > On Tue, Jul 7, 2015 at 3:08 PM, Parimi Rohit <rohit.parimi@gmail.com>
> > wrote:
> >
> > > Hi All,
> > >
> > > I am trying to start the Spark shell, following the instructions on
> > > the following page:
> > >
> > > http://mahout.apache.org/users/sparkbindings/play-with-shell.html
> > >
> > > I installed Spark and created the spark-env.sh file with the following
> > > content:
> > >
> > > export SPARK_LOCAL_IP=localhost
> > > export SPARK_MASTER_IP=localhost
> > > export SPARK_MASTER_PORT=7077
> > > export SPARK_WORKER_CORES=2
> > > export SPARK_WORKER_MEMORY=2g
> > > export SPARK_WORKER_INSTANCES=2
> > >
> > > Also, I added the following variables to my profile file:
> > >
> > > export MAHOUT_HOME=$HOME/Documents/Softwares/Mahout_Master/mahout/
> > > export SPARK_HOME=$HOME/Documents/Softwares/spark-1.1.1/
> > > export MASTER=spark://localhost:7077
> > >
> > >
> > > However, when I start the Spark shell using the command:
> > >
> > > bin/mahout spark-shell
> > >
> > > I get a long error message about a failed connection to an IP address
> > > (Google Drive link to the error file is attached below). I am not sure
> > > where it is getting the IP address from, or how to force it to use
> > > localhost instead. Any help is much appreciated.
> > >
> > > Thanks,
> > > Rohit
> > >
> > > MahoutScala_Error.rtf
> > > <https://drive.google.com/file/d/0B5oLnkSdJzQVS1RoTGZsdFJ5aEE/view?usp=drive_web>
> > >
> >
>
