spark-user mailing list archives

From Benny Thompson <ben.d.tho...@gmail.com>
Subject Connection Refused When Running SparkPi Locally
Date Sat, 01 Mar 2014 02:18:01 GMT
I'm trying to run a simple execution of the SparkPi example.  I started the
master and one worker, then executed the job on my local "cluster", but ended
up getting a sequence of errors, all ending with

"Caused by:
akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2:
Connection refused: /127.0.0.1:39398"

I originally tried running my master and worker without any configuration but
ended up with the same error.  I then tried changing everything to 127.0.0.1
to test whether it was just a firewall issue, since the server is locked down
from the outside world.

My conf/spark-env.sh contains the following:
export SPARK_MASTER_IP=127.0.0.1

Here are the commands I run, in order:
1) "sbin/start-master.sh" (to start the master)
2) "bin/spark-class org.apache.spark.deploy.worker.Worker
spark://127.0.0.1:7077 --ip 127.0.0.1 --port 1111" (in a different session
on the same machine to start the worker)
3) "bin/run-example org.apache.spark.examples.SparkPi spark://127.0.0.1:7077"
(in a different session on the same machine to start the job)
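For reference, the three steps above can be collected into one sketch; it assumes it is run from the Spark installation directory, with conf/spark-env.sh exporting SPARK_MASTER_IP=127.0.0.1 as described:

```shell
#!/usr/bin/env bash
# Sketch of the launch sequence above (standalone mode, everything on
# loopback). Run from the Spark installation root.

# 1) Start the standalone master; it serves spark://127.0.0.1:7077.
sbin/start-master.sh

# 2) Start one worker bound to loopback, in the background instead of a
#    separate session.
bin/spark-class org.apache.spark.deploy.worker.Worker \
    spark://127.0.0.1:7077 --ip 127.0.0.1 --port 1111 &

# 3) Submit the SparkPi example against the local master.
bin/run-example org.apache.spark.examples.SparkPi spark://127.0.0.1:7077
```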

I find it hard to believe that I'm locked down enough that running locally
would cause problems.

Any help is greatly appreciated!

Thanks,
Benny
