spark-user mailing list archives

From Benny Thompson <>
Subject Connection Refused When Running SparkPi Locally
Date Sat, 01 Mar 2014 02:18:01 GMT
I'm trying to run a simple execution of the SparkPi example.  I started the
master and one worker, then executed the job on my local "cluster", but I
end up getting a sequence of errors, all ending with

"Caused by:
Connection refused: /"

I originally tried running my master and worker without any configuration,
but ended up with the same error.  I then tried changing to to see
whether it was just a firewall issue, since the server is locked down from
the outside world.

My conf/ contains the following:

Here are the commands I run, in order:
1) "sbin/" (to start the master)
2) "bin/spark-class org.apache.spark.deploy.worker.Worker spark:// --ip --port 1111" (in a different session on the
same machine to start the slave)
3) "bin/run-example org.apache.spark.examples.SparkPi spark://"
(in a different session on the same machine to start the job)
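For comparison, the documented standalone launch sequence looks like the following sketch. The host name and port are placeholders of mine (7077 is the default master port); the key point is that the worker and the example must use the exact spark:// URL the master prints in its log:

```
# 1) Start the master; its log prints the spark://HOST:PORT URL it bound to
sbin/start-master.sh

# 2) Start a worker, pointing it at that exact master URL
bin/spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077

# 3) Run the example against the same master URL
bin/run-example org.apache.spark.examples.SparkPi spark://localhost:7077
```

A common cause of "Connection refused" in this setup is a mismatch between the host/port the master actually bound to (check the master log) and the URL passed in steps 2 and 3.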

I find it hard to believe that I'm locked down enough that running locally
would cause problems.

Any help is greatly appreciated!

