The code is very simple, just a couple of lines. When I launch it, it runs locally but not on the cluster.

from pyspark import SparkContext
from time import time

sc = SparkContext("local", "Tech Companies Feedback")

beginning_time = time()
# ... job code ...
print time() - beginning_time
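Note that passing "local" as the first argument to SparkContext hardcodes the master in the code, and a master set directly in code takes precedence over the --master flag given to spark-submit, so the job will always run locally. A master-agnostic version of the setup (a sketch, assuming pyspark's standard SparkConf API) would be:

```python
from pyspark import SparkConf, SparkContext

# Leave the master out of the code; the --master flag passed to
# spark-submit then decides where the job runs (local[*], spark://..., etc.).
conf = SparkConf().setAppName("Tech Companies Feedback")
sc = SparkContext(conf=conf)
```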


Thanks for your interest,


On Fri, Jul 31, 2015 at 4:24 AM, Marcelo Vanzin <> wrote:
Can you share the part of the code in your script where you create the SparkContext instance?

On Thu, Jul 30, 2015 at 7:19 PM, fordfarline <> wrote:
Hi All,

I'm having an issue when launching an app (Python) against a standalone
cluster: it runs locally instead and never reaches the cluster.
It's the first time I've tried the cluster; in local mode it works fine.

This is what I did:

-> /home/user/Spark/spark-1.3.0-bin-hadoop2.4/sbin/ # Master and
worker are up at localhost:8080/4040
-> /home/user/Spark/spark-1.3.0-bin-hadoop2.4/bin/spark-submit --master
           * The script runs OK, but locally :(   I can see it at
localhost:4040, but I don't see any job in the cluster UI
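For comparison, submitting to a standalone cluster normally means pointing --master at the Spark master's URL, which is shown at the top of the cluster UI at localhost:8080. A sketch of the submit command; the master host and script name here are placeholders, not values from this thread:

```shell
# Placeholder submit command: replace <master-host> with the real master
# host shown in the 8080 cluster UI, and app.py with your script.
/home/user/Spark/spark-1.3.0-bin-hadoop2.4/bin/spark-submit \
  --master spark://<master-host>:7077 \
  app.py
```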

The only warning is:
WARN Utils: Your hostname, localhost resolves to a loopback address:; using instead (on interface eth0)

I set SPARK_LOCAL_IP= to work around this; at least the warning disappeared,
but the script keeps executing locally, not on the cluster.

I think it has something to do with my virtual server:
-> Host server: Linux Mint
-> The virtual server (Workstation 10) where Spark runs is Linux Mint as well

Any ideas on what I am doing wrong?

Thanks in advance for any suggestions, I'm getting mad over it!!

Sent from the Apache Spark User List mailing list archive.
