spark-user mailing list archives

From Andrew Or <and...@databricks.com>
Subject Re: Spark Shell stuck on standalone mode
Date Tue, 20 May 2014 23:55:29 GMT
Hi Sid,

There are two ports for the Spark master. One is the UI port, which is 8080
by default. The other is the port at which workers connect to the master.
This is 7077 by default.

It looks like you changed your UI port from 8080 to 38955. However, the
port that the master listens on is still 7077, unless you manually
configured SPARK_MASTER_PORT as well.
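
For reference, here is a minimal sketch of how the two ports can be set explicitly in conf/spark-env.sh on the master node (the 38955 value is just an example based on your UI port):

  # conf/spark-env.sh on the master node
  export SPARK_MASTER_PORT=7077         # port that workers and applications connect to
  export SPARK_MASTER_WEBUI_PORT=38955  # web UI port (8080 by default)

The spark:// URL shown at the top of the master's web UI should then end in 7077.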

Can you try "MASTER=spark://<master_node>:7077" instead?

Andrew


2014-05-20 16:39 GMT-07:00 Sidharth Kashyap <sidharth.n.kashyap@outlook.com>:

> Hello,
>
> Thanks for the help.
>
> I changed the port number from 8080 to 38955 to avoid port conflicts.
>
> I checked the web UI and confirmed that the master and slave nodes are
> alive.
>
> Still stuck :(
>
> Will the change in port number cause any problems? Is there any possibility
> of a hardcoded port number underneath?
>
> Thanks,
> Sid
>
> ------------------------------
> Date: Mon, 19 May 2014 09:29:09 +0800
> From: anishsneh@yahoo.co.in
>
> Subject: Re: Spark Shell stuck on standalone mode
> To: user@spark.apache.org
>
>
> Sid, also check with jps whether both the worker and the master are up and running.
> The master's web UI should be at http://localhost:8080 by default.
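>
> For example, running jps on each node should list the standalone daemons
> (just a sketch; the PIDs below are made up and will differ on your cluster):
>
>   $ jps                 # on the master node
>   12345 Master
>   12346 Jps
>
>   $ jps                 # on a worker node
>   23456 Worker
>   23457 Jps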
>
> Thanks & regards
> --
> Anish Sneh
> http://in.linkedin.com/in/anishsneh
>
>  ------------------------------
> From: Andrew Ash <andrew@andrewash.com>
> To: <user@spark.apache.org>
> Subject: Re: Spark Shell stuck on standalone mode
> Sent: Mon, May 19, 2014 1:18:45 AM
>
> Usually a spark:// URL is on port 7077, not 38955. Can you make sure
> you're using the same URL as the one that appears in the web UI on the Spark master?
>
> MASTER=spark://<master_node>:7077 $SPARK_HOME/bin/spark-shell
>
>
> Here is a screenshot of where the URL should appear in the Spark master
> web UI. It's the first row, labeled "URL".
>
> [image: Inline image 1]
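>
> If you'd rather check from the command line, the same URL is normally printed
> in the master's log when it starts up. A rough sketch, assuming the default log
> location (the exact file name and wording vary by version):
>
>   $ grep "spark://" $SPARK_HOME/logs/spark-*-org.apache.spark.deploy.master.Master-*.out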
>
> Cheers!
> Andrew
>
>
>
> On Sun, May 18, 2014 at 4:02 PM, Sidharth Kashyap <sidharth.n.kashyap@outlook.com> wrote:
>
> Hi,
>
> I have configured a cluster with 10 slaves and one master.
>
> The master web UI shows all the slaves and appears to be correctly
> configured.
>
> I started the master node with the command
>
> MASTER=spark://<master_node>:38955 $SPARK_HOME/bin/spark-shell
>
> This brings up the REPL, but with the following error message:
>
> "14/05/18 23:34:39 ERROR AppClient$ClientActor: Master removed our
> application: FAILED; stopping client"
>
> scala> val textFile = sc.textFile("CHANGES.txt")
> textFile: org.apache.spark.rdd.RDD[String] = MappedRDD[1] at textFile at
> <console>:12
>
> scala> textFile.count()
>
> and control never returns in the REPL, as shown in the attachment.
>
> Am I doing something wrong?
>
> Please help
>
> Regards,
> Sid
>
>
>
