spark-user mailing list archives

From Andrew Ash <and...@andrewash.com>
Subject Re: Spark Shell stuck on standalone mode
Date Mon, 19 May 2014 01:18:45 GMT
Usually a spark:// URL is on port 7077, not 38955. Can you make sure you're
using the same URL as the one shown in the Spark master's web UI?

MASTER=spark://<master_node>:7077 $SPARK_HOME/bin/spark-shell
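
For reference, a minimal sketch of bringing up a standalone master on the default port and pointing the shell at it. The hostname `master-node` is a placeholder; substitute your own, and use the exact URL the master's web UI reports:

```shell
# Start the standalone master on the default port (7077);
# its web UI defaults to port 8080.
$SPARK_HOME/sbin/start-master.sh --port 7077 --webui-port 8080

# Then launch the shell against the URL shown in that web UI.
MASTER=spark://master-node:7077 $SPARK_HOME/bin/spark-shell
```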


Here is a screenshot of where the URL appears in the Spark master
web UI. It's the first row, labeled "URL".

[image: screenshot of the Spark master web UI]
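
If the shell still hangs, one quick sanity check (a sketch, not from the original thread) is to confirm that the host and port in your MASTER value parse the way you expect. A spark:// URL follows the usual scheme://host:port form, so the standard library can pick it apart; the URL below is a hypothetical example:

```python
from urllib.parse import urlparse

# Hypothetical master URL -- substitute the one shown in your
# master's web UI.
master_url = "spark://master-node:7077"

parsed = urlparse(master_url)
print(parsed.hostname)  # master-node
print(parsed.port)      # 7077

# The port must match the one the master actually bound:
# 7077 by default for a standalone master, not an ephemeral
# port such as 38955.
assert parsed.port == 7077
```

Once connected, `sc.master` inside the shell also tells you which master URL the shell actually used.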

Cheers!
Andrew



On Sun, May 18, 2014 at 4:02 PM, Sidharth Kashyap <
sidharth.n.kashyap@outlook.com> wrote:

> Hi,
>
> I have configured a cluster with 10 slaves and one master.
>
> The master web portal shows all the slaves and looks to be rightly
> configured.
>
> I started the master node with the command
>
> MASTER=spark://<master_node>:38955 $SPARK_HOME/bin/spark-shell
>
> This brings up the REPL, though with the following message:
>
> " 14/05/18 23:34:39 ERROR AppClient$ClientActor: Master removed our
> application: FAILED; stopping client"
>
> scala> val textFile = sc.textFile("CHANGES.txt")
> textFile: org.apache.spark.rdd.RDD[String] = MappedRDD[1] at textFile at
> <console>:12
>
> scala> textFile.count()
>
> and control never returns to the REPL prompt, as shown in the attachment.
>
> Am I doing something wrong?
>
> Please help
>
> Regards,
> Sid
>
