spark-user mailing list archives

From Sidharth Kashyap <>
Subject Spark Shell stuck on standalone mode
Date Sun, 18 May 2014 23:02:06 GMT
I have configured a cluster with 10 slaves and one master.
The master web portal shows all the slaves, so the cluster appears to be correctly configured.
I started the Spark shell against the master with the command
MASTER=spark://<master_node>:38955 $SPARK_HOME/bin/spark-shell
The REPL comes up, but with the following error message:
14/05/18 23:34:39 ERROR AppClient$ClientActor: Master removed our application: FAILED; stopping

scala> val textFile = sc.textFile("CHANGES.txt")
textFile: org.apache.spark.rdd.RDD[String] = MappedRDD[1] at textFile at <console>:12

scala> textFile.count()
and control never returns to the REPL, as shown in the attachment.
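For context, the usual way to bring up a standalone cluster before connecting a shell is roughly the sequence below. This is a sketch only: the script names are the standard ones shipped in Spark's sbin/ directory, but exact arguments vary by Spark version, and the hostname placeholder and port 38955 are taken from the message above, not from a verified setup.

```shell
# On the master node: start the standalone master.
# The master URL (spark://<host>:<port>) is printed in its log
# and shown on the web UI; it must match what the shell uses.
$SPARK_HOME/sbin/start-master.sh

# On the master node: start a worker on every host listed in
# conf/slaves (workers register themselves with the master).
$SPARK_HOME/sbin/start-slaves.sh

# Then connect a shell to the cluster (the poster's command).
MASTER=spark://<master_node>:38955 $SPARK_HOME/bin/spark-shell
```

If the master URL or port in the shell command does not exactly match the one the master advertises, the application is typically removed with a FAILED status like the one quoted above.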
Am I doing something wrong?
Please help.