spark-user mailing list archives

From Hao Wang <wh.s...@gmail.com>
Subject BUG? Why does MASTER have to be set to spark://hostname:port?
Date Fri, 13 Jun 2014 14:27:26 GMT
Hi all,

When I try to run Spark PageRank using:

./bin/spark-submit \
--master spark://192.168.1.12:7077 \
--class org.apache.spark.examples.bagel.WikipediaPageRank \
~/Documents/Scala/WikiPageRank/target/scala-2.10/wikipagerank_2.10-1.0.jar \
hdfs://192.168.1.12:9000/freebase-13G 0.000005 100 True

I don't specify the Spark master via SparkContext.setMaster() in the PageRank code.
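
For reference, the driver builds its SparkContext roughly like this (a minimal sketch, not the exact WikipediaPageRank source), so the --master value passed to spark-submit should be the one that applies:

import org.apache.spark.{SparkConf, SparkContext}

object PageRankDriver {
  def main(args: Array[String]): Unit = {
    // No setMaster() here; the master URL is expected to come from spark-submit.
    val conf = new SparkConf().setAppName("WikipediaPageRank")
    val sc = new SparkContext(conf)
    // ... PageRank job logic ...
    sc.stop()
  }
}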

Unfortunately, it hangs here:

14/06/13 22:09:43 INFO DAGScheduler: Submitting 104 missing tasks from Stage 0 (MappedRDD[1] at textFile at WikipediaPageRank.scala:59)
14/06/13 22:09:43 INFO TaskSchedulerImpl: Adding task set 0.0 with 104 tasks
14/06/13 22:09:58 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory

But after I change --master to spark://hostname:7077 (the hostname instead of the IP address), it works normally.
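
A quick way to confirm which URL the driver actually ended up with (a hypothetical check, assuming sc is the SparkContext in the driver) would be:

// Prints whether the IP form or the hostname form of the master URL reached the driver.
println(sc.getConf.get("spark.master"))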

Regards,
Wang Hao(王灏)

CloudTeam | School of Software Engineering
Shanghai Jiao Tong University
Address: 800 Dongchuan Road, Minhang District, Shanghai, 200240
Email: wh.sjtu@gmail.com
