spark-user mailing list archives

From	cjwang <...@cjwang.us>
Subject Creating an RDD in another RDD causes deadlock
Date Tue, 02 Sep 2014 21:14:48 GMT
My code seemed to deadlock when I tried to do this:

object MoreRdd extends Serializable {
	def apply(i: Int) = {
		val rdd2 = sc.parallelize(0 to 10)
		rdd2.map(j => i*10 + j).collect
	}
}

val rdd1 = sc.parallelize(0 to 10)
val y = rdd1.map(i => MoreRdd(i)).collect
	  
y.toString()


It never reached the last line.  The code seemed to be deadlocked somewhere, since my
CPU load was quite low.

Is there a restriction against creating an RDD while another one is still
active?  Is it because one worker can only handle one task?  How do I work
around this?

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Creating-an-RDD-in-another-RDD-causes-deadlock-tp13302.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

