spark-user mailing list archives

From	cjwang
Subject Creating an RDD in another RDD causes deadlock
Date Tue, 02 Sep 2014 21:14:48 GMT
My code seemed to deadlock when I tried to do this:

object MoreRdd extends Serializable {
	def apply(i: Int) = {
		val rdd2 = sc.parallelize(0 to 10).map(j => i*10 + j)
		rdd2.collect
	}
}

val rdd1 = sc.parallelize(0 to 10)
val y = => MoreRdd(i)).collect

It never reached the last line.  The code seemed to be deadlocked somewhere,
since my CPU load was quite low.

Is there a restriction against creating an RDD while another one is still
active?  Is it because one worker can only handle one task?  How do I work
around this?
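(One possible workaround, sketched below under the assumption that the goal is just to compute i*10 + j for each pair: avoid creating an RDD inside another RDD's map() at all, since RDDs and the SparkContext are only usable on the driver. Compute the inner values with a plain local collection instead. `MoreLocal` is a hypothetical name, not from the original post.)

```scala
// Workaround sketch (assumption: the intent is the cross product i*10 + j).
// Instead of building an RDD inside another RDD's map() -- which cannot work,
// because SparkContext only exists on the driver -- compute the inner values
// with an ordinary local collection. Only the driver ever creates RDDs.
object MoreLocal extends Serializable {
  def apply(i: Int): Seq[Int] = (0 to 10).map(j => i * 10 + j)
}

// On the driver, the Spark version would then be:
//   val y = sc.parallelize(0 to 10).map(i => MoreLocal(i)).collect
// The same logic, run locally for illustration:
val y = (0 to 10).map(i => MoreLocal(i))
```

Alternatively, if both dimensions really are large, two driver-created RDDs can be combined with `rdd1.cartesian(rdd2)` and a single `map` over the pairs, which keeps all RDD creation on the driver.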
