spark-user mailing list archives

From davidkl <>
Subject Re: Spark Streaming scheduling control
Date Mon, 20 Oct 2014 10:06:59 GMT
Thanks Akhil Das-2: I actually tried setting spark.default.parallelism, but it
had no effect :-/

I am running standalone and performing a mix of map/filter/foreachRDD. 

I had to force parallelism with repartition to get both workers to process
tasks, but I do not think this should be required (and I am not sure it is
optimal either). As I mentioned, without forcing it with repartition,
scheduled tasks keep accumulating on the queue over time, so I would expect
Spark to assign those to the idle worker. Is my assumption wrong? :-)
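For reference, the workaround looks roughly like this. This is a minimal sketch, not the actual job: the socket source, host/port, batch interval, and partition count are placeholder assumptions. The relevant point is that spark.default.parallelism only affects shuffle-based operations, while an explicit repartition forces the receiver's blocks to be spread across workers:

```scala
// Sketch of forcing parallelism on a DStream (placeholder source/values).
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object RepartitionSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("streaming-repartition-sketch")
      // Sets the default partition count for shuffle operations only;
      // it does NOT change how a receiver's blocks are placed.
      .set("spark.default.parallelism", "4")

    val ssc = new StreamingContext(conf, Seconds(1))

    // Placeholder input: a single socket receiver. All of its blocks
    // land on the worker hosting the receiver.
    val lines = ssc.socketTextStream("localhost", 9999)

    // Without this, tasks for each batch run only where the receiver's
    // blocks live, leaving the other worker idle. repartition shuffles
    // the data so both workers get tasks.
    val spread = lines.repartition(4)

    spread
      .map(_.trim)
      .filter(_.nonEmpty)
      .foreachRDD { rdd => rdd.foreach(println) }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

This sketch needs a running Spark standalone cluster and a socket source, so it is a configuration illustration rather than something runnable in isolation.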

