spark-user mailing list archives

From davidkl <davidkl...@hotmail.com>
Subject Re: Spark Streaming scheduling control
Date Mon, 20 Oct 2014 10:06:59 GMT
Thanks Akhil Das: I actually tried setting spark.default.parallelism, but it
had no effect :-/

I am running in standalone mode, performing a mix of map/filter/foreachRDD.

I had to force parallelism with repartition to get both workers to process
tasks, but I do not think this should be required (and I am not sure it is
optimal either). As I mentioned, without forcing it with repartition,
scheduled tasks accumulate in the queue over time, so I would expect Spark to
assign those to the idle worker. Is my assumption wrong? :-)
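For reference, this is roughly what I mean (a minimal sketch only; the socket
source, partition count of 8, batch interval, and app name are placeholders,
not details from my actual job):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf()
  .setAppName("StreamingParallelismSketch")
  // Hint for the default number of partitions in shuffles; on its own,
  // this did not change the task distribution for me.
  .set("spark.default.parallelism", "8")

val ssc = new StreamingContext(conf, Seconds(5))
val lines = ssc.socketTextStream("localhost", 9999)

// Only after explicitly repartitioning each micro-batch do the tasks
// get spread across both workers:
val spread = lines.repartition(8)

spread.map(_.toUpperCase).filter(_.nonEmpty).foreachRDD { rdd =>
  // Process each micro-batch; partitions.size shows how far the
  // batch has actually been split up.
  println(s"partitions = ${rdd.partitions.size}")
}

ssc.start()
ssc.awaitTermination()
```

Without the repartition call, the processing stays on a single worker even
though the other one sits idle.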

Thanks



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-scheduling-control-tp16778p16805.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

