spark-user mailing list archives

From chandan prakash <chandanbaran...@gmail.com>
Subject spark.streaming.concurrentJobs parameter in Spark Streaming
Date Sun, 01 May 2016 02:29:38 GMT
I have a question about the spark.streaming.concurrentJobs parameter.

With
 spark.streaming.concurrentJobs = 2
 I can see 2 jobs being processed in parallel.

But with
spark.streaming.concurrentJobs = 10,
 I still see only 4-5 jobs being processed concurrently, with the remaining
jobs getting queued while many cores on my machines remain idle.
I have 2 machines with 24 cores each for Spark processing.
Is there any reason why jobs are getting queued when the concurrentJobs
parameter allows more jobs and there are cores available as well?
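For reference, this is roughly how the parameter would be set, a minimal
sketch rather than my actual job (the socket source, batch interval, and app
name below are placeholders):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object ConcurrentJobsSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("ConcurrentJobsSketch")
      // number of streaming jobs the JobScheduler may run concurrently
      // (defaults to 1); must be set before the StreamingContext is created
      .set("spark.streaming.concurrentJobs", "10")

    val ssc = new StreamingContext(conf, Seconds(1))  // placeholder batch interval

    // placeholder source/output, just so each batch produces a job
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.foreachRDD(rdd => println(rdd.count()))

    ssc.start()
    ssc.awaitTermination()
  }
}

The same setting can also be passed on the command line via
spark-submit --conf spark.streaming.concurrentJobs=10.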




Thanks,

-- 
Chandan Prakash
