spark-user mailing list archives

From Ian <psilonl...@gmail.com>
Subject Re: List of questions about Spark
Date Mon, 30 May 2016 09:11:38 GMT
No, the limit is given by your setup. If you run Spark on a YARN cluster, the
number of concurrent jobs is limited by the resources allocated to each job
and by how the YARN queues are set up. For instance, with the FIFO scheduler
(the default), the first job can take up all the resources, so every other
job has to wait until that job is done. With the FAIR scheduler, on the other
hand, the number of jobs that run concurrently is limited only by the
resources available on the cluster.
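For the FAIR case, here is a minimal Scala sketch; the app name and the two
toy jobs are my own illustration, not from the thread:

    import org.apache.spark.{SparkConf, SparkContext}

    object FairSchedulerSketch {
      def main(args: Array[String]): Unit = {
        // Switch Spark's internal job scheduler from the default FIFO to
        // FAIR, so jobs submitted from separate threads share executors
        // instead of queueing behind one another.
        val conf = new SparkConf()
          .setAppName("fair-scheduler-sketch")
          .set("spark.scheduler.mode", "FAIR")
        val sc = new SparkContext(conf)

        // Each action (count, collect, ...) triggers a Spark job; submitting
        // actions from different threads lets the FAIR scheduler interleave
        // them rather than run them strictly one after another.
        val t1 = new Thread(new Runnable {
          def run(): Unit =
            println(sc.parallelize(1 to 1000000).map(_ * 2).count())
        })
        val t2 = new Thread(new Runnable {
          def run(): Unit =
            println(sc.parallelize(1 to 1000000).filter(_ % 3 == 0).count())
        })
        t1.start(); t2.start()
        t1.join(); t2.join()

        sc.stop()
      }
    }

Note that on YARN the queue is chosen at submit time (e.g. spark-submit
--queue myqueue ..., where "myqueue" is just a placeholder), and whether jobs
in that queue actually run concurrently depends on which scheduler YARN
itself is configured with.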




