spark-user mailing list archives

From Ian <>
Subject Re: List of questions about spark
Date Mon, 30 May 2016 09:11:38 GMT
No, the limit is set by your setup. If you run Spark on a YARN cluster, the
number of jobs that can run concurrently depends on the resources allocated to
each job and on how the YARN queues are configured. For instance, with the
FIFO scheduler (the default), the first job can take up all of the resources,
and every other job has to wait until it finishes. With the FAIR scheduler, on
the other hand, the number of concurrently running jobs is limited only by the
resources available on the cluster.
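For reference, the FIFO-vs-FAIR choice the reply describes can also be made
within a single Spark application via Spark's own scheduler mode, which
defaults to FIFO. A minimal sketch (the application jar name `my-app.jar` and
the pool file `fairscheduler.xml` are illustrative, not from the original
message):

```shell
# Submit an application with Spark's FAIR scheduling mode instead of the
# default FIFO, so jobs submitted from different threads within the
# application share executors rather than queueing behind the first job.
# The optional allocation file defines named fair-scheduler pools.
spark-submit \
  --conf spark.scheduler.mode=FAIR \
  --conf spark.scheduler.allocation.file=fairscheduler.xml \
  my-app.jar
```

Note this controls scheduling *inside* one application; how many applications
run concurrently on the cluster is still governed by the YARN queue
configuration discussed above.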
