spark-user mailing list archives

From RodrigoB <>
Subject Dynamically switching number of allocated cores
Date Mon, 03 Nov 2014 13:17:44 GMT
Hi all,

I can't seem to find a clear answer in the documentation.

Does the standalone cluster support dynamic reassignment of the number of
allocated cores to an application once another app stops?
I'm aware that with Mesos we can have core sharing between active
applications depending on the number of parallel tasks, but I believe my
question is slightly simpler.

For example:
1 - There are 12 cores available in the cluster
2 - I start app A with 2 cores - gets 2
3 - I start app B - gets remaining 10
4 - If I stop app A, app B *does not* pick up the 2 cores that become available
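For reference, the per-app caps in the scenario above would be set via
`spark.cores.max` at submit time (the master host and jar/class names below
are placeholders, not from a real deployment):

```shell
# App A: cap at 2 cores on a standalone master
spark-submit \
  --master spark://master-host:7077 \
  --conf spark.cores.max=2 \
  --class com.example.AppA \
  app-a.jar

# App B: no spark.cores.max set, so by default it grabs all
# cores still available on the cluster (the remaining 10)
spark-submit \
  --master spark://master-host:7077 \
  --class com.example.AppB \
  app-b.jar
```

The question is whether app B's allocation is ever revisited after app A
exits, or whether it stays fixed at whatever was free at submit time.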

Should I expect Mesos to have this scenario working?

Also, the same question applies when we add more cores to the cluster.
Let's say I ideally want 12 cores for my app, but only 10 are available. As
I add more workers, their cores should be assigned to my app dynamically. I
haven't tested this in a while, but I think the app will not even start and
will complain about insufficient resources...

I'd very much appreciate any insight on this!

