https://issues.apache.org/jira/browse/SPARK-19782?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15890011#comment-15890011
Sean Owen commented on SPARK-19782:
-----------------------------------
I'm not clear on what this means. Spark apps are configured to request a certain amount of resources
from the resource manager, and they can already tailor their usage with dynamic allocation. The
resource manager is the component in charge of how much resource exists and how much to give to
an app.
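For context, a minimal sketch of how an application expresses that request today, assuming the standard Spark 2.x configuration keys; the per-executor sizes and executor bounds below are illustrative values, and the resource manager still decides what is actually granted:

{code:scala}
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// The application only *requests* resources; YARN/Mesos/standalone decides what to grant.
val conf = new SparkConf()
  .setAppName("resource-request-example")
  .set("spark.executor.cores", "4")                  // cores requested per executor (illustrative)
  .set("spark.executor.memory", "8g")                // memory requested per executor (illustrative)
  .set("spark.dynamicAllocation.enabled", "true")    // let Spark grow/shrink the executor count
  .set("spark.dynamicAllocation.minExecutors", "1")
  .set("spark.dynamicAllocation.maxExecutors", "20")

val spark = SparkSession.builder().config(conf).getOrCreate()
{code}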
> Spark query available cores from application
> --------------------------------------------
>
> Key: SPARK-19782
> URL: https://issues.apache.org/jira/browse/SPARK-19782
> Project: Spark
> Issue Type: New Feature
> Components: Spark Core
> Affects Versions: 2.1.0
> Reporter: Tom Lewis
>
> It might be helpful for Spark jobs to self-regulate resources if they could query how
> many cores exist on an executing system, not just how many are being used at a given time.
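A hedged sketch of how an application can approximate this today: run a small probe job that asks each executor host for its JVM-visible core count via Runtime.getRuntime.availableProcessors(). This only reports cores on the hosts the app's executors happen to land on, not a cluster-wide figure from the resource manager, which is the gap this request is about.

{code:scala}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("core-probe").getOrCreate()
val sc = spark.sparkContext

// Fan a trivial job out across the executors and record each host's JVM-visible core count.
val coresByHost = sc
  .parallelize(1 to 1000, numSlices = 100)
  .map { _ =>
    val host  = java.net.InetAddress.getLocalHost.getHostName
    val cores = Runtime.getRuntime.availableProcessors()
    (host, cores)
  }
  .distinct()
  .collect()
  .toMap

println(s"JVM-visible cores per executor host: $coresByHost")
println(s"Default parallelism granted to this app: ${sc.defaultParallelism}")
{code}

Note that availableProcessors() may reflect container or cgroup limits rather than the machine's full core count, depending on the cluster manager and JVM version.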