spark-issues mailing list archives

From "Aaron Davidson (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-1099) Allow inferring number of cores with local[*]
Date Mon, 07 Apr 2014 23:56:14 GMT

     [ https://issues.apache.org/jira/browse/SPARK-1099?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aaron Davidson resolved SPARK-1099.
-----------------------------------

    Resolution: Fixed

> Allow inferring number of cores with local[*]
> ---------------------------------------------
>
>                 Key: SPARK-1099
>                 URL: https://issues.apache.org/jira/browse/SPARK-1099
>             Project: Spark
>          Issue Type: Improvement
>          Components: Deploy
>            Reporter: Aaron Davidson
>            Assignee: Aaron Davidson
>            Priority: Minor
>             Fix For: 1.0.0
>
>
> It seems reasonable that the default number of cores used by Spark's local mode (when no value is specified) should be drawn from the spark.cores.max configuration parameter (which, conveniently, is now settable as a command-line option in spark-shell).
> For the sake of consistency, this change would probably also entail making the default number of cores, when spark.cores.max is NOT specified, equal to the number of logical cores on the machine (which is what standalone mode does). This too seems reasonable, as Spark is inherently a distributed system and it's expected to use multiple cores by default. However, it is a behavioral change, and thus requires caution.
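
A minimal sketch (not taken from the ticket itself) of how the resolved behavior is typically used from application code, assuming Spark 1.0.0 or later where this fix ships; the object name and app name below are illustrative. The "local[*]" master string runs Spark locally with as many worker threads as the machine reports logical cores, instead of a hard-coded "local[N]".

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative example: "local[*]" sizes the local scheduler to the
    // machine's logical core count rather than a fixed thread count.
    object LocalStarExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("local-star-example") // illustrative app name
          .setMaster("local[*]")            // infer worker threads from logical cores
        val sc = new SparkContext(conf)
        println(s"Default parallelism: ${sc.defaultParallelism}")
        sc.stop()
      }
    }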



--
This message was sent by Atlassian JIRA
(v6.2#6252)
