spark-issues mailing list archives

From "Patrick Wendell (JIRA)" <>
Subject [jira] [Updated] (SPARK-1099) Allow inferring number of cores with local[*]
Date Mon, 07 Apr 2014 20:11:14 GMT


Patrick Wendell updated SPARK-1099:

    Summary: Allow inferring number of cores with local[*]  (was: Spark's local mode should
respect spark.cores.max by default)

> Allow inferring number of cores with local[*]
> ---------------------------------------------
>                 Key: SPARK-1099
>                 URL:
>             Project: Spark
>          Issue Type: Improvement
>          Components: Deploy
>            Reporter: Aaron Davidson
>            Assignee: Aaron Davidson
>            Priority: Minor
>             Fix For: 1.0.0
> It seems reasonable that the default number of cores used by Spark's local mode (when
no value is specified) should be drawn from the spark.cores.max configuration parameter (which,
conveniently, is now settable as a command-line option in spark-shell).
> For the sake of consistency, this change would probably also entail making the default
number of cores, when spark.cores.max is NOT specified, equal to the number of logical cores
on the machine (which is what standalone mode does). This too seems reasonable: Spark is
inherently a distributed system, and it is expected to use multiple cores by default. However,
it is a behavioral change, and thus requires caution.
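The behavior proposed above can be illustrated with a minimal sketch. This is not Spark's actual implementation; it is a hypothetical parser showing how a master string such as "local[*]" could be mapped to a worker-thread count, with "*" taken to mean "all logical cores on the machine":

```java
import java.util.OptionalInt;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch (not Spark's real code) of inferring the core count
// from a local master URL, as proposed in this ticket.
public class LocalMaster {
    // Matches "local[N]" for a number N, or "local[*]".
    private static final Pattern LOCAL_N = Pattern.compile("local\\[(\\d+|\\*)\\]");

    static OptionalInt coresFor(String master) {
        if (master.equals("local")) {
            return OptionalInt.of(1); // plain "local" = a single worker thread
        }
        Matcher m = LOCAL_N.matcher(master);
        if (m.matches()) {
            String n = m.group(1);
            return n.equals("*")
                // "*" infers the machine's logical core count
                ? OptionalInt.of(Runtime.getRuntime().availableProcessors())
                : OptionalInt.of(Integer.parseInt(n));
        }
        return OptionalInt.empty(); // not a local master URL
    }
}
```

Under this scheme a shell could then be launched against all local cores with something like `MASTER=local[*] ./bin/spark-shell`, rather than hard-coding a thread count.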

This message was sent by Atlassian JIRA
