spot-dev mailing list archives

From Mark Grover <m...@apache.org>
Subject Re: Number of executors and dynamic allocation
Date Mon, 03 Apr 2017 19:34:59 GMT
On Mon, Apr 3, 2017 at 12:27 PM, Smith, Nathanael P <
nathanael.p.smith@intel.com> wrote:

> I agree,
>
> There are a few things in ml_ops that need to be rethought.
> For this instance we could add a line in spot.conf,
> i.e. SPK_DYNAMIC_ALLOCATION=true
>
> in ml_ops you could do something like:
> if SPK_DYNAMIC_ALLOCATION == true then
>         "--conf spark.dynamicAllocation.maxExecutors=${SPK_EXEC} --conf
> spark.dynamicAllocation.enabled=true"
> else
>         "--num-executors ${SPK_EXEC}"
>
> I'm not sure whether both flags can go in the same position in the
> spark-submit call; that could be a concern.
> Also, can you set maxExecutors without enabling dynamicAllocation?
>
Nope. Without dynamic allocation, Spark uses a static number of executors
throughout the job; there is no max or min. If you specify the max-executors
config anyway, it'd either be ignored or result in an error (likely just
ignored; I haven't tested).

>
> - Nathanael
>
>
>
> > On Apr 3, 2017, at 8:19 AM, Barona, Ricardo <ricardo.barona@intel.com>
> wrote:
> >
> > I created this bug in JIRA: https://issues.apache.org/jira/browse/SPOT-136
> > We can continue discussing the details in there.
> >
> > Thanks.
> >
> > On 4/3/17, 10:12 AM, "Barona, Ricardo" <ricardo.barona@intel.com> wrote:
> >
> >    Not a silly question at all; it's actually a really good
> observation. I think we didn't update the ml_ops.sh script correctly when
> we added these parameters.
> >
> >    What we could start discussing is whether we want dynamic allocation
> or a fixed number of executors. I'll leave the mic open to
> see what people think about this.
> >
> >    Thanks Giacomo.
> >
> >    On 4/3/17, 3:29 AM, "Giacomo Bernardi" <mino@minux.it> wrote:
> >
> >        Hi,
> >        hope this is not a silly question. In ml_ops.sh there are:
> >          --num-executors ${SPK_EXEC} \
> >        and:
> >          --conf spark.dynamicAllocation.enabled=true \
> >
> >        which trigger the warning:
> >          WARN spark.SparkContext: Dynamic Allocation and num executors both
> >          set, thus dynamic allocation disabled.
> >
> >        Shouldn't we remove the "--num-executors" and add instead:
> >          --conf spark.dynamicAllocation.maxExecutors=${SPK_EXEC} \
> >        ?
> >
> >        Thanks.
> >        Giacomo
> >
> >
> >
> >
>
>
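The conditional Nathanael sketches above could look roughly like the
following in ml_ops.sh. This is only a sketch: `SPK_DYNAMIC_ALLOCATION` is
the *proposed* spot.conf variable (it does not exist yet), and the
`spark_exec_flags` helper name is made up here for illustration.

```shell
#!/bin/bash
# Sketch of the proposed branch for ml_ops.sh.
# SPK_DYNAMIC_ALLOCATION and SPK_EXEC would normally be sourced from
# spot.conf; here they are plain arguments so the logic can be tested
# standalone, without a Spark cluster.

spark_exec_flags() {
    local dynamic="$1"   # "true" or "false", i.e. SPK_DYNAMIC_ALLOCATION
    local execs="$2"     # executor count, i.e. SPK_EXEC

    if [ "${dynamic}" = "true" ]; then
        # Let Spark scale executors up and down, capped at SPK_EXEC,
        # instead of pinning a fixed count (which disables dynamic
        # allocation, per the warning Giacomo quotes).
        echo "--conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.maxExecutors=${execs}"
    else
        # Static allocation: request a fixed number of executors.
        echo "--num-executors ${execs}"
    fi
}

# The flags that would be spliced into the spark-submit invocation:
spark_exec_flags true 8
spark_exec_flags false 8
```

The two branches are mutually exclusive on purpose: passing
`--num-executors` alongside `spark.dynamicAllocation.enabled=true` is
exactly what triggers the "dynamic allocation disabled" warning in the
first place.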
