spark-user mailing list archives

Subject Re: Dynamically change executors settings
Date Sat, 27 Aug 2016 03:11:55 GMT

No, currently you can't change the setting. 

// maropu

On 2016/08/27 at 11:40, Vadim Semenov <> wrote:

> Hi spark users,
> I wonder if it's possible to change executor settings on-the-fly.
> I have the following use-case: I have a lot of non-splittable, skewed files in a custom
> format that I read using a custom Hadoop RecordReader. These files can range from small
> to huge, and I'd like to use only one or two cores per executor while they are being
> processed (so each task can use the whole heap). But once they are processed, I'd like
> to enable all cores.
> I know that I can achieve this by splitting the work into two separate jobs, but I wonder
> if it's possible to somehow achieve the behavior I described within a single application.
> Thanks!
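
For readers landing on this thread: since executor settings are fixed for the lifetime of a Spark application, the two-job split the poster mentions is typically done with two separate spark-submit invocations that differ only in their resource flags. A sketch is below; the class names, paths, and memory/core values are hypothetical placeholders, not from the original thread:

```shell
# Stage 1: one core per executor, so each task gets the whole executor heap
# while the large non-splittable files are processed.
# (com.example.ProcessSkewedFiles is a hypothetical class name.)
spark-submit \
  --class com.example.ProcessSkewedFiles \
  --executor-memory 20G \
  --executor-cores 1 \
  app.jar input/ intermediate/

# Stage 2: full parallelism for the downstream work on the processed output.
# (com.example.DownstreamJob is a hypothetical class name.)
spark-submit \
  --class com.example.DownstreamJob \
  --executor-memory 20G \
  --executor-cores 8 \
  app.jar intermediate/ output/
```

The trade-off is that intermediate results must be materialized (e.g. to HDFS) between the two jobs instead of staying cached in executor memory.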

