spark-user mailing list archives

From Vadim Semenov <vadim.seme...@datadoghq.com>
Subject Dynamically change executor settings
Date Sat, 27 Aug 2016 02:40:02 GMT
Hi Spark users,

I wonder if it's possible to change executor settings on the fly.
I have the following use case: I have a lot of non-splittable, skewed files
in a custom format that I read using a custom Hadoop RecordReader. The files
range from small to huge, and I'd like to use only one or two cores per
executor while they're being processed (so each task can use the whole
heap). But once they're processed, I'd like to enable all cores.
I know I can achieve this by splitting the work into two separate jobs, but
I wonder if it's possible to somehow get the behavior I described within a
single job.
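
In case it helps frame the question, here's a rough sketch of the two-job
workaround I have in mind. CustomInputFormat and the paths are placeholders
for my format and data; spark.task.cpus and spark.executor.cores are the
standard Spark configs:

import org.apache.hadoop.io.{BytesWritable, LongWritable}
import org.apache.spark.{SparkConf, SparkContext}

// Job 1: submitted with parallelism capped per executor, e.g.
//   spark-submit --conf spark.executor.cores=8 --conf spark.task.cpus=8 ...
// spark.task.cpus is fixed for the lifetime of the SparkContext, which is
// why this currently has to be a separate job.
val sc = new SparkContext(new SparkConf().setAppName("heavy-read"))

// CustomInputFormat is a placeholder for my non-splittable custom format.
val records = sc.newAPIHadoopFile(
  "hdfs:///input/custom-files",
  classOf[CustomInputFormat],
  classOf[LongWritable],
  classOf[BytesWritable])

// Copy out of the reused Writables, then write a splittable intermediate
// so job 2 (run with normal settings, all cores enabled) can process it
// with full parallelism.
records
  .map { case (k, v) => (k.get, v.copyBytes) }
  .saveAsObjectFile("hdfs:///tmp/decoded")
sc.stop()

Setting spark.task.cpus equal to spark.executor.cores makes each executor
run one task at a time; what I'm after is a way to relax that within the
same application once the heavy read is done.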

Thanks!
