How many files are you reading? Are they splittable?
If you have 4 non-splittable files, your dataset will have 4 partitions, and you will see only one task per partition, each handled by one executor.
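As a rough sketch, you can check the partition count and force more parallelism with repartition (the path and the target partition count here are hypothetical; adjust them to your job):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("partition-check").getOrCreate()

// Gzip files are not splittable, so you get one partition per file.
val df = spark.read.text("/data/input/*.gz")

// If this prints 4, only 4 tasks can run in parallel for this stage.
println(df.rdd.getNumPartitions)

// Repartitioning shuffles the data into more partitions,
// so later stages can use more tasks across your executors.
val repartitioned = df.repartition(32)
```

Note that repartition triggers a shuffle; it only helps stages after the initial read.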


On Tue, May 28, 2019 at 10:06 AM Sachit Murarka <connectsachit@gmail.com> wrote:
Hi All,

I am using Spark 2.2.
I have enabled Spark dynamic allocation with 4 executor cores, 4 driver cores, 12 GB executor memory, and 10 GB driver memory.

In the Spark UI, I see only 1 task launched per executor.

Could anyone please help on this?

Kind Regards,
Sachit Murarka