Each partition is translated into one task, and each task runs on one executor. A single executor, however, can process many tasks (sequentially or in parallel, depending on its core count). I may be wrong, and would be grateful if someone could correct me.
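A minimal sketch of what the original question asks for, assuming a Scala Spark shell. The idea is to cap the job at one executor via submit flags and then use `mapPartitionsWithIndex` to do work on only a single partition (partition 0 here is an arbitrary choice); all other partitions return empty iterators, so their tasks finish almost immediately:

```scala
// Launch with a single executor, e.g.:
//   spark-submit --num-executors 1 --executor-cores 1 ...
// (exact flags depend on your cluster manager; these are YARN-style)

val rdd = sc.parallelize(1 to 100000, 500)  // 500 partitions

// Do real work only on partition 0; skip the other 499.
val onePartition = rdd.mapPartitionsWithIndex { (idx, iter) =>
  if (idx == 0) iter.map(_ * 2) else Iterator.empty
}

onePartition.count()  // triggers the job; then the driver can exit
```

Note that Spark still schedules one (trivial, empty) task per partition, so this doesn't avoid launching 500 tasks; it only avoids doing work in 499 of them. To truly touch a single partition's data, something like `sc.runJob(rdd, (iter: Iterator[Int]) => iter.toArray, Seq(0))` restricts the job to the listed partition indices.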


On Wed, Apr 4, 2018 at 8:13 PM, Thodoris Zois <zois@ics.forth.gr> wrote:

Hello list!

I am trying to familiarize myself with Apache Spark. I would like to ask something about partitioning and executors.

Can I have, e.g., 500 partitions but launch only one executor that runs operations on just 1 of the 500 partitions, and then have the job exit?

Is there any easy way, or do I have to modify my code to achieve that?

Thank you,

To unsubscribe e-mail: user-unsubscribe@spark.apache.org