spark-user mailing list archives

From Hao REN <julien19890...@gmail.com>
Subject How to balance task load
Date Mon, 02 Dec 2013 15:42:59 GMT
Hi,

When running some tests on EC2 with Spark, I noticed that the tasks are not
fairly distributed across executors.

For example, a map action produces 4 tasks, but they all go to the same
executor, as shown in the executor summary below:


Executors (3)

   - *Memory:* 0.0 B Used (19.0 GB Total)
   - *Disk:* 0.0 B Used

Executor ID  Address                              RDD blocks  Memory used     Disk used  Active tasks  Failed tasks  Complete tasks  Total tasks
0            ip-10-10-141-143.ec2.internal:52816  0           0.0 B / 6.3 GB  0.0 B      4             0             0               4
1            ip-10-40-38-190.ec2.internal:60314   0           0.0 B / 6.3 GB  0.0 B      0             0             0               0
2            ip-10-62-35-223.ec2.internal:40500   0           0.0 B / 6.3 GB  0.0 B      0             0             0               0
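
For reference, here is a minimal sketch of the kind of job described above, assuming a
small RDD split into 4 partitions; the master URL, app name, and data are placeholders,
not the actual test code:

import org.apache.spark.SparkContext

object TaskLoadSketch {
  def main(args: Array[String]): Unit = {
    // Placeholder master URL -- substitute the actual EC2 cluster master.
    val sc = new SparkContext("spark://<master-host>:7077", "task-load-sketch")

    // 4 partitions, so the map stage below is split into 4 tasks.
    val data = sc.parallelize(1 to 1000, 4)
    val doubled = data.map(_ * 2)

    // The action launches the stage; the scheduler then decides which
    // executors the 4 tasks run on.
    println(doubled.count())

    sc.stop()
  }
}

With a layout like this, all 4 tasks can end up on a single executor rather than being
spread across the 3 available ones, which is the behaviour shown in the table above.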
