spark-dev mailing list archives

From John Fang <>
Subject Why can a task's memory only reach 1/numTasks, and not exceed 1/numTasks, in ExecutionMemoryPool?
Date Tue, 05 Jun 2018 17:13:03 GMT
In fact, not all tasks belong to the same stage, so tasks can differ in how much memory they depend on. For example, suppose an executor is running two tasks (A and B), and the ExecutionMemoryPool owns 1000 MB. We might hope task A could occupy 900 MB and task B only 100 MB, since task A needs much more memory while it is running.
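The behavior being asked about can be sketched as follows. This is a minimal, hypothetical model of the 1/N cap (the names and signature below are illustrative, not Spark's actual ExecutionMemoryPool API): each active task's allocation request is truncated so the task never holds more than poolSize / numActiveTasks.

```scala
// Hypothetical sketch of the per-task cap in an execution memory pool.
// Assumption: the pool enforces an upper bound of poolSize / numActiveTasks
// per task, as in the scenario described above.
object MemoryPoolSketch {
  // Return how many bytes the pool would grant to one task, given the pool
  // size, the number of active tasks, the task's current usage, the bytes
  // requested, and the pool's free memory.
  def grant(poolSize: Long, numActiveTasks: Int, curMem: Long,
            request: Long, freeMem: Long): Long = {
    val maxPerTask = poolSize / numActiveTasks        // 1/N cap per task
    // Truncate the request so this task never exceeds its 1/N share.
    val capped = math.min(request, math.max(maxPerTask - curMem, 0L))
    math.min(capped, freeMem)                          // also bounded by free memory
  }

  def main(args: Array[String]): Unit = {
    // 1000 MB pool, two active tasks: each task is capped at 500 MB, so
    // task A's 900 MB request is truncated even though memory is free.
    val granted = grant(poolSize = 1000, numActiveTasks = 2,
                        curMem = 0, request = 900, freeMem = 1000)
    println(granted) // prints 500
  }
}
```

Under this model, task A is granted only 500 MB of its 900 MB request while task B is active, which is exactly the limitation the question raises.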



John Fang
