spark-user mailing list archives

From abhiguruvayya <>
Subject Executors not utilized properly.
Date Tue, 17 Jun 2014 16:36:59 GMT
I am creating around 10 executors with 12 cores and 7g of memory each, but when I
launch a job not all of the executors are used. For example, if my job has 9
tasks, only 3 executors are used, with 3 tasks each, and I believe this is
making my app slower than a MapReduce program for the same use case. Can
anyone shed some light on executor configuration? How can I use all of
the executors? I am running Spark on YARN with Hadoop 2.4.0.
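For reference, a Spark-on-YARN launch matching the setup described above might look roughly like the following (the master mode, main class, and jar name are placeholders, not taken from the original post):

```shell
# Hypothetical spark-submit invocation for the configuration described above:
# 10 executors, 12 cores and 7g of memory each, running on YARN.
# Class name and jar are placeholders.
spark-submit \
  --master yarn-cluster \
  --num-executors 10 \
  --executor-cores 12 \
  --executor-memory 7g \
  --class com.example.MyJob \
  myjob.jar
```

Note that in Spark the number of concurrently running tasks is bounded by the number of partitions in the stage, so a job with only 9 tasks can occupy at most 9 cores no matter how many executors are allocated; increasing the input's partition count (for example via `rdd.repartition(n)` or the `minPartitions` argument to `sc.textFile`) is one way to spread work across more executors.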
