spark-user mailing list archives

From: Gen <>
Subject: --executor-cores cannot change vcores in yarn?
Date: Sat, 01 Nov 2014 10:15:11 GMT

Maybe this is a stupid question, but I am running Spark on YARN and request
resources with the following command:

./spark-submit --master yarn-client --num-executors <number of executors>
--executor-cores <number of cores per executor> ...
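
For concreteness, an invocation with the placeholders filled in would look
something like this (the executor counts, class name, and jar are just
illustrative values, not my actual job):

    # request 4 executors with 2 cores each; class and jar are placeholders
    ./spark-submit --master yarn-client \
      --num-executors 4 \
      --executor-cores 2 \
      --class org.example.MyApp \
      myapp.jar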
However, after launching the job, I use 'yarn node -status <node-id>' to
monitor the state of the cluster. It shows that the number of VCores used by
each container is always 1, no matter what value I pass via --executor-cores.
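
For reference, the monitoring commands I mean are roughly the following
('<node-id>' stands for whichever node ID 'yarn node -list' reports):

    # list the node IDs known to the ResourceManager
    yarn node -list

    # show the status of one node, including the VCores its containers use
    yarn node -status <node-id>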
Any ideas how to solve this problem? Thanks a lot in advance for your help.

