spark-user mailing list archives

From Gen <gen.tan...@gmail.com>
Subject --executor-cores cannot change vcores in yarn?
Date Sat, 01 Nov 2014 10:15:11 GMT
Hi,

Maybe it is a stupid question, but I am running Spark on YARN and I request the
resources with the following command:
{code}
./spark-submit --master yarn-client --num-executors <number of executors> \
  --executor-cores <cores per executor> ...
{code}
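For illustration, a concrete invocation with hypothetical values (4 executors,
2 cores each), using the SparkPi example class that ships with Spark:
{code}
./spark-submit --master yarn-client \
  --num-executors 4 \
  --executor-cores 2 \
  --class org.apache.spark.examples.SparkPi \
  lib/spark-examples-*.jar 100
{code}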
However, after launching the application, I use "yarn node -status ID" to
monitor the state of the cluster. It shows that the number of VCores used for
each container is always 1, no matter what value I pass to --executor-cores.
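For reference, a minimal sketch of the monitoring commands I run (the node ID
placeholder is illustrative; real IDs come from the -list output):
{code}
# list all NodeManagers known to the ResourceManager
yarn node -list
# print the status report for one node, including container count
# and resource (memory/vcore) usage
yarn node -status <Node-Id>
{code}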
Any ideas how to solve this problem? Thanks a lot in advance for your help.

Cheers
Gen




