spark-user mailing list archives

From Ilya Ganelin <>
Subject Num-executors and executor-cores overwritten by defaults
Date Wed, 22 Oct 2014 04:35:21 GMT
Hi all. I just upgraded our cluster to CDH 5.2 (with Spark 1.1), but now I can
no longer set the number of executors or executor cores. No matter what
values I pass on the command line to Spark, they are overwritten by the
defaults. Does anyone have any idea what could have happened here? Running
on Spark 1.0.2 before, I had no trouble.
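For reference, the invocation I am using looks roughly like this (the jar path, class name, and specific values are placeholders, not my exact job):

```shell
# Hypothetical spark-submit invocation on YARN. --num-executors and
# --executor-cores are the flags that appear to be ignored; on CDH,
# entries in spark-defaults.conf (e.g. spark.executor.cores) would be
# the usual source of competing default values.
spark-submit \
  --master yarn-cluster \
  --num-executors 10 \
  --executor-cores 4 \
  --executor-memory 4g \
  --class com.example.MyApp \
  myapp.jar
```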

Also, I am able to launch the spark-shell without these parameters being
set.