I am using Spark 2.10 with Scala, running in standalone cluster mode.
By default, each worker is able to use all of the physical CPU cores on its node.
I was passing the following parameters to spark-submit:
--conf spark.executor.cores=1 --conf spark.default.parallelism=32
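For reference, here is what I believe is the equivalent configuration set programmatically through SparkConf; the master URL and app name are placeholders I made up:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ParallelismDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("spark://master:7077")       // placeholder master URL
      .setAppName("ParallelismDemo")          // placeholder app name
      .set("spark.executor.cores", "1")       // concurrent task slots per executor
      .set("spark.default.parallelism", "32") // default partition count for shuffles

    val sc = new SparkContext(conf)
    sc.stop()
  }
}
```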
Later, I read that "cores" here doesn't refer to physical CPU cores but rather to the number of tasks an executor can run concurrently.
Regardless, I don't have a clear idea of how to set the number of executors per physical node. I see there is a --num-executors option in YARN mode, but it isn't available in standalone cluster mode.
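From the standalone-mode docs, my understanding is that spark.executor.cores together with spark.cores.max controls this indirectly: a worker with C cores can host up to floor(C / spark.executor.cores) executors of one application. A minimal sketch of what I mean, assuming hypothetical 8-core workers and a placeholder master URL:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ExecutorSizingSketch {
  def main(args: Array[String]): Unit = {
    // Assumed numbers: 8-core workers. With 2 cores per executor, the
    // standalone scheduler can place up to 8 / 2 = 4 executors of this app
    // on each worker, and spark.cores.max = 16 caps the app at
    // 16 / 2 = 8 executors cluster-wide.
    val conf = new SparkConf()
      .setMaster("spark://master:7077")  // placeholder master URL
      .setAppName("ExecutorSizingSketch")
      .set("spark.executor.cores", "2")  // cores claimed by each executor
      .set("spark.cores.max", "16")      // total cores this app may claim

    val sc = new SparkContext(conf)
    sc.stop()
  }
}
```

Is this the intended way to control executors per node in standalone mode, or is there a more direct knob I am missing?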