The core count is equivalent to the number of logical processors.
cat /proc/cpuinfo | grep processor | wc -l
That will tell you how many logical processors you have.
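If you prefer to check this programmatically rather than parsing /proc/cpuinfo, Python's standard library exposes the same number (a sketch; `os.cpu_count()` reports logical processors, which on an SMT/hyper-threaded machine is typically twice the physical core count):

```python
import os

# os.cpu_count() returns the number of logical processors,
# i.e. the same count as the /proc/cpuinfo one-liner above.
logical_cpus = os.cpu_count()
print(logical_cpus)
```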
I gave an explanation for this a while back. As you are running in Standalone mode, this is my take:
Standalone mode: resources are managed by Spark's own resource manager. You start your master and slave/worker processes yourself. As far as I have worked it out, the following applies:

num-executors --> ignored in standalone mode. The number of executors will be the number of workers on each node.
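As a sketch of where those worker resources come from in standalone mode, they are set per node in conf/spark-env.sh (values below are illustrative, not a recommendation):

```shell
# conf/spark-env.sh on each worker node (illustrative values)

# Number of cores this worker advertises to the master;
# this bounds how many concurrent tasks its executor can run.
SPARK_WORKER_CORES=8

# Memory this worker offers to executors on the node.
SPARK_WORKER_MEMORY=16g

# Running more than one worker process per node is the usual way
# to get more than one executor per node in standalone mode.
SPARK_WORKER_INSTANCES=2
```

After editing, the worker processes need to be restarted for the settings to take effect.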
Dr Mich Talebzadeh
On 3 November 2016 at 04:48, map reduced <firstname.lastname@example.org> wrote:

Hi,

I am noticing that when there are N cores per executor, each executor only starts N threads to process the data (so 1 thread per core). Is there a way to run more than N threads, i.e. say N+m threads per executor?

I assigned 7 cores/executor, so I see 7 active tasks at all times, and only 7 threads doing all the work. Is there any way to make it at least 2 threads/core?

P.S.: Long-running Streaming job, Standalone 2.0.0 cluster

Thanks,
KP