spark-user mailing list archives

From 15313776907 <15313776...@163.com>
Subject Re: Core allocation is scattered
Date Fri, 26 Jul 2019 01:35:42 GMT
This may be within your YARN constraints; take a look at your YARN configuration
parameters, for example:
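The two settings that usually cap how many vcores an executor can get are shown in
the yarn-site.xml sketch below; the values are only illustrative, adjust them to your
hardware:

    <!-- yarn-site.xml (illustrative values) -->
    <property>
      <!-- vcores each NodeManager offers to YARN -->
      <name>yarn.nodemanager.resource.cpu-vcores</name>
      <value>16</value>
    </property>
    <property>
      <!-- largest vcore request a single container may make -->
      <name>yarn.scheduler.maximum-allocation-vcores</name>
      <value>16</value>
    </property>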


On 7/25/2019 20:23, Amit Sharma <resolve123@gmail.com> wrote:
I have a cluster with 26 nodes, each with 16 cores. I am running a Spark job with 20 cores,
but I do not understand why my application gets only 1-2 cores on each of several machines.
Why does it not just run on two nodes, e.g. node1=16 cores and node2=4 cores? Instead, cores
are allocated like node1=2, node2=1, ..., node14=1. Is there any conf property I need to change?
I know with dynamic allocation we can use the setting below, but without dynamic allocation is
there anything similar?
--conf "spark.dynamicAllocation.maxExecutors=2"





Thanks
Amit
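
To answer the last question: even without dynamic allocation you can shape the executors
yourself. A minimal spark-submit sketch, assuming a YARN cluster (the numbers and the jar
name are illustrative):

    # Two executors of 10 cores each = 20 cores total, so the job
    # lands on two nodes instead of being scattered across fourteen.
    # Numbers are illustrative; size memory to your nodes.
    spark-submit \
      --master yarn \
      --num-executors 2 \
      --executor-cores 10 \
      --executor-memory 8g \
      your-app.jar

If the cluster is standalone rather than YARN, the master spreads an application's cores
across nodes by default; setting spark.deploy.spreadOut=false on the master makes it
consolidate cores onto as few nodes as possible, combined with spark.cores.max and
spark.executor.cores.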