spark-user mailing list archives

From kant kodali <>
Subject Spark on YARN not utilizing all the YARN containers available
Date Tue, 09 Oct 2018 17:20:36 GMT
Hi All,

I am using Spark 2.3.1 and using YARN as a cluster manager.

My current setup:

1) 6 YARN containers (executors = 6), with 4 executor cores per container.
2) 6 Kafka partitions from one topic.
3) You can assume every other configuration is left at its default value.
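For reference, a spark-submit invocation matching the setup above might look like the following. The executor count and core count come from the description; the deploy mode, application jar, and class name are placeholders, not details from the original message:

```shell
# Hypothetical spark-submit matching the described setup:
# 6 executors on YARN, 4 cores each. Jar and class names are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 6 \
  --executor-cores 4 \
  --class com.example.StreamingApp \
  streaming-app.jar
```

Equivalently, these can be set via `spark.executor.instances=6` and `spark.executor.cores=4` in the Spark configuration.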

I spawned a simple streaming query, and I see all the tasks getting scheduled on a single YARN container. Am I missing any config?

