spark-user mailing list archives

From Dillon Dukek <dillon.du...@placed.com.INVALID>
Subject Re: Spark on YARN not utilizing all the YARN containers available
Date Tue, 09 Oct 2018 19:52:48 GMT
Can you send how you are launching your streaming process? Also, what
environment is this cluster running in (EMR, GCP, self-managed, etc.)?

On Tue, Oct 9, 2018 at 10:21 AM kant kodali <kanth909@gmail.com> wrote:

> Hi All,
>
> I am using Spark 2.3.1 and using YARN as a cluster manager.
>
> I currently got
>
> 1) 6 YARN containers (executors=6) with 4 executor cores per
> container.
> 2) 6 Kafka partitions from one topic.
> 3) You can assume every other configuration is set to whatever the default
> values are.
>
> I spawned a simple streaming query, and I see all the tasks get scheduled
> on one YARN container. Am I missing any config?
>
> Thanks!
>
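[Editor's note: a launch command matching the setup described above might look like the following. This is an illustrative sketch only, not the poster's actual command; the application file name is hypothetical, and it assumes a Structured Streaming job submitted in YARN cluster mode.]

```shell
# Sketch of a spark-submit invocation for the described setup:
# 6 executors (YARN containers), 4 cores each. The app name is a placeholder.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 6 \
  --executor-cores 4 \
  streaming_app.py
```

With 6 Kafka partitions and default settings, each micro-batch typically yields one task per partition, so the tasks can spread across all 6 executors. If they all land on one container, common things to verify are whether YARN actually granted all 6 executors (check the Spark UI's Executors tab, queue limits, and dynamic allocation settings) and whether locality settings such as spark.locality.wait are keeping tasks on one node.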
