spark-user mailing list archives

From Sachit Murarka <connectsac...@gmail.com>
Subject Streaming job taking all executors
Date Sun, 13 Dec 2020 16:36:16 GMT
Hi All,

I am using Standalone Spark.

I am using dynamic resource allocation. Despite setting max executors, min
executors, and initial executors, my streaming job is taking all the
executors available in the cluster. Could anyone please suggest what could
be wrong here?
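
For reference, here is roughly how the allocation properties are set
(a sketch with placeholder values and a hypothetical master URL, not my
exact configuration):

import org.apache.spark.sql.SparkSession

// Dynamic allocation settings in question (placeholder values).
// On standalone, dynamic allocation also needs the external shuffle
// service enabled so executors can be released safely.
val spark = SparkSession.builder()
  .master("spark://master-host:7077") // hypothetical master URL
  .appName("kafka-streaming-job")
  .config("spark.dynamicAllocation.enabled", "true")
  .config("spark.shuffle.service.enabled", "true")
  .config("spark.dynamicAllocation.minExecutors", "2")
  .config("spark.dynamicAllocation.initialExecutors", "4")
  .config("spark.dynamicAllocation.maxExecutors", "10")
  .getOrCreate()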

Please note that the source is Kafka.

I suspect this can be avoided by setting max cores per application. But why
is this happening when max executors is set, and what are the other best
ways to avoid it?
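
Something like the following is what I have in mind for the cores cap
(again a sketch with a placeholder value; on standalone, an application
requests all available cores unless spark.cores.max is set):

import org.apache.spark.sql.SparkSession

// Capping total cores on standalone (placeholder values).
// Without spark.cores.max, the standalone master hands the app every
// available core (spark.deploy.defaultCores is unbounded by default).
val spark = SparkSession.builder()
  .master("spark://master-host:7077") // hypothetical master URL
  .appName("kafka-streaming-job")
  .config("spark.cores.max", "8")      // total cores across the cluster
  .config("spark.executor.cores", "2") // cores per executor
  .getOrCreate()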

Thanks,
Sachit
