spark-user mailing list archives

From Sean Owen <sro...@gmail.com>
Subject Re: Spark DStream application memory leak debugging
Date Sat, 25 Sep 2021 12:28:09 GMT
It could be 'normal' - executors won't GC unless they need to.
It could also be state in your application, if you're storing state.
You'd want to dump the heap to take a first look.
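For example, a first heap dump could be taken with the standard JDK tools on the executor host. This is a hedged sketch, not from the original thread: the PID and file paths are placeholders, and it assumes shell access to the worker node where the executor JVM runs.

```shell
# Find the executor JVM's PID on the worker host
# (Spark executors run as CoarseGrainedExecutorBackend):
jps -lm | grep CoarseGrainedExecutorBackend

# Dump the live heap (replace <executor-pid> with the PID from jps):
jmap -dump:live,format=b,file=/tmp/executor-heap.hprof <executor-pid>

# Or, equivalently, on newer JDKs:
jcmd <executor-pid> GC.heap_dump /tmp/executor-heap.hprof

# Quick look without a full dump: the classes retaining the most memory
jmap -histo:live <executor-pid> | head -n 30
```

The resulting `.hprof` file can then be opened in a heap analyzer such as Eclipse MAT or VisualVM to see which objects are being retained as memory grows.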

On Sat, Sep 25, 2021 at 7:24 AM Kiran Biswal <biswalkiran@gmail.com> wrote:

> Hello Experts
>
> I have a Spark Streaming application (DStream). I use Spark 3.0.2 and Scala
> 2.12. The application reads about 20 different Kafka topics as a
> single stream; I filter the RDD per topic and store the results in Cassandra.
>
> I see a steady increase in executor memory over the hours
> until it reaches the maximum allocated memory, and then it stays at that value.
> No matter how much memory I allocate to the executor, this pattern appears. I
> suspect a memory leak.
>
> Any guidance you can provide on how to debug this will be highly
> appreciated.
>
> Thanks in advance
> Regards
> Kiran
>
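For context, the pattern described in the quoted message (one direct Kafka stream, filtered per topic, written to Cassandra) might look roughly like the following. This is a hypothetical sketch, not the poster's actual code: the topic names, keyspace, table names, and Kafka settings are all placeholder assumptions, using the spark-streaming-kafka-0-10 and spark-cassandra-connector APIs.

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}
import com.datastax.spark.connector.streaming._ // adds saveToCassandra on DStreams

object TopicFanOut {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("topic-fanout")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // Placeholder topic list; the original post mentions about 20 topics.
    val topics = Seq("topic_a", "topic_b")

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "kafka:9092",            // placeholder broker
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "topic-fanout"
    )

    // One direct stream subscribed to all topics at once.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](topics, kafkaParams)
    )

    // Filter the single stream per topic and write each slice to its own
    // Cassandra table ("ks" keyspace and per-topic table names are assumed).
    topics.foreach { t =>
      stream
        .filter(_.topic == t)
        .map(record => (record.key, record.value))
        .saveToCassandra("ks", t)
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

If application state were held anywhere in such a pipeline (broadcast caches, accumulators, long-lived driver-side collections), that would be the kind of place a heap dump could point to.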
