spark-user mailing list archives

From Kiran Biswal <biswalki...@gmail.com>
Subject Spark DStream application memory leak debugging
Date Sat, 25 Sep 2021 09:01:48 GMT
Hello Experts,

I have a Spark Streaming application (DStream) on Spark 3.0.2 with Scala 2.12.
The application reads from about 20 different Kafka topics as a single combined
stream; for each batch I filter the RDD per topic and store the results in
Cassandra.
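For context, here is a simplified sketch of the pattern the job follows. The
topic names, keyspace, table, and column names below are placeholders rather
than my real ones, and the Kafka/Cassandra connection settings are just
examples:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import com.datastax.spark.connector._

object MultiTopicStream {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("multi-topic-dstream")
      .set("spark.cassandra.connection.host", "cassandra") // placeholder host
    val ssc = new StreamingContext(conf, Seconds(30))

    // ~20 topics combined into one direct stream (names are placeholders)
    val topics = Seq("topic_a", "topic_b", "topic_c")
    val kafkaParams = Map[String, Object](
      "bootstrap.servers"  -> "kafka:9092",
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "multi-topic-dstream",
      "auto.offset.reset"  -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))

    stream.foreachRDD { rdd =>
      topics.foreach { topic =>
        // filter the combined RDD down to one topic's records and
        // write them to that topic's Cassandra table
        rdd.filter(_.topic == topic)
          .map(r => (r.key, r.value))
          .saveToCassandra("my_keyspace", s"events_$topic",
            SomeColumns("key", "value"))
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}

The real job has ~20 topics and some per-topic transformations before the
write, but the overall shape is the same as above.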

I see a steady increase in executor memory over the hours until it reaches the
maximum allocated memory, and then it stays at that value. No matter how much
memory I allocate to the executor, the same pattern appears, so I suspect a
memory leak.

Any guidance you can provide on how to debug this would be highly appreciated.

Thanks in advance
Regards
Kiran
