spark-user mailing list archives

From Andrew Milkowski <amgm2...@gmail.com>
Subject freeing up memory occupied by processed Stream Blocks
Date Thu, 19 Jan 2017 18:17:49 GMT
Hello,

I am using Spark 2.0.2, and while running a sample streaming app with Kinesis I noticed (in the admin UI Storage tab) that the "Stream Blocks" count for each worker keeps climbing.

Then also (on the same UI page), in the Blocks section, I see blocks such as

input-0-1484753367056

that are marked as Memory Serialized and do not seem to be "released".
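
For context, the stream is set up along these lines (a minimal sketch only; the app name, stream name, endpoint, and region are placeholders, and I am assuming the MEMORY_ONLY_SER storage level since that is what the UI reports as Memory Serialized):

import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kinesis.KinesisUtils
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream

val conf = new SparkConf().setAppName("kinesis-sample")
val ssc  = new StreamingContext(conf, Seconds(10))

// records received by the Kinesis receiver are stored as "input-*" blocks;
// MEMORY_ONLY_SER is what the Storage tab shows as "Memory Serialized"
val stream = KinesisUtils.createStream(
  ssc, "kinesis-sample-app", "sample-stream",
  "https://kinesis.us-east-1.amazonaws.com", "us-east-1",
  InitialPositionInStream.LATEST, Seconds(10), StorageLevel.MEMORY_ONLY_SER)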

These blocks eventually consume the executors' memory, leading to an out-of-memory
exception on some of them.

Is there a way to "release" these blocks and free them up? The app is just a sample map/reduce (m/r) job.

I attempted rdd.unpersist(false) in the code, but that did not free up the
memory.
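
Roughly, the unpersist call sits inside foreachRDD like this (a simplified sketch continuing the one above; the exact per-batch work does not matter much here):

// per-batch processing where the unpersist attempt lives
stream.foreachRDD { rdd =>
  // the sample m/r work for the batch
  val totalBytes = rdd.map(_.length.toLong).fold(0L)(_ + _)
  println(s"bytes in batch: $totalBytes")
  // manual release attempt; the input-* blocks still show up in the Storage tab
  rdd.unpersist(false)
}

ssc.start()
ssc.awaitTermination()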

Thanks much in advance!
