spark-user mailing list archives

From Dillon Dukek <>
Subject Re: Troubleshooting Spark OOM
Date Wed, 09 Jan 2019 18:55:14 GMT
Hi William,

Just to get started, can you describe the Spark version you are using and
the language? It doesn't sound like you are using PySpark, but problems
arising from it can look different, so I just want to be sure. Also, can
you talk through the scenario under which you are hitting this error, i.e.
the order of operations for the transformations you are performing?

However, if you're set on getting a heap dump, probably the easiest way is
to monitor the active application through the Spark UI, then grab a heap
dump from the executor's Java process when you notice one that's having
problems.
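To address the "locating the dump after a crash" part of your question, one option is to have the executor JVMs write the dump themselves on OOM. A rough sketch (the dump paths, memory size, and jar name below are placeholders, not from your setup):

```shell
# Have each executor JVM dump its heap automatically when it hits an
# OutOfMemoryError. The HeapDumpPath directory must already exist on
# every worker node.
spark-submit \
  --conf "spark.executor.extraJavaOptions=-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/spark-heapdumps" \
  --conf spark.executor.memory=8g \
  your_app.jar

# Alternatively, for a live executor you've spotted misbehaving in the
# Spark UI, take a dump by hand on that worker node (find the PID with
# jps or ps):
jmap -dump:live,format=b,file=/tmp/executor-heap.hprof <executor-pid>
```

For the size concern, a tool like Eclipse MAT indexes the dump once up front, which tends to make even multi-GB .hprof files workable.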

On Wed, Jan 9, 2019 at 10:18 AM William Shen <> wrote:

> Hi there,
> We've encountered Spark executor Java OOM issues for our Spark
> application. Any tips on how to troubleshoot to identify what objects are
> occupying the heap? In the past, dealing with JVM OOM, we've worked with
> analyzing heap dumps, but we are having a hard time with locating Spark
> heap dump after a crash, and we also anticipate that these heap dump will
> be huge (since our nodes have a large memory allocation) and may be
> difficult to analyze locally. Can someone share their experience dealing
> with Spark OOM?
> Thanks!
