spark-user mailing list archives

From William Shen <wills...@marinsoftware.com>
Subject Troubleshooting Spark OOM
Date Wed, 09 Jan 2019 18:18:25 GMT
Hi there,

We've been encountering Java OOM errors in the executors of our Spark
application. Any tips on how to troubleshoot and identify which objects are
occupying the heap? In the past, when dealing with JVM OOM, we've analyzed
heap dumps, but we're having a hard time locating the Spark executor heap
dump after a crash. We also anticipate that these heap dumps will be huge
(since our nodes have a large memory allocation) and may be difficult to
analyze locally. Can someone share their experience dealing with Spark OOM?
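For example, would configuring the executor JVMs along the lines of the
rough sketch below be the right way to capture a heap dump on OOM? The dump
path here is just a placeholder for a directory on the executor hosts.

  import org.apache.spark.SparkConf
  import org.apache.spark.sql.SparkSession

  // Ask each executor JVM to write a heap dump when it runs out of heap.
  // /tmp/executor-heap-dumps is a placeholder path on the executor hosts.
  val conf = new SparkConf()
    .set("spark.executor.extraJavaOptions",
      "-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/executor-heap-dumps")

  val spark = SparkSession.builder()
    .appName("oom-troubleshooting")
    .config(conf)
    .getOrCreate()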

Thanks!
