spark-user mailing list archives

From Michel Hubert <>
Subject submitMissingTasks - serialize throws StackOverflow exception
Date Fri, 27 May 2016 06:55:46 GMT

My Spark application throws a StackOverflowError after running for a while.
The DAGScheduler method submitMissingTasks tries to serialize a tuple (MapPartitionsRDD,
EsSpark..saveToEs), which is handled by a recursive algorithm.
The recursion goes too deep and results in a StackOverflowError.
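The failure mode can be reproduced outside Spark: JVM object serialization recurses one level (several stack frames, in practice) per linked object, so a sufficiently long chain overflows the stack no matter how much heap is available. A minimal sketch (the Node class is hypothetical, not taken from my job):

```java
import java.io.*;

public class DeepSerialization {
    // Each link adds recursion depth during writeObject -- analogous to
    // one lineage step per transformation in a long RDD chain.
    static class Node implements Serializable {
        Node next;
    }

    static Node chain(int length) {
        Node head = null;
        for (int i = 0; i < length; i++) {
            Node n = new Node();
            n.next = head;
            head = n;
        }
        return head;
    }

    // Returns true if the chain serializes, false on StackOverflowError.
    static boolean serializes(Node head) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(head);
            return true;
        } catch (StackOverflowError e) {
            return false;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        // A shallow chain serializes; a very deep one overflows the
        // thread stack at default -Xss, regardless of heap size.
        System.out.println(serializes(chain(1_000)));
        System.out.println(serializes(chain(1_000_000)));
    }
}
```

This suggests the depth of the serialized graph, not the heap, is what matters here.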

Should I just try to increase the heap size, or would the error simply occur later?

How can I fix this?
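If the relevant knob is the thread stack size rather than the heap, something like the following config sketch is what I would try (using Spark's standard extraJavaOptions settings; the -Xss value is illustrative only):

```shell
# Raise the JVM thread stack size on driver and executors,
# since recursion depth is bounded by -Xss, not -Xmx:
spark-submit \
  --conf spark.driver.extraJavaOptions=-Xss16m \
  --conf spark.executor.extraJavaOptions=-Xss16m \
  ...
```

Alternatively, if the deep recursion comes from a long RDD lineage, would truncating it periodically with rdd.checkpoint() be the right approach?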

With kind regards,

