spark-user mailing list archives

From Michel Hubert <mich...@phact.nl>
Subject submitMissingTasks - serialize throws StackOverflow exception
Date Fri, 27 May 2016 06:55:46 GMT
Hi,

My Spark application throws a StackOverflowError after running for a while.
The DAGScheduler method submitMissingTasks tries to serialize a tuple of (MapPartitionsRDD,
EsSpark.saveToEs), and serialization walks the object graph recursively.
The recursion goes too deep and overflows the stack.
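For the archives, a minimal stand-alone illustration of the mechanism described above (this is plain Java serialization, not Spark code; the `Node` class is a stand-in for a long chain of references such as an RDD lineage): default Java serialization recurses once per link in the object graph, so a sufficiently long chain overflows the thread stack regardless of heap size.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.io.UncheckedIOException;

// Illustration only (not Spark code): Java serialization walks object
// graphs recursively, so a long chain of references -- analogous to a
// long RDD lineage -- uses one set of stack frames per link and can
// throw StackOverflowError on a default-sized thread stack.
public class DeepSerialization {
    static class Node implements Serializable {
        final Node next;
        Node(Node next) { this.next = next; }
    }

    // Build a chain of `depth` nodes iteratively, then try to serialize it.
    // Returns true if serialization succeeds, false on StackOverflowError.
    static boolean serializeChain(int depth) {
        Node head = null;
        for (int i = 0; i < depth; i++) {
            head = new Node(head);
        }
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(head); // recurses through every `next` link
            return true;
        } catch (StackOverflowError e) {
            return false;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("shallow chain ok:   " + serializeChain(100));
        System.out.println("deep chain ok:      " + serializeChain(1_000_000));
    }
}
```

Note that the chain is built iteratively, so only serialization (not construction) touches the stack; a larger heap would not help here.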

Should I just try to increase the thread stack size? Or will the overflow just happen again later?

How can I fix this?
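For later readers of this thread: one commonly suggested workaround for this failure mode is to periodically checkpoint the RDD so its lineage is truncated, rather than (or in addition to) raising the stack size with `-Xss` via `spark.driver.extraJavaOptions` / `spark.executor.extraJavaOptions`. A sketch, not the original poster's code; the checkpoint directory, loop count, and interval are placeholder values:

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

// Sketch: truncating a long lineage with checkpointing.
// Requires Spark on the classpath; all concrete values are hypothetical.
public class CheckpointSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("checkpoint-sketch")
                .setMaster("local[*]");   // placeholder master
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // Checkpoint data must go to reliable storage; path is a placeholder.
            sc.setCheckpointDir("/tmp/spark-checkpoints");

            JavaRDD<Integer> rdd = sc.parallelize(Arrays.asList(1, 2, 3));
            for (int i = 1; i <= 500; i++) {
                rdd = rdd.map(x -> x + 1);   // each transformation adds a lineage link
                if (i % 50 == 0) {           // placeholder interval
                    rdd.checkpoint();        // mark for checkpointing
                    rdd.count();             // force materialization, cutting the lineage
                }
            }
            System.out.println(rdd.collect());
        }
    }
}
```

After `checkpoint()` plus an action, the RDD's dependency chain is replaced by a read from the checkpoint directory, so subsequent serialization no longer recurses through the full history.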

With kind regards,
michel



