spark-user mailing list archives

From shyla deshpande <deshpandesh...@gmail.com>
Subject spark job error
Date Tue, 30 Jan 2018 16:52:37 GMT
I am running Zeppelin on EMR with the default settings. I am getting the
following error. Restarting the Zeppelin application fixes the problem.

Which default settings do I need to override to fix this error?

org.apache.spark.SparkException: Job aborted due to stage failure: Task 71
in stage 231.0 failed 4 times, most recent failure: Lost task 71.3 in stage
231.0 Reason: Container killed by YARN for exceeding memory limits. 1.4 GB
of 1.4 GB physical memory used. Consider boosting
spark.yarn.executor.memoryOverhead.
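For reference, a minimal sketch of one common remedy: raise the off-heap
overhead (and the executor heap, if needed) in spark-defaults.conf or the
Zeppelin Spark interpreter settings, then restart the interpreter. The
values below are illustrative, not EMR defaults; on Spark 2.x the overhead
defaults to max(384 MiB, 10% of executor memory):

    # /etc/spark/conf/spark-defaults.conf (sizes in MiB for memoryOverhead)
    spark.yarn.executor.memoryOverhead  1024    # off-heap headroom per executor
    spark.executor.memory               2g      # raise the heap alongside the overhead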

Thanks
