Piotr Kołaczkowski created SPARK-6187:
-----------------------------------------
Summary: Report full executor exceptions to the driver
Key: SPARK-6187
URL: https://issues.apache.org/jira/browse/SPARK-6187
Project: Spark
Issue Type: Improvement
Components: Spark Core
Affects Versions: 1.2.1
Reporter: Piotr Kołaczkowski
If a task fails, the driver seems to report only the top-level exception, without its cause(s). While the full stacktrace can be recovered from the executor's logs, that is quite inconvenient; it would be better to report the full stacktrace, with all of its causes, to the driver application.
Example stacktrace I just got:
{noformat}
org.apache.spark.SparkException: Job aborted due to stage failure: Task 5 in stage 0.0
failed 1 times, most recent failure: Lost task 5.0 in stage 0.0 (TID 5, localhost): java.lang.NoClassDefFoundError:
Could not initialize class org.apache.cassandra.db.Keyspace
at com.datastax.bdp.spark.writer.BulkTableWriter.writeSSTables(BulkTableWriter.scala:194)
at com.datastax.bdp.spark.writer.BulkTableWriter.write(BulkTableWriter.scala:223)
at com.datastax.bdp.spark.writer.BulkTableWriter$BulkSaveRDDFunctions$$anonfun$bulkSaveToCassandra$1.apply(BulkTableWriter.scala:280)
at com.datastax.bdp.spark.writer.BulkTableWriter$BulkSaveRDDFunctions$$anonfun$bulkSaveToCassandra$1.apply(BulkTableWriter.scala:280)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:56)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:200)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
{noformat}
As you can see, this is not very informative: the top-level NoClassDefFoundError is reported, but the root cause of the class initialization failure is lost.
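For illustration, here is a minimal sketch of what "the full stacktrace, with all of its causes" means in practice. Java's printStackTrace(PrintWriter) already walks the whole "Caused by:" chain, so capturing its output is one straightforward way to render everything the executor knows. The fullStackTrace helper and the reconstructed error below are hypothetical, only mimicking the shape of the failure above; this is not a proposed patch.

{code:scala}
import java.io.{PrintWriter, StringWriter}

object FullTraceDemo {
  // Hypothetical helper: printStackTrace(PrintWriter) renders the whole
  // "Caused by:" chain, so capturing its output yields the full trace
  // the driver could report instead of just the top-level message.
  def fullStackTrace(t: Throwable): String = {
    val sw = new StringWriter()
    t.printStackTrace(new PrintWriter(sw, true))
    sw.toString
  }

  def main(args: Array[String]): Unit = {
    // Assumed shape of the failure above: the NoClassDefFoundError
    // hides a root cause from the Keyspace static initializer.
    val root = new ExceptionInInitializerError(
      new RuntimeException("Keyspace static initializer failed"))
    val outer = new NoClassDefFoundError(
      "Could not initialize class org.apache.cassandra.db.Keyspace")
    outer.initCause(root)

    println(outer)                 // roughly what the driver reports today
    println(fullStackTrace(outer)) // includes the "Caused by:" sections
  }
}
{code}

Run against a real task failure, the second println would include the "Caused by:" sections that currently have to be dug out of the executor logs.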