spark-user mailing list archives

From Sumona Routh <>
Subject Spark UI shows finished when job had an error
Date Fri, 17 Jun 2016 13:49:33 GMT
Hi there,
Our Spark job hit an error (specifically, our Cassandra table definition did
not match the actual schema in Cassandra), which threw an exception that was
logged in our spark-submit output.
However, the UI never showed any failed stage or job. It appeared as if the
job had finished without error, which is not correct.

We are trying to define monitoring for our scheduled jobs, and we intended
to use the Spark UI to catch issues. Can anyone explain why the UI would
not report an exception like this? Is there a better approach we should use
for tracking failures in a Spark job?
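To illustrate what we mean by "tracking failures": a minimal sketch of the kind of scheduler-side check we have considered is wrapping the launch command and alerting on its exit status (assuming client deploy mode, where a driver exception propagates to spark-submit's exit code; the helper name `run_and_report` and the job class are hypothetical):

```shell
# run_and_report: hypothetical wrapper that runs a command and surfaces
# a non-zero exit status so the scheduler can alert on it, independently
# of what the Spark UI shows.
run_and_report() {
    "$@"
    status=$?
    if [ "$status" -ne 0 ]; then
        echo "job failed with exit code $status" >&2
    fi
    return "$status"
}

# In a cron entry this would wrap the actual launch, e.g.:
# run_and_report spark-submit --class com.example.OurJob our-job.jar
```

This only covers the case where the driver process itself exits non-zero; it would not catch errors swallowed inside the application, which is part of why we are asking about better approaches.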

We are currently on Spark 1.2 standalone; however, we do intend to upgrade to 1.6.

