spark-user mailing list archives

From Chethan Bhawarlal <>
Subject is there a way to catch exceptions on executor level
Date Thu, 08 Mar 2018 06:06:53 GMT
Hi Dev,

I am performing Spark operations at the RDD level, processing each row like this:

 private def obj(row: org.apache.spark.sql.Row): Put = {
   row.schema.fields.foreach { field =>
     field.dataType match {
       case StringType => // some operation
       case _          => // ... other types
     }
   }
   // ... builds and returns the Put
 }

So when a row contains an empty or garbage value, my code fails, and I am not
able to catch the exceptions because the failures occur on the executors.

Is there a way I can catch these exceptions, accumulate them, and print them
to the driver logs?

Any sample examples would be of great help.
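A minimal sketch of one way to do this, assuming Spark's `CollectionAccumulator` to carry per-row error messages back to the driver. The `df` DataFrame, the `rowErrors` name, and the call to the `obj` helper above are all illustrative placeholders, not a definitive implementation:

```scala
import scala.collection.JavaConverters._

import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.util.CollectionAccumulator

val spark = SparkSession.builder().appName("row-error-demo").getOrCreate()

// Accumulator that collects error messages from all executors.
val rowErrors: CollectionAccumulator[String] =
  spark.sparkContext.collectionAccumulator[String]("rowErrors")

// `df` is a placeholder for the real input DataFrame.
df.rdd.foreach { row =>
  try {
    obj(row) // the per-row processing from the snippet above
  } catch {
    case e: Exception =>
      // Record the failure instead of letting the task die.
      rowErrors.add(s"Row $row failed: ${e.getMessage}")
  }
}

// Back on the driver, after the action has completed:
rowErrors.value.asScala.foreach(println)
```

The accumulator's `value` is only reliable on the driver after the action finishes; catching the exception inside the closure keeps the task alive so the rest of the partition still gets processed.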



