spark-user mailing list archives

From gbop <lij.ta...@gmail.com>
Subject new 1.5.1 behavior - exception on executor throws ClassNotFound on driver
Date Mon, 19 Oct 2015 18:15:53 GMT
I've been struggling with a particularly puzzling issue after upgrading to
Spark 1.5.1 from Spark 1.4.1.

When I use the MySQL JDBC connector and an exception (e.g.
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException) is thrown on the
executor, I get a ClassNotFoundException on the driver, which results in
this error (logs are abbreviated):



In Spark 1.4.1, by contrast, I get the following (logs are abbreviated):



Either I have seriously screwed up somewhere, or this is a change in behavior that I
have not been able to find in the documentation. For those who are
interested, a full repro and logs follow.
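For context, the symptom looks like plain Java serialization at work: an exception class that is visible on the executor's classpath but not to the classloader the driver uses for deserialization will surface on the driver as a ClassNotFoundException. Below is a self-contained sketch of that mechanism; it is not Spark code, all names are hypothetical, and it stands in the MySQL connector jar with a throwable class compiled at runtime into a directory the default classloader cannot see (requires a JDK for the compiler).

```java
import java.io.*;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.*;
import javax.tools.ToolProvider;

public class DriverSideCnfDemo {
    // Serializes an exception that only a child classloader can see, then
    // deserializes it with the default loader, mimicking a driver that lacks
    // the connector jar. Returns a short description of the outcome.
    static String roundTrip() throws Exception {
        // Compile a stand-in for the MySQL exception into a temp dir that is
        // NOT on the application classpath.
        Path dir = Files.createTempDirectory("cnf-demo");
        Path src = dir.resolve("FakeJdbcException.java");
        Files.write(src, java.util.Arrays.asList(
            "public class FakeJdbcException extends RuntimeException {",
            "  public FakeJdbcException(String m) { super(m); }",
            "}"));
        if (ToolProvider.getSystemJavaCompiler()
                .run(null, null, null, src.toString()) != 0) {
            throw new IllegalStateException("compilation failed");
        }

        // "Executor" side: load the class through a child loader and
        // serialize an instance, as a task-failure payload would be.
        try (URLClassLoader executorLoader =
                 new URLClassLoader(new URL[]{dir.toUri().toURL()})) {
            Throwable thrown = (Throwable) Class
                .forName("FakeJdbcException", true, executorLoader)
                .getConstructor(String.class).newInstance("syntax error");
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
                out.writeObject(thrown);
            }

            // "Driver" side: a plain ObjectInputStream resolves classes
            // against the default loader, which cannot see the temp dir.
            try (ObjectInputStream in = new ObjectInputStream(
                     new ByteArrayInputStream(buf.toByteArray()))) {
                return "deserialized: " + in.readObject();
            } catch (ClassNotFoundException e) {
                return "ClassNotFoundException: " + e.getMessage();
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip());
    }
}
```

If something like this is what changed between 1.4.1 and 1.5.1 (e.g. the driver now deserializing the actual exception object rather than a string form), it would explain why the failure appears regardless of deploy mode.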


---

I am running this on Spark 1.5.1 + Hadoop 2.6. I have tried various
combinations of
 * local/standalone mode
 * putting mysql on the classpath with --jars, building a fat jar with mysql
in it, and manually running sc.addJar on the mysql jar
 * --deploy-mode client/--deploy-mode cluster
but nothing seems to change the behavior.
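That none of those options help would be consistent with the driver deserializing the failure payload through a classloader that does not include the --jars entries at all, in which case no amount of classpath tweaking on submit would matter. As a hypothetical sketch (not Spark's actual code), deserialization only succeeds if the stream resolves classes against a loader that can see the jar, e.g. an ObjectInputStream subclass that takes the loader explicitly:

```java
import java.io.*;

// An ObjectInputStream that resolves classes against a caller-supplied
// classloader instead of the default "latest user-defined loader on the
// stack". (Illustrative sketch only.)
public class LoaderAwareObjectInputStream extends ObjectInputStream {
    private final ClassLoader loader;

    public LoaderAwareObjectInputStream(InputStream in, ClassLoader loader)
            throws IOException {
        super(in);
        this.loader = loader;
    }

    @Override
    protected Class<?> resolveClass(ObjectStreamClass desc)
            throws ClassNotFoundException {
        // Try the supplied loader first; fall back to the default
        // resolution (which also handles primitives) if it cannot help.
        try {
            return Class.forName(desc.getName(), false, loader);
        } catch (ClassNotFoundException e) {
            try {
                return super.resolveClass(desc);
            } catch (IOException io) {
                throw new ClassNotFoundException(desc.getName(), io);
            }
        }
    }
}
```

Whether the driver actually uses something like this (and with which loader) is the question; if it resolves against the wrong loader, client vs. cluster deploy mode would make no difference, which matches what I see.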



Here is an example invocation, and the accompanying source code:




The source code:



And the build.sbt:




And here are the results when run against Spark 1.4.1 (build.sbt has been
updated accordingly):





--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/new-1-5-1-behavior-exception-on-executor-throws-ClassNotFound-on-driver-tp25124.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

