spark-user mailing list archives

From: Jack Arenas <...@ckarenas.com>
Subject: JDBC DF using DB2
Date: Mon, 23 Mar 2015 19:34:25 GMT
Hi Team, 
 
I’m trying to create a DF using JDBC as detailed here. I’m currently using DB2 v9.7.0.6,
and I’ve tried the db2jcc.jar and db2jcc_license_cu.jar combo; while it works with
--master local using the command below, I get some strange behavior with --master yarn-client.
Here is the command:
 
val df = sql.load("jdbc", Map(
  "url" -> "jdbc:db2://<host>:<port>/<db>:currentSchema=<schema>;user=<user>;password=<password>;",
  "driver" -> "com.ibm.db2.jcc.DB2Driver",
  "dbtable" -> "<table>"))
 
At first it also seems to work on yarn-client, because once the command is executed I get the following log:
df: org.apache.spark.sql.DataFrame = [DATE_FIELD: date, INT_FIELD: int, DOUBLE_FIELD: double]
 
Which indicates to me that Spark was able to connect to the DB. But once I run df.count() or
df.take(5).foreach(println) to actually operate on the data and get a result, I get back a
‘No suitable driver found’ exception, which makes me think the driver wasn’t shipped
with the Spark job.
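
In case it helps narrow this down, here is a quick sanity check I can run from the spark-shell
to see whether the executors can actually load the driver class (just a sketch; it assumes
nothing beyond the class name from the command above):

  sc.parallelize(1 to 100, 10).map { _ =>
    // Runs as tasks on the executors: try to load the DB2 driver class there
    try { Class.forName("com.ibm.db2.jcc.DB2Driver"); "found" }
    catch { case _: ClassNotFoundException => "missing" }
  }.distinct().collect().foreach(println)

If that prints "missing", the jars really aren’t on the executors’ classpath at all; if it
prints "found" but the exception persists, it’s presumably the DriverManager visibility issue
described in the docs quote below.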
 
I’ve tried using --driver-class-path, --jars, and SPARK_CLASSPATH to add the jars to the Spark
job. I also have the jars in my $CLASSPATH and $HADOOP_CLASSPATH.
 
I also saw the note below in the troubleshooting section, but quite frankly I’m not sure what
“primordial class loader” it’s talking about:
 
The JDBC driver class must be visible to the primordial class loader on the client session
and on all executors. This is because Java’s DriverManager class does a security check that
results in it ignoring all drivers not visible to the primordial class loader when one goes
to open a connection. One convenient way to do this is to modify compute_classpath.sh on all
worker nodes to include your driver JARs.
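
If I’m reading that right, DriverManager only trusts drivers sitting on the executor JVM’s own
launch classpath, not jars added later through Spark’s class loader. So besides editing
compute_classpath.sh, would something like this work? (A sketch only: the /opt/db2/... paths
are made up, and the jars would have to sit at the same path on every worker node, since
extraClassPath just references them rather than shipping them.)

  import org.apache.spark.{SparkConf, SparkContext}

  // Sketch: put the DB2 jars on the executors' launch classpath so the
  // system class loader (what the docs call "primordial") can see them.
  // Paths are hypothetical and must already exist on each worker node.
  val conf = new SparkConf()
    .setAppName("db2-jdbc-test")
    .set("spark.executor.extraClassPath",
      "/opt/db2/db2jcc.jar:/opt/db2/db2jcc_license_cu.jar")
  val sc = new SparkContext(conf)

As far as I can tell this amounts to the same thing as the compute_classpath.sh edit the docs
suggest, since both end up on the classpath the executor JVM is launched with.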

Any advice is welcome!
 
Thanks,
Jack
 
 
 