spark-dev mailing list archives

From Devl Devel <devl.developm...@gmail.com>
Subject Recreating JIRA SPARK-8142
Date Tue, 09 Jun 2015 09:28:49 GMT
Hi All

We are having some trouble with:

sparkConf.set("spark.driver.userClassPathFirst","true");
sparkConf.set("spark.executor.userClassPathFirst","true");

and would appreciate some independent verification. The issue comes down to
this:

Spark 1.3.1 (Hadoop 2.6) is deployed on the cluster. In my application code
I use Maven to bring in:

hadoop-common 2.6.0 - provided
hadoop-client 2.6.0 - provided
hadoop-hdfs 2.6.0 - provided
spark-sql_2.10 - provided
spark-core_2.10 - provided
hbase-client 1.1.0 - included/packaged
hbase-protocol 1.1.0 - included/packaged
hbase-server 1.1.0 - included/packaged
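For reference, the scopes above correspond to a pom.xml dependencies section along these lines (a sketch with representative entries only, not my full POM; group IDs are the usual Apache ones):

```xml
<dependencies>
  <!-- Supplied by the cluster at runtime: compile against these,
       but do not package them into the application jar. -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.3.1</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.6.0</version>
    <scope>provided</scope>
  </dependency>
  <!-- Packaged into the application jar (default "compile" scope). -->
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-server</artifactId>
    <version>1.1.0</version>
  </dependency>
</dependencies>
```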

When I set userClassPathFirst* to true I get a ClassCastException; full
details are in

https://issues.apache.org/jira/browse/SPARK-8142
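For anyone unfamiliar with why a classpath-ordering flag would produce a ClassCastException: a class loaded by two different class loaders is two distinct runtime types, even when the bytes are identical, so a cast across the two fails. A minimal stand-alone demonstration of that mechanism (nothing Spark-specific; LoaderDemo is just an illustrative class name):

```java
import java.net.URL;
import java.net.URLClassLoader;

// Demonstrates the failure mode behind userClassPathFirst conflicts:
// the same class, loaded again by a second ("child-first") loader,
// is not interchangeable with the original.
public class LoaderDemo {
    public static void main(String[] args) throws Exception {
        // The classpath entry this class was loaded from.
        URL here = LoaderDemo.class.getProtectionDomain()
                                   .getCodeSource().getLocation();
        // A loader with no application parent: it re-reads LoaderDemo
        // from disk instead of delegating upward, which is roughly what
        // a user-classpath-first loader does for user-supplied classes.
        try (URLClassLoader childFirst =
                 new URLClassLoader(new URL[]{here}, null)) {
            Class<?> copy = childFirst.loadClass("LoaderDemo");
            Object o = copy.getDeclaredConstructor().newInstance();
            // Same name, different defining loader => different type,
            // so a cast to LoaderDemo would throw ClassCastException.
            System.out.println("same class object? "
                    + (copy == LoaderDemo.class));
            System.out.println("instanceof LoaderDemo? "
                    + (o instanceof LoaderDemo));
        }
    }
}
```

Both lines print false: the copy is loaded by a different loader, so the JVM treats it as an unrelated type.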

Can someone help verify this? I.e., if you have Spark 1.3.1, Hadoop, and
HBase, can you create a simple Spark job, say one that reads an HBase table
into an RDD? Then set the Spark and Hadoop dependencies above as "provided"
and set:

sparkConf.set("spark.driver.userClassPathFirst","true");
sparkConf.set("spark.executor.userClassPathFirst","true");

and repeat the job?
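A minimal sketch of the kind of job I mean, using the standard TableInputFormat route to read an HBase table into an RDD ("test_table" is a placeholder; point it at any table on your cluster, and note this needs a running Spark/HBase cluster to execute):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

// Sketch: read an HBase table into an RDD and count the rows.
public class HBaseReadJob {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf().setAppName("hbase-read-test");
        // The settings under test:
        sparkConf.set("spark.driver.userClassPathFirst", "true");
        sparkConf.set("spark.executor.userClassPathFirst", "true");

        try (JavaSparkContext sc = new JavaSparkContext(sparkConf)) {
            Configuration hbaseConf = HBaseConfiguration.create();
            hbaseConf.set(TableInputFormat.INPUT_TABLE, "test_table");

            JavaPairRDD<ImmutableBytesWritable, Result> rows =
                sc.newAPIHadoopRDD(hbaseConf, TableInputFormat.class,
                                   ImmutableBytesWritable.class,
                                   Result.class);
            System.out.println("row count: " + rows.count());
        }
    }
}
```

Running it once with the two userClassPathFirst lines commented out and once with them in should show whether the flags alone trigger the exception.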

Do you get the same exception as in the JIRA, or missing classes, or even
run into this: https://issues.apache.org/jira/browse/SPARK-1867?

Please comment on the JIRA; it would be useful to have a second
verification of this. I know the userClassPathFirst* options are
experimental, but it's good to know what's going on.

Cheers
Devl
