spark-user mailing list archives

From: Marcelo Vanzin <van...@cloudera.com>
Subject: Re: Why add --driver-class-path jdbc.jar works and --jars not? (1.6.1)
Date: Wed, 05 Oct 2016 16:20:43 GMT
Many (all?) JDBC drivers need to be in the system classpath. --jars
places them in an app-specific class loader, so it doesn't work.
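
For example, something along these lines (paths and the main class are
placeholders, not the exact command quoted below) keeps the JDBC driver on
the driver's system classpath:

  # placeholder paths; the JDBC jar goes on the system classpath, not the app class loader
  ./bin/spark-submit \
    --master "local[4]" \
    --class com.example.YourApp \
    --driver-class-path /path/to/postgresql-jdbc.jar \
    /path/to/your-app.jar

The same can be set with spark.driver.extraClassPath in spark-defaults.conf
(and spark.executor.extraClassPath if the executors need the driver too).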

On Wed, Oct 5, 2016 at 3:32 AM, Chanh Le <giaosudau@gmail.com> wrote:
> Hi everyone,
> I was just wondering why, when I run my program, I need to add jdbc.jar to
> --driver-class-path instead of treating it like a dependency via --jars.
>
> My program works with this config:
> ./bin/spark-submit \
>   --packages org.apache.spark:spark-streaming-kafka_2.10:1.6.1 \
>   --master "local[4]" \
>   --class com.ants.util.kafka.PersistenceData \
>   --driver-class-path /Users/giaosudau/Downloads/postgresql-9.3-1102.jdbc41.jar \
>   /Users/giaosudau/workspace/KafkaJobs/target/scala-2.10/kafkajobs-prod.jar
>
> According to http://stackoverflow.com/a/30947090/523075 and
> http://stackoverflow.com/a/31012955/523075, this is a bug related to the
> classloader.
>
> I checked that https://github.com/apache/spark/pull/6900 was merged.
>
> I am using Spark 1.6.1, and the issue says it was already fixed in 1.4.1 and 1.5.
>
>
> Regards,
> Chanh



-- 
Marcelo


