spark-user mailing list archives

From Artemis User <arte...@dtechspace.com>
Subject Re: mysql connector java issue
Date Fri, 11 Dec 2020 14:57:35 GMT
Well, this just won't work when you are running Spark on Hadoop...
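
On a Hadoop/YARN cluster the executors run on other nodes, so a jar that only sits on the submitting machine's classpath never reaches them.  As a rough sketch, reusing the jar path and application jar from the original post, passing the driver with --jars should make it available to both the driver and the executors:

    # adapted from the original command; --jars replaces --driver-class-path
    spark-submit --master yarn \
      --jars /home/node2/Téléchargements/mysql-connector-java-5.1.24-bin.jar \
      ./Sdatahub-assembly-0.1.jar

--jars also accepts an hdfs:// URI, so the jar can be staged on HDFS once and referenced from there instead of being shipped from local disk on every submit.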

On 12/10/20 9:14 PM, lec ssmi wrote:
> If you cannot assemble the JDBC driver jar into your application jar
> package, you can put the driver jar on the Spark classpath instead,
> generally $SPARK_HOME/jars or $SPARK_HOME/lib.
>
>
> Artemis User <artemis@dtechspace.com> wrote on Friday, December 11, 2020
> at 5:21 AM:
>
>     What happened was that you made the mysql jar file available only
>     to the Spark driver, not to the executors.  Use the --jars
>     parameter instead of --driver-class-path to specify your
>     third-party jar files, or copy the third-party jars to the Spark
>     jars directory on your HDFS and specify that HDFS path with
>     --archives in spark-submit.
>
>     -- ND
>
>     On 12/10/20 10:02 AM, ismail elhammoud wrote:
>>     Hello,
>>
>>     Guys, I have an issue with mysql-connector-java: even though I
>>     declared it in the sbt file, it doesn't work unless I give the
>>     whole path:
>>
>>     spark-submit --master yarn --driver-class-path
>>     /home/node2/Téléchargements/mysql-connector-java-5.1.24-bin.jar
>>     ./Sdatahub-assembly-0.1.jar
>>
>>
>>     Regards,
>>     Isma
>
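
For completeness, what lec ssmi describes above amounts to copying the driver jar into Spark's own jars directory on the machine(s) running Spark, e.g. (assuming a typical $SPARK_HOME layout and the same jar file as above):

    # hypothetical local copy into the Spark install's jars directory
    cp /home/node2/Téléchargements/mysql-connector-java-5.1.24-bin.jar $SPARK_HOME/jars/

That works when every machine that runs a Spark process has the jar in its local Spark installation; as noted above, on a Hadoop/YARN cluster the --jars (or HDFS) route is the more reliable option.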
