spark-user mailing list archives

From <spark....@yahoo.com.INVALID>
Subject Re: Why can't Spark 1.6.0 use jar files stored on HDFS?
Date Tue, 17 May 2016 13:41:38 GMT
Hi Serega,
Create a jar including all the dependencies and execute it through a shell script like the one below:

# /usr/local/spark/bin/spark-submit   : location of your spark-submit
# --class                             : fully qualified name of your main class
# /home/hadoop/SparkSampleProgram.jar : location of your application jar
/usr/local/spark/bin/spark-submit \
  --class classname \
  --master yarn \
  --deploy-mode cluster \
  /home/hadoop/SparkSampleProgram.jar
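
For the "jar including all the dependencies" part, here is a minimal sketch of one way to build such a fat (assembly) jar, assuming an sbt project with the sbt-assembly plugin; the thread does not say which build tool is used, and the project name, versions and jar name below are placeholders, so adjust them to your setup.

// project/plugins.sbt  (assumed sbt-assembly version)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

// build.sbt
name := "SparkSampleProgram"
version := "1.0"
// Spark 1.6.x prebuilt distributions default to Scala 2.10; use 2.11 if your cluster does
scalaVersion := "2.10.6"

// Spark itself is already on the cluster, so mark it "provided";
// all other application dependencies get bundled into the assembly jar.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"

// name of the fat jar that "sbt assembly" writes to target/scala-2.10/
assemblyJarName in assembly := "SparkSampleProgram.jar"

Running "sbt assembly" then produces target/scala-2.10/SparkSampleProgram.jar, which is the jar you pass to spark-submit as the last argument above.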

Thanks
Raj
 


    On Tuesday, May 17, 2016 6:03 PM, Serega Sheypak <serega.sheypak@gmail.com> wrote:
 

 hi, I'm trying to:
1. upload my app jar files to HDFS
2. run spark-submit with:
   2.1. --master yarn --deploy-mode cluster
   or
   2.2. --master yarn --deploy-mode client
specifying --jars hdfs:///my/home/commons.jar,hdfs:///my/home/super.jar

When the spark job is submitted, the SparkSubmit client outputs:
Warning: Skip remote jar hdfs:///user/baba/lib/akka-slf4j_2.11-2.3.11.jar
...

and then the spark application's main class fails with a class not found exception. Is there any workaround?

  