So the only way I could make this work was to build a fat jar, as suggested earlier. To me (and I am no expert) this looks like a bug: everything worked for me prior to our upgrade to Spark 1.1 on Hadoop 2.2, but now my old workflow fails, i.e. packaging my jars locally, pushing them out to the cluster, and pointing the job at the corresponding dependent jars.
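For reference, the fat-jar workaround looks roughly like this with the sbt-assembly plugin. This is a sketch from my setup, not a definitive recipe: the plugin version, Spark version, and dependency list are examples and will need adjusting for your build.

```scala
// project/plugins.sbt — add the sbt-assembly plugin (version is an example
// from the sbt 0.13-era plugin line):
//   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

// build.sbt — mark Spark itself as "provided" so it is not bundled
// (the cluster supplies it), and bundle the remaining dependencies
// (e.g. joda-convert) into the single assembly jar.
import AssemblyKeys._

assemblySettings

name := "my-spark-lib"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.1.0" % "provided",
  "org.joda"          % "joda-convert" % "1.2"
)
```

Running `sbt assembly` then produces one jar under `target/scala-2.10/` that you can pass to spark-submit without any `--jars` flags.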
Sorry I cannot be more help!
On Tue, Oct 14, 2014 at 4:59 AM, Christophe Préaud <email@example.com> wrote:
I have already posted a message with the exact same problem, and proposed a patch (the subject is "Application failure in yarn-cluster mode").
Can you test it, and see if it works for you?
I would also be glad if someone could confirm that it is a bug in Spark 1.1.0.
On 14/10/2014 03:15, Jimmy McErlain wrote:
BTW this has always worked for me before, until we upgraded the cluster to Spark 1.1.1...
On Mon, Oct 13, 2014 at 5:39 PM, HARIPRIYA AYYALASOMAYAJULA <firstname.lastname@example.org> wrote:
Can you check if the jar file is available in the target/scala-2.10 folder?
When you use `sbt package` to build the jar file, that is where it will be located.
The following command works well for me:
spark-submit --class "Classname" --master yarn-cluster jarfile(with complete path)
Can you try this first, and add the other options once it works?
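To expand on that suggestion, a minimal check might look like the following. The class name and paths are placeholders for your own; the idea is just to confirm the jar exists and submits cleanly before layering on the extra flags.

```shell
# Confirm the jar that `sbt package` produced actually exists
ls -l target/scala-2.10/*.jar

# Submit with only the essentials first; add --jars, --driver-memory,
# --executor-memory, etc. back one at a time once this succeeds.
# (com.example.MyEngine and the jar path are placeholders.)
spark-submit \
  --class com.example.MyEngine \
  --master yarn-cluster \
  target/scala-2.10/my-spark-lib_1.0.jar
```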
On Mon, Oct 13, 2014 at 7:36 PM, Jimmy <email@example.com> wrote:
Having the exact same error with the exact same jar.... Do you work for Altiscale? :)
Sent from my iPhone
On Oct 13, 2014, at 5:33 PM, Andy Srine <firstname.lastname@example.org> wrote:
Spark rookie here. I am getting a file-not-found exception on the jar passed via --jars. This is in yarn-cluster mode, and I am running the following command on our recently upgraded Spark 1.1.1 environment.
./bin/spark-submit --verbose --master yarn --deploy-mode cluster --class myEngine --driver-memory 1g --driver-library-path /hadoop/share/hadoop/mapreduce/lib/hadoop-lzo-0.4.18-201406111750.jar --executor-memory 5g --executor-cores 5 --jars /home/andy/spark/lib/joda-convert-1.2.jar --queue default --num-executors 4 /home/andy/spark/lib/my-spark-lib_1.0.jar
This is the error I am hitting. Any tips would be much appreciated. The file permissions look fine on my local disk.
14/10/13 22:49:39 INFO yarn.ApplicationMaster: Unregistering ApplicationMaster with FAILED
14/10/13 22:49:39 INFO impl.AMRMClientImpl: Waiting for application to be successfully unregistered.
Exception in thread "Driver" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 1.0 failed 4 times, most recent failure: Lost task 3.3 in stage 1.0 (TID 12, 122-67.vb2.company.com): java.io.FileNotFoundException: ./joda-convert-1.2.jar (Permission denied)