spark-user mailing list archives

From: Corey Nolet <cjno...@gmail.com>
Subject: Re: Submitting spark jobs through yarn-client
Date: Fri, 02 Jan 2015 22:02:51 GMT
Looking a little closer at the launch_container.sh file, it appears to add a
$PWD/__app__.jar entry to the classpath, but there is no __app__.jar in the
directory $PWD points to. Any ideas?
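One way to confirm the mismatch is to diff the jars the launch script references against what was actually localized into the container directory. A minimal self-contained sketch (the CLASSPATH line below is a mock standing in for a real launch_container.sh, and the missing/present jars are chosen to mirror the symptom above, not copied from an actual node manager):

```shell
mkdir -p /tmp/container_demo && cd /tmp/container_demo

# Mock of the CLASSPATH export a launch_container.sh typically contains.
cat > launch_container.sh <<'EOF'
export CLASSPATH=$PWD:$PWD/__app__.jar:$PWD/__spark__.jar:$HADOOP_CONF_DIR
EOF

# Simulate a container dir where only the Spark assembly was localized.
touch __spark__.jar

# List every $PWD-relative jar the launcher references and flag absent ones.
grep -o '\$PWD/[^:]*\.jar' launch_container.sh | sed 's|\$PWD/||' |
while read jar; do
  [ -e "$jar" ] && echo "present: $jar" || echo "MISSING: $jar"
done
# -> MISSING: __app__.jar
# -> present: __spark__.jar
```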

On Fri, Jan 2, 2015 at 4:20 PM, Corey Nolet <cjnolet@gmail.com> wrote:

> I'm trying to get a SparkContext going in a web container which is being
> submitted through yarn-client. I'm trying two different approaches, and both
> seem to result in the same error from the YARN node managers:
>
> 1) I'm newing up a SparkContext directly, manually adding all the lib jars
> from Spark and Hadoop via the setJars() method on the SparkConf.
>
> 2) I'm using SparkSubmit.main() to pass the class name and the jar containing
> my code.
>
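> A rough sketch of what approach 2 amounts to (Spark 1.x era): calling
> SparkSubmit.main() programmatically with the same arguments the spark-submit
> script would pass. The jar path and main class below are placeholders, not
> taken from my setup:
>
> ```shell
> # Sketch only; requires a Spark install and a running YARN cluster.
> spark-submit \
>   --master yarn-client \
>   --class com.example.MyDriver \
>   /path/to/my-app.jar
> ```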
>
> When YARN tries to create the container, I get an exception in the driver:
> "Yarn application already ended, might be killed or not able to launch
> application master". When I look into the logs for the node manager, I see
> "NoClassDefFoundError: org/apache/spark/Logging".
>
> Looking closer at the contents of the node managers, I see that the Spark
> yarn jar was renamed to __spark__.jar and placed in the app cache, while the
> rest of the libraries I specified via setJars() were all placed in the file
> cache. Any ideas as to what may be happening? I even tried adding the
> spark-core dependency and uber-jarring my own classes so that the
> dependencies would be there when YARN tries to create the container.
>
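For reference, in Spark 1.x the assembly that YARN localizes (and links as __spark__.jar) is controlled by the spark.yarn.jar property; if it is unset or points at the wrong jar, Spark classes such as org.apache.spark.Logging won't be on the container classpath. A hedged config sketch — the HDFS path below is an example, not from this thread:

```
# spark-defaults.conf (example path; point at your actual Spark assembly jar)
spark.yarn.jar  hdfs:///user/spark/lib/spark-assembly-1.2.0-hadoop2.4.0.jar
```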
