spark-user mailing list archives

From Jacek Laskowski <ja...@japila.pl>
Subject Re: Run spark 2.2 on yarn as usual java application
Date Mon, 19 Mar 2018 06:16:00 GMT
Hi,

What's the deployment process then (if not using spark-submit)? How is the
AM deployed? Why would you want to skip spark-submit?

Jacek

On 19 Mar 2018 00:20, "Serega Sheypak" <serega.sheypak@gmail.com> wrote:

> Hi, is it even possible to run Spark on YARN as a plain Java application?
> I've built a jar using Maven with the spark-yarn dependency, and I manually
> populate SparkConf with all the Hadoop properties.
> SparkContext fails to start with an exception:
>
> Caused by: java.lang.IllegalStateException: Library directory
> '/hadoop/yarn/local/usercache/root/appcache/application_1521375636129_0022/container_e06_1521375636129_0022_01_000002/assembly/target/scala-2.11/jars'
> does not exist; make sure Spark is built.
>     at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:260)
>     at org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:359)
>     at org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)
>
>
> I took a look at the code, and it has some hard-coded paths and checks for
> a specific file layout. I don't follow why :)
> Is it possible to bypass such checks?
>
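For context, the check that throws here (`CommandBuilderUtils.findJarsDir`) looks for a local Spark jars directory, which only exists in a full Spark distribution. One way to avoid it is to point YARN at the Spark jars explicitly via `spark.yarn.jars`, so the launcher does not need to locate them on the local filesystem. A minimal sketch of launching programmatically in yarn-client mode, assuming the Spark jars have already been uploaded to HDFS (the HDFS path and app name below are hypothetical, and `HADOOP_CONF_DIR` must be visible to the JVM):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object YarnClientApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("yarn")
      .setAppName("programmatic-yarn-app")          // hypothetical app name
      .set("spark.submit.deployMode", "client")
      // Point the launcher at jars on HDFS instead of a local Spark install;
      // this sidesteps the local jars-directory check in the stack trace above.
      .set("spark.yarn.jars", "hdfs:///apps/spark/jars/*.jar")  // hypothetical path

    val sc = new SparkContext(conf)
    try {
      // Trivial job to confirm executors come up on YARN.
      println(sc.parallelize(1 to 100).sum())
    } finally {
      sc.stop()
    }
  }
}
```

Alternatively, `spark.yarn.archive` (an archive of the jars on HDFS) serves the same purpose, and the `SparkLauncher` API is the supported way to start Spark apps from Java code without invoking the spark-submit script directly.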
