spark-user mailing list archives

From Serega Sheypak <serega.shey...@gmail.com>
Subject Run Spark 2.2 on YARN as a plain Java application
Date Sun, 18 Mar 2018 23:19:39 GMT
Hi, is it even possible to run Spark on YARN as a plain Java application?
I've built a jar using Maven with the spark-yarn dependency, and I manually
populate SparkConf with all the Hadoop properties.
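Roughly, the driver looks like this (a simplified sketch; the host names and
property values below are placeholders for my real ones):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class YarnFromPlainJava {
        public static void main(String[] args) {
            // No spark-submit involved: everything, including the Hadoop/YARN
            // client settings, is pushed into SparkConf programmatically.
            SparkConf conf = new SparkConf()
                .setAppName("yarn-from-plain-java")
                .setMaster("yarn")
                .set("spark.submit.deployMode", "client")
                // spark.hadoop.* entries are copied into the Hadoop Configuration,
                // standing in for what HADOOP_CONF_DIR would normally provide.
                .set("spark.hadoop.fs.defaultFS", "hdfs://namenode:8020")
                .set("spark.hadoop.yarn.resourcemanager.address", "rm-host:8032");

            // Fails during startup with the IllegalStateException below.
            JavaSparkContext sc = new JavaSparkContext(conf);
            sc.stop();
        }
    }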
SparkContext fails to start with this exception:

    Caused by: java.lang.IllegalStateException: Library directory
    '/hadoop/yarn/local/usercache/root/appcache/application_1521375636129_0022/container_e06_1521375636129_0022_01_000002/assembly/target/scala-2.11/jars'
    does not exist; make sure Spark is built.
        at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:260)
        at org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:359)
        at org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)


I took a look at the code, and it has some hardcoded paths and checks for a
specific file layout. I don't follow why :)
Is it possible to bypass such checks?
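From reading the YARN client code, the local jars-directory lookup seems to be
only a fallback used when neither spark.yarn.jars nor spark.yarn.archive is
set, so I'm wondering whether setting one of them is enough to skip it.
Something like this is what I plan to try (untested; the HDFS paths are
placeholders, and the jars would have to be uploaded there first):

    // Untested idea: point Spark at jars already uploaded to HDFS so the
    // launcher never has to resolve a local assembly/.../jars directory.
    conf.set("spark.yarn.jars", "hdfs:///apps/spark/2.2.0/jars/*.jar");
    // or, alternatively, a single archive with the same content:
    // conf.set("spark.yarn.archive", "hdfs:///apps/spark/spark-2.2.0-jars.zip");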
