Hi, is it even possible to run Spark on YARN as a regular Java application?
I've built a jar using Maven with the spark-yarn dependency, and I manually populate SparkConf with all the Hadoop properties.
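The setup looks roughly like this (a minimal sketch; the host names, ports, and app name are placeholders for my cluster):

```java
import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class YarnLauncher {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
            .setMaster("yarn")
            .setAppName("plain-java-on-yarn")
            // spark.hadoop.* entries get copied into the Hadoop Configuration;
            // the addresses below are placeholders for my cluster.
            .set("spark.hadoop.fs.defaultFS", "hdfs://namenode:8020")
            .set("spark.hadoop.yarn.resourcemanager.address", "rm-host:8032")
            .set("spark.hadoop.yarn.resourcemanager.scheduler.address", "rm-host:8030");

        JavaSparkContext sc = new JavaSparkContext(conf); // fails here, see the trace below
        sc.parallelize(Arrays.asList(1, 2, 3)).count();
        sc.stop();
    }
}
```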
SparkContext fails to start with the following exception:
  Caused by: java.lang.IllegalStateException: Library directory '/hadoop/yarn/local/usercache/root/appcache/application_1521375636129_0022/container_e06_1521375636129_0022_01_000002/assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.
      at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:260)
      at org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:359)
      at org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)

I took a look at the code, and it has some hardcoded paths and checks for a specific file layout. I don't follow why :)
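For reference, the failing check looks roughly like this (paraphrased from memory from CommandBuilderUtils.findJarsDir in the Spark 2.x launcher, not verbatim; the exact code may differ by version):

```java
// Paraphrase of the check my stack trace hits: it expects either a packaged
// Spark distribution under SPARK_HOME, or a built source checkout.
static String findJarsDir(String sparkHome, String scalaVersion, boolean failIfNotFound) {
    File libdir = new File(sparkHome, "jars");  // packaged-distribution layout
    if (!libdir.isDirectory()) {
        // Fallback to the dev-build layout -- the exact path from my stack trace.
        libdir = new File(sparkHome,
            String.format("assembly/target/scala-%s/jars", scalaVersion));
        checkState(!failIfNotFound || libdir.isDirectory(),
            "Library directory '%s' does not exist; make sure Spark is built.",
            libdir.getAbsolutePath());
    }
    return libdir.getAbsolutePath();
}
```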
Is it possible to bypass such checks?
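From skimming the YARN Client code, it looks like this path is only taken when neither spark.yarn.jars nor spark.yarn.archive is set, so my best guess at a bypass is something like this (untested; the HDFS paths are placeholders):

```java
// Untested guess: point Spark at jars already on HDFS so the client never
// tries to locate a local distribution via findJarsDir().
// Both paths below are placeholders for wherever the Spark jars actually live.
conf.set("spark.yarn.jars", "hdfs:///apps/spark/jars/*.jar");
// ...or ship them as a single archive instead:
// conf.set("spark.yarn.archive", "hdfs:///apps/spark/spark-libs.zip");
```

Would that be enough, or do the nodes still need a full Spark distribution under SPARK_HOME?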