spark-issues mailing list archives

From "Patrick Wendell (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-5144) spark-yarn module should be published
Date Wed, 28 Jan 2015 10:21:35 GMT

     [ https://issues.apache.org/jira/browse/SPARK-5144?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Wendell resolved SPARK-5144.
------------------------------------
    Resolution: Duplicate

> spark-yarn module should be published
> -------------------------------------
>
>                 Key: SPARK-5144
>                 URL: https://issues.apache.org/jira/browse/SPARK-5144
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.2.0
>            Reporter: Aniket Bhatnagar
>
> We disabled publishing of certain modules in SPARK-3452; one such module is spark-yarn.
> This breaks applications that submit Spark jobs programmatically with the master set to
> yarn-client, because SparkContext depends on classes from the yarn-client module to submit
> the YARN application.
> Here is the stack trace you get if you submit a Spark job without the yarn-client dependency:
> 2015-01-07 14:39:22,799 [pool-10-thread-13] [info] o.a.s.s.MemoryStore - MemoryStore
started with capacity 731.7 MB
> Exception in thread "pool-10-thread-13" java.lang.ExceptionInInitializerError
> at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
> at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
> at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
> at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:232)
> at com.myimpl.Server:23)
> at scala.util.Success$$anonfun$map$1.apply(Try.scala:236)
> at scala.util.Try$.apply(Try.scala:191)
> at scala.util.Success.map(Try.scala:236)
> at com.myimpl.FutureTry$$anonfun$1.apply(FutureTry.scala:23)
> at com.myimpl.FutureTry$$anonfun$1.apply(FutureTry.scala:23)
> at scala.util.Success$$anonfun$map$1.apply(Try.scala:236)
> at scala.util.Try$.apply(Try.scala:191)
> at scala.util.Success.map(Try.scala:236)
> at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
> at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
> at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.spark.SparkException: Unable to load YARN support
> at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:199)
> at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:194)
> at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
> ... 27 more
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.deploy.yarn.YarnSparkHadoopUtil
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:190)
> at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:195)
> ... 29 more
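The root cause in the trace is that SparkHadoopUtil resolves its YARN support class by name at runtime, so a missing spark-yarn artifact compiles cleanly but fails only when the context starts. A minimal Java sketch of that reflective probe (the class name is taken from the trace above; the `YarnProbe` helper itself is illustrative, not Spark code):

```java
public class YarnProbe {
    // Spark's SparkHadoopUtil loads its YARN support class reflectively
    // (see the Class.forName frames in the stack trace above), so the
    // dependency is only checked at runtime, never at compile time.
    static final String YARN_UTIL =
            "org.apache.spark.deploy.yarn.YarnSparkHadoopUtil";

    public static boolean yarnSupportPresent() {
        try {
            // Mirrors the reflective lookup that throws
            // ClassNotFoundException in the report.
            Class.forName(YARN_UTIL);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(yarnSupportPresent());
    }
}
```

Run without the spark-yarn jar on the classpath, the probe returns false, which is exactly the condition that surfaces as the ExceptionInInitializerError above.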



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

