drill-user mailing list archives

From Kunal Khatua <kkha...@mapr.com>
Subject RE: Spark exception crashing application
Date Mon, 18 Sep 2017 18:07:18 GMT
The errors (NoClassDefFoundError) suggest a possible mismatch between the versions of the Spark
libraries bundled with Drill and those on the platform you are running on.

Can you start by verifying the versions of the Spark libraries you have, and then try rebuilding
Drill (editing the pom.xml) with the matching versions?
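
As a rough sketch (assuming the Spark version in Drill's pom.xml is exposed as a Maven property
such as spark.version -- check the pom for the actual name), the verification and rebuild could
look something like this:

    # Report the Spark version installed on the platform
    spark-submit --version

    # From the Drill source tree, list which Spark artifacts the build pulls in
    mvn dependency:tree -Dincludes=org.apache.spark

    # Rebuild after aligning the versions in pom.xml; if the version is exposed
    # as a Maven property (assumed here to be spark.version), it can also be
    # overridden on the command line
    mvn clean install -DskipTests -Dspark.version=<version-on-your-platform>

If the two versions already match, the next thing to check is whether the application's classpath
(e.g. Jetty's webapp classloader, which appears in the first trace) is picking up a different
Spark jar at runtime than the one used at build time.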

-----Original Message-----
From: Sing, Jasbir [mailto:jasbir.sing@accenture.com] 
Sent: Monday, September 18, 2017 3:35 AM
To: user@drill.apache.org
Subject: Spark exception crashing application

Hi,

I am intermittently getting the exceptions below, which crash the application and bring Jetty
down.

Can someone please help me understand the reason behind these exceptions?

Exception - 1

2017-09-18 03:23:40 [WARN ] []  @ ShutdownHookManager$1 : run : 56 - ShutdownHook '$anon$2'
failed, java.lang.NoClassDefFoundError: org/apache/spark/util/Utils$$anonfun$logUncaughtExceptions$1
java.lang.NoClassDefFoundError: org/apache/spark/util/Utils$$anonfun$logUncaughtExceptions$1
      at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1704)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
      at scala.util.Try$.apply(Try.scala:161)
      at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:216)
      at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.util.Utils$$anonfun$logUncaughtExceptions$1
      at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
      at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
      at org.eclipse.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:487)
      at org.eclipse.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:428)
      ... 8 more

Exception - 2


2017-09-18 03:23:40.374:WARN:osjuc.AbstractLifeCycle:FAILED org.spark-project.jetty.servlet.ServletHandler@3b67cdf9:
java.lang.NoClassDefFoundError: org/spark-project/jetty/servlet/FilterMapping
java.lang.NoClassDefFoundError: org/spark-project/jetty/servlet/FilterMapping
      at org.spark-project.jetty.servlet.ServletHandler.doStop(ServletHandler.java:229)
      at org.spark-project.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:89)
      at org.spark-project.jetty.server.handler.HandlerWrapper.doStop(HandlerWrapper.java:107)
      at org.spark-project.jetty.server.handler.ContextHandler.doStop(ContextHandler.java:815)
      at org.spark-project.jetty.servlet.ServletContextHandler.doStop(ServletContextHandler.java:160)
      at org.spark-project.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:89)
      at org.spark-project.jetty.server.handler.HandlerWrapper.doStop(HandlerWrapper.java:107)
      at org.spark-project.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:89)
      at org.spark-project.jetty.server.handler.HandlerCollection.doStop(HandlerCollection.java:250)
      at org.spark-project.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:89)
      at org.spark-project.jetty.server.handler.HandlerWrapper.doStop(HandlerWrapper.java:107)
      at org.spark-project.jetty.server.Server.doStop(Server.java:343)
      at org.spark-project.jetty.util.component.AbstractLifeCycle.stop(AbstractLifeCycle.java:89)
      at org.apache.spark.ui.WebUI.stop(WebUI.scala:152)
      at org.apache.spark.ui.SparkUI.stop(SparkUI.scala:85)
      at org.apache.spark.SparkContext$$anonfun$stop$2$$anonfun$apply$mcV$sp$2.apply(SparkContext.scala:1704)
      at org.apache.spark.SparkContext$$anonfun$stop$2$$anonfun$apply$mcV$sp$2.apply(SparkContext.scala:1704)
      at scala.Option.foreach(Option.scala:236)
      at org.apache.spark.SparkContext$$anonfun$stop$2.apply$mcV$sp(SparkContext.scala:1704)
      at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
      at org.apache.spark.SparkContext.stop(SparkContext.scala:1703)
      at org.apache.spark.SparkContext$$anonfun$3.apply$mcV$sp(SparkContext.scala:587)
      at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:264)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:234)
      at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:234)
      at scala.util.Try$.apply(Try.scala:161)
      at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:234)
      at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:216)
      at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)



