spark-issues mailing list archives

From "Dawson Choong (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-15368) Spark History Server does not pick up extraClasspath
Date Mon, 23 May 2016 18:25:12 GMT

    [ https://issues.apache.org/jira/browse/SPARK-15368?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15296818#comment-15296818 ]

Dawson Choong commented on SPARK-15368:
---------------------------------------

Thank you for the suggestions. [~jerryshao] may I ask if there is a more up-to-date alternative
for accomplishing this instead of using the deprecated {{SPARK_CLASSPATH}}? Thanks.
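For reference, a minimal sketch of one possible alternative: later Spark releases (2.3+) document {{SPARK_DAEMON_CLASSPATH}} in {{conf/spark-env.sh}} as the replacement for the deprecated {{SPARK_CLASSPATH}} for daemons such as the History Server. This assumes a Spark version newer than the 1.6.1 reported here, and the jar path below is purely illustrative, not taken from this thread:

```shell
# conf/spark-env.sh
# SPARK_DAEMON_CLASSPATH (available in Spark 2.3+) prepends entries to the
# classpath of Spark daemons (History Server, standalone Master/Worker).
# The jar path is a hypothetical example of a client library the daemon needs.
export SPARK_DAEMON_CLASSPATH="/opt/vendor/fs-client.jar:${SPARK_DAEMON_CLASSPATH}"
```

On 1.6.x, exporting the jars in spark-env (as in the workaround described below) remains the practical option, since {{SPARK_DAEMON_CLASSPATH}} did not exist yet.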

> Spark History Server does not pick up extraClasspath
> ----------------------------------------------------
>
>                 Key: SPARK-15368
>                 URL: https://issues.apache.org/jira/browse/SPARK-15368
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.6.1
>         Environment: HDP-2.4
> CentOS
>            Reporter: Dawson Choong
>
> We've encountered a problem where the Spark History Server is not picking up the {{spark.driver.extraClassPath}}
> parameter in the {{Custom spark-defaults}} inside Ambari. Because the needed JARs are not being
> picked up, this leads to {{ClassNotFoundException}}. (Our current workaround is to manually
> export the JARs in spark-env.)
> Log file:
> Spark Command: /usr/java/default/bin/java -Dhdp.version=2.4.0.0-169 -cp /usr/hdp/2.4.0.0-169/spark/sbin/../conf/:/usr/hdp/2.4.0.0-169/spark/lib/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar:/usr/hdp/2.4.0.0-169/spark/lib/datanucleus-core-3.2.10.jar:/usr/hdp/2.4.0.0-169/spark/lib/datanucleus-rdbms-3.2.9.jar:/usr/hdp/2.4.0.0-169/spark/lib/datanucleus-api-jdo-3.2.6.jar:/usr/hdp/current/hadoop-client/conf/
-Xms1g -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.history.HistoryServer
> ========================================
> 16/04/12 12:23:44 INFO HistoryServer: Registered signal handlers for [TERM, HUP, INT]
> 16/04/12 12:23:45 WARN NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
> 16/04/12 12:23:45 INFO SecurityManager: Changing view acls to: spark
> 16/04/12 12:23:45 INFO SecurityManager: Changing modify acls to: spark
> 16/04/12 12:23:45 INFO SecurityManager: SecurityManager: authentication disabled; ui
acls disabled; users with view permissions: Set(spark); users with modify permissions: Set(spark)
> Exception in thread "main" java.lang.reflect.InvocationTargetException
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> 	at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:235)
> 	at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
> Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.wandisco.fs.client.FusionHdfs
not found
> 	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
> 	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2638)
> 	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2651)
> 	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
> 	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
> 	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
> 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
> 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:362)
> 	at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1650)
> 	at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1657)
> 	at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:71)
> 	at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:49)
> 	... 6 more
> Caused by: java.lang.ClassNotFoundException: Class com.wandisco.fs.client.FusionHdfs
not found
> 	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
> 	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
> 	... 17 more



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

