spark-user mailing list archives

From Yash Sharma <yash...@gmail.com>
Subject Re: Could not find or load main class org.apache.spark.deploy.yarn.ExecutorLauncher
Date Wed, 22 Jun 2016 06:42:11 GMT
Ok, we moved to the next level :)

Could you share more info on the error? You can get the logs with the command:

yarn logs -applicationId application_1466568126079_0006
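
For context on the eventual fix: the thread below finds that spark.yarn.jar was pointed at the examples jar rather than the Spark assembly jar. A hedged sketch of the corrected spark-defaults.conf entry follows; the assembly jar file name and HDFS path are assumptions based on the paths in the logs and the usual layout of the Spark 1.6.1 distribution (lib/spark-assembly-*.jar), so verify them against your install.

```
# spark-defaults.conf (sketch): point spark.yarn.jar at the Spark assembly
# jar uploaded to HDFS, not at the examples jar. Exact file name may differ.
spark.yarn.jar  hdfs://master:9000/user/shihj/spark_lib/spark-assembly-1.6.1-hadoop2.6.0.jar
```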


On Wed, Jun 22, 2016 at 4:38 PM, 另一片天 <958943172@qq.com> wrote:

> shihj@master:/usr/local/spark/spark-1.6.1-bin-hadoop2.6$
> ./bin/spark-submit \
> > --class org.apache.spark.examples.SparkPi \
> > --master yarn-cluster \
> > --driver-memory 512m \
> > --num-executors 2 \
> > --executor-memory 512m \
> > --executor-cores 2 \
> >
> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
> 16/06/22 14:36:10 INFO RMProxy: Connecting to ResourceManager at master/
> 192.168.20.137:8032
> 16/06/22 14:36:10 INFO Client: Requesting a new application from cluster
> with 2 NodeManagers
> 16/06/22 14:36:10 INFO Client: Verifying our application has not requested
> more than the maximum memory capability of the cluster (8192 MB per
> container)
> 16/06/22 14:36:10 INFO Client: Will allocate AM container, with 896 MB
> memory including 384 MB overhead
> 16/06/22 14:36:10 INFO Client: Setting up container launch context for our
> AM
> 16/06/22 14:36:10 INFO Client: Setting up the launch environment for our
> AM container
> 16/06/22 14:36:10 INFO Client: Preparing resources for our AM container
> Java HotSpot(TM) Server VM warning: You have loaded library
> /tmp/libnetty-transport-native-epoll3453573359049032130.so which might have
> disabled stack guard. The VM will try to fix the stack guard now.
> It's highly recommended that you fix the library with 'execstack -c
> <libfile>', or link it with '-z noexecstack'.
> 16/06/22 14:36:11 INFO Client: Source and destination file systems are the
> same. Not copying
> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
> 16/06/22 14:36:11 WARN Client: Resource
> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
> added multiple times to distributed cache.
> 16/06/22 14:36:11 INFO Client: Uploading resource
> file:/tmp/spark-cf23c5a3-d3fb-4f98-9cd2-bbf268766bbc/__spark_conf__7248368026523433025.zip
> ->
> hdfs://master:9000/user/shihj/.sparkStaging/application_1466568126079_0006/__spark_conf__7248368026523433025.zip
> 16/06/22 14:36:13 INFO SecurityManager: Changing view acls to: shihj
> 16/06/22 14:36:13 INFO SecurityManager: Changing modify acls to: shihj
> 16/06/22 14:36:13 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users with view permissions: Set(shihj); users
> with modify permissions: Set(shihj)
> 16/06/22 14:36:13 INFO Client: Submitting application 6 to ResourceManager
> 16/06/22 14:36:13 INFO YarnClientImpl: Submitted application
> application_1466568126079_0006
> 16/06/22 14:36:14 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:14 INFO Client:
> client token: N/A
> diagnostics: N/A
> ApplicationMaster host: N/A
> ApplicationMaster RPC port: -1
> queue: default
> start time: 1466577373576
> final status: UNDEFINED
> tracking URL: http://master:8088/proxy/application_1466568126079_0006/
> user: shihj
> 16/06/22 14:36:15 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:16 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:17 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:18 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:19 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:20 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:21 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:22 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:23 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:24 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:25 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:26 INFO Client: Application report for
> application_1466568126079_0006 (state: ACCEPTED)
> 16/06/22 14:36:27 INFO Client: Application report for
> application_1466568126079_0006 (state: FAILED)
> 16/06/22 14:36:27 INFO Client:
> client token: N/A
> diagnostics: Application application_1466568126079_0006 failed 2 times due
> to AM Container for appattempt_1466568126079_0006_000002 exited with
>  exitCode: 1
> For more detailed output, check application tracking page:
> http://master:8088/proxy/application_1466568126079_0006/Then, click on
> links to logs of each attempt.
> Diagnostics: Exception from container-launch.
> Container id: container_1466568126079_0006_02_000001
> Exit code: 1
> Stack trace: ExitCodeException exitCode=1:
> at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
> at org.apache.hadoop.util.Shell.run(Shell.java:455)
> at
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
> at
> org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
> at
> org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
> at
> org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>
>
> Container exited with a non-zero exit code 1
> Failing this attempt. Failing the application.
> ApplicationMaster host: N/A
> ApplicationMaster RPC port: -1
> queue: default
> start time: 1466577373576
> final status: FAILED
> tracking URL:
> http://master:8088/cluster/app/application_1466568126079_0006
> user: shihj
> 16/06/22 14:36:27 INFO Client: Deleting staging directory
> .sparkStaging/application_1466568126079_0006
> Exception in thread "main" org.apache.spark.SparkException: Application
> application_1466568126079_0006 finished with failed status
> at org.apache.spark.deploy.yarn.Client.run(Client.scala:1034)
> at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1081)
> at org.apache.spark.deploy.yarn.Client.main(Client.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 16/06/22 14:36:27 INFO ShutdownHookManager: Shutdown hook called
> 16/06/22 14:36:27 INFO ShutdownHookManager: Deleting directory
> /tmp/spark-cf23c5a3-d3fb-4f98-9cd2-bbf268766bbc
>
>
>
> ------------------ Original Message ------------------
> *From:* "Yash Sharma";<yash360@gmail.com>;
> *Sent:* Wednesday, June 22, 2016, 2:34 PM
> *To:* "另一片天"<958943172@qq.com>;
> *Cc:* "Saisai Shao"<sai.sai.shao@gmail.com>; "user"<user@spark.apache.org>;
>
> *Subject:* Re: Could not find or load main class
> org.apache.spark.deploy.yarn.ExecutorLauncher
>
> Try with: --master yarn-cluster
>
> On Wed, Jun 22, 2016 at 4:30 PM, 另一片天 <958943172@qq.com> wrote:
>
>> ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master
>> yarn-client --driver-memory 512m --num-executors 2 --executor-memory 512m
>> --executor-cores 2
>> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
>> 10
>> Warning: Skip remote jar
>> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar.
>> java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
>> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>> at java.lang.Class.forName0(Native Method)
>> at java.lang.Class.forName(Class.java:348)
>> at org.apache.spark.util.Utils$.classForName(Utils.scala:174)
>> at
>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
>> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>>
>>
>> ------------------ Original Message ------------------
>> *From:* "Yash Sharma";<yash360@gmail.com>;
>> *Sent:* Wednesday, June 22, 2016, 2:28 PM
>> *To:* "另一片天"<958943172@qq.com>;
>> *Cc:* "Saisai Shao"<sai.sai.shao@gmail.com>; "user"<user@spark.apache.org>;
>>
>> *Subject:* Re: Could not find or load main class
>> org.apache.spark.deploy.yarn.ExecutorLauncher
>>
>> Or better, try with the master as yarn-cluster:
>>
>> ./bin/spark-submit \
>> --class org.apache.spark.examples.SparkPi \
>> --master yarn-cluster \
>> --driver-memory 512m \
>> --num-executors 2 \
>> --executor-memory 512m \
>> --executor-cores 2 \
>> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
>>
>> On Wed, Jun 22, 2016 at 4:27 PM, 另一片天 <958943172@qq.com> wrote:
>>
>>> Is it able to run on local mode ?
>>>
>>> What do you mean? Standalone mode?
>>>
>>>
>>> ------------------ Original Message ------------------
>>> *From:* "Yash Sharma";<yash360@gmail.com>;
>>> *Sent:* Wednesday, June 22, 2016, 2:18 PM
>>> *To:* "Saisai Shao"<sai.sai.shao@gmail.com>;
>>> *Cc:* "另一片天"<958943172@qq.com>; "user"<user@spark.apache.org>;
>>> *Subject:* Re: Could not find or load main class
>>> org.apache.spark.deploy.yarn.ExecutorLauncher
>>>
>>> Try providing the jar with the hdfs prefix. It's probably just because
>>> it's not able to find the jar on all nodes.
>>>
>>> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
>>>
>>> Is it able to run on local mode ?
>>>
>>> On Wed, Jun 22, 2016 at 4:14 PM, Saisai Shao <sai.sai.shao@gmail.com>
>>> wrote:
>>>
>>>> spark.yarn.jar (default: none): The location of the Spark jar file, in
>>>> case overriding the default location is desired. By default, Spark on YARN
>>>> will use a Spark jar installed locally, but the Spark jar can also be in a
>>>> world-readable location on HDFS. This allows YARN to cache it on nodes so
>>>> that it doesn't need to be distributed each time an application runs. To
>>>> point to a jar on HDFS, for example, set this configuration to
>>>> hdfs:///some/path.
>>>>
>>>> spark.yarn.jar is used for the Spark run-time system jar, i.e. the Spark
>>>> assembly jar, not the application jar (the examples assembly jar). In your
>>>> case you uploaded the examples assembly jar to HDFS; the Spark system
>>>> classes are not packed into it, so ExecutorLauncher cannot be found.
>>>>
>>>> Thanks
>>>> Saisai
>>>>
>>>> On Wed, Jun 22, 2016 at 2:10 PM, 另一片天 <958943172@qq.com> wrote:
>>>>
>>>>> shihj@master:/usr/local/spark/spark-1.6.1-bin-hadoop2.6$
>>>>> ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master
>>>>> yarn-client --driver-memory 512m --num-executors 2 --executor-memory 512m
>>>>> --executor-cores 2
>>>>> /user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar 10
>>>>> Warning: Local jar
>>>>> /user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar does not exist,
>>>>> skipping.
>>>>> java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi
>>>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>> at java.lang.Class.forName0(Native Method)
>>>>> at java.lang.Class.forName(Class.java:348)
>>>>> at org.apache.spark.util.Utils$.classForName(Utils.scala:174)
>>>>> at
>>>>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
>>>>> at
>>>>> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>>>>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>>>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>>>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>>> I get the error immediately.
>>>>> ------------------ Original Message ------------------
>>>>> *From:* "Yash Sharma";<yash360@gmail.com>;
>>>>> *Sent:* Wednesday, June 22, 2016, 2:04 PM
>>>>> *To:* "另一片天"<958943172@qq.com>;
>>>>> *Cc:* "user"<user@spark.apache.org>;
>>>>> *Subject:* Re: Could not find or load main class
>>>>> org.apache.spark.deploy.yarn.ExecutorLauncher
>>>>>
>>>>> How about supplying the jar directly in spark-submit:
>>>>>
>>>>> ./bin/spark-submit \
>>>>>> --class org.apache.spark.examples.SparkPi \
>>>>>> --master yarn-client \
>>>>>> --driver-memory 512m \
>>>>>> --num-executors 2 \
>>>>>> --executor-memory 512m \
>>>>>> --executor-cores 2 \
>>>>>> /user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
>>>>>
>>>>>
>>>>> On Wed, Jun 22, 2016 at 3:59 PM, 另一片天 <958943172@qq.com> wrote:
>>>>>
>>>>>> I configured this parameter in spark-defaults.conf:
>>>>>> spark.yarn.jar
>>>>>> hdfs://master:9000/user/shihj/spark_lib/spark-examples-1.6.1-hadoop2.6.0.jar
>>>>>>
>>>>>> Then I ran ./bin/spark-submit --class org.apache.spark.examples.SparkPi
>>>>>> --master yarn-client --driver-memory 512m --num-executors 2
>>>>>> --executor-memory 512m --executor-cores 2 10 and got:
>>>>>>
>>>>>>
>>>>>>
>>>>>>    - Error: Could not find or load main class
>>>>>>    org.apache.spark.deploy.yarn.ExecutorLauncher
>>>>>>
>>>>>> But when I don't configure that parameter, there is no error. Why? Is
>>>>>> that parameter only meant to avoid uploading the resource file (the jar
>>>>>> package)?
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>
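
A side note on why the wrong jar yields "Could not find or load main class": YARN launches the ApplicationMaster with a class (org.apache.spark.deploy.yarn.ExecutorLauncher for yarn-client mode) that must be present inside the jar spark.yarn.jar points at, and a jar is just a zip archive. The sketch below uses synthetic jar names and contents, not Spark's actual loading code, to show the lookup the JVM classloader effectively performs:

```python
# Hedged illustration: a class C is findable only if the jar contains the
# entry C.replace('.', '/') + '.class'. Jars here are built in memory.
import io
import zipfile

def jar_contains_class(jar_bytes: bytes, class_name: str) -> bool:
    """Return True if the jar holds the .class file for class_name."""
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(io.BytesIO(jar_bytes)) as jar:
        return entry in jar.namelist()

def make_jar(entries):
    """Build a toy in-memory jar containing the given (empty) entries."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as jar:
        for e in entries:
            jar.writestr(e, b"")
    return buf.getvalue()

# One jar packed like the Spark assembly, one like the examples jar.
assembly_jar = make_jar(["org/apache/spark/deploy/yarn/ExecutorLauncher.class"])
examples_jar = make_jar(["org/apache/spark/examples/SparkPi.class"])

launcher = "org.apache.spark.deploy.yarn.ExecutorLauncher"
print(jar_contains_class(assembly_jar, launcher))  # True
print(jar_contains_class(examples_jar, launcher))  # False
```

The examples jar only ships application classes such as SparkPi, so pointing spark.yarn.jar at it makes the launcher lookup fail exactly as in the logs above.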
