spark-user mailing list archives

From Hao Wang <wh.s...@gmail.com>
Subject Re: java.lang.NoClassDefFoundError: org/apache/spark/deploy/worker/Worker
Date Mon, 19 May 2014 15:32:04 GMT
I made a mistake: the machines in my cluster were running different JDKs. After I
unified the JDKs across all the nodes, the problem was solved.
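
In case anyone hits the same thing, here is roughly the check I used to spot the
mismatch. It is just a sketch: the hostnames are the workers from my hosts file
and it assumes passwordless ssh from the driver node.

# Print the first line of `java -version` on every worker so a mismatched JDK stands out.
for host in sing11 sing59 host122 host123 host124 host125; do
    echo "== $host =="
    ssh "$host" 'java -version 2>&1 | head -n 1'
done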

Regards,
Wang Hao(王灏)

CloudTeam | School of Software Engineering
Shanghai Jiao Tong University
Address: 800 Dongchuan Road, Minhang District, Shanghai, 200240
Email: wh.sjtu@gmail.com


On Sun, May 18, 2014 at 1:52 PM, Hao Wang <wh.sjtu@gmail.com> wrote:

>  Hi, all
>
> Spark version: bae07e3 [behind 1] fix different versions of commons-lang
> dependency and apache/spark#746 addendum
>
> I have six worker nodes, and four of them hit this NoClassDefFoundError when
> I use start-slaves.sh on my driver node. However, running ./bin/spark-class
> org.apache.spark.deploy.worker.Worker spark://MASTER_IP:PORT directly on the
> worker nodes works fine.
>
> I compiled the /spark directory on the driver node and distributed it to all
> the worker nodes. The paths on the different nodes are identical.
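>
> For distributing the build I use something along these lines (just a sketch;
> it assumes the same /home/wanghao/spark path and passwordless ssh on every
> worker):
>
> # Copy the built Spark tree from the driver to each worker, preserving the path.
> for host in sing11 sing59 host122 host123 host124 host125; do
>     rsync -az /home/wanghao/spark/ "$host":/home/wanghao/spark/
> done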
>
> Here is the log from one of the four failed worker nodes.
>
> Spark Command: java -cp ::/home/wanghao/spark/conf:/home/wanghao/spark/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop2.2.0.jar
> -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.worker.Worker
> spark://192.168.1.12:7077 --webui-port 8081
> ========================================
>
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/deploy/worker/Worker
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.deploy.worker.Worker
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
> Could not find the main class: org.apache.spark.deploy.worker.Worker. Program will exit.
>
> Here is my spark-env.sh:
>
> export SPARK_WORKER_MEMORY=1g
> export SPARK_MASTER_IP=192.168.1.12
> export SPARK_MASTER_PORT=7077
> export SPARK_WORKER_CORES=1
> export SPARK_WORKER_INSTANCES=2
>
> hosts file:
>
> 127.0.0.1       localhost
> 192.168.1.12    sing12
>
> # The following lines are desirable for IPv6 capable hosts
> ::1     ip6-localhost ip6-loopback
> fe00::0 ip6-localnet
> ff00::0 ip6-mcastprefix
> ff02::1 ip6-allnodes
> ff02::2 ip6-allrouters
>
> 192.168.1.11 sing11
> 192.168.1.59 sing59
>
> ###################
> # failed machines
> ###################
>
> 192.168.1.122 host122
> 192.168.1.123 host123
> 192.168.1.124 host124
> 192.168.1.125 host125
>
>
> Regards,
> Wang Hao(王灏)
>
> CloudTeam | School of Software Engineering
> Shanghai Jiao Tong University
> Address: 800 Dongchuan Road, Minhang District, Shanghai, 200240
> Email: wh.sjtu@gmail.com
>
