spark-user mailing list archives

From <Hussam_Jar...@Dell.com>
Subject Error: Could not find or load main class org.apache.spark.executor.CoarseGrainedExecutorBackend
Date Tue, 21 Jan 2014 00:44:06 GMT
Hi,

I am using Spark 0.8.0 with Hadoop 1.2.1 in standalone cluster mode, with 3 worker nodes and
1 master.

Can someone help me with this error I am getting when running my app on the Spark cluster?
Error: Could not find or load main class org.apache.spark.executor.CoarseGrainedExecutorBackend

The executor launch command on the worker node is:

Spark Executor Command: "java" "-cp" ":/opt/spark-0.8.0/conf:/opt/spark-0.8.0/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar"
"-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark"
"-Dspark.local.dir=/home/hadoop/spark" "-Xms49152M" "-Xmx49152M" "org.apache.spark.executor.CoarseGrainedExecutorBackend"
"akka://spark@poc1:54483/user/CoarseGrainedScheduler" "2" "poc3" "16"

I checked the logs on the Spark master as well as the Spark workers, but there is not much info beyond the above error.

Thanks,
Hussam
