Hi, you should include the jar file of your project, for example: conf.setJars(new String[]{"yourjarfilepath.jar"})
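A minimal sketch of what that looks like in the driver program (the master URL and jar path below are placeholders, not from this thread; it assumes a fat jar built with mvn package and a running cluster):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class JarExample {
    public static void main(String[] args) {
        // setJars ships the listed jars to every executor, so the
        // worker JVMs can load your classes (avoiding the
        // ClassNotFoundException). The path is a placeholder for the
        // fat jar produced by `mvn package`.
        SparkConf conf = new SparkConf()
            .setMaster("spark://10.35.23.13:7077")
            .setAppName("My app")
            .setJars(new String[]{"target/simple-project-1.0.jar"});

        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... build and run RDD operations as usual ...
        sc.stop();
    }
}
```

Alternatively, you can call sc.addJar("...") on the context after it is created; both routes distribute the jar to the executors.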

Joe
On Friday, May 2, 2014 7:39 AM, proofmoore [via Apache Spark User List] <[hidden email]> wrote:
Hello. I followed the "A Standalone App in Java" part of the tutorial: https://spark.apache.org/docs/0.8.1/quick-start.html

The Spark standalone cluster looks like it's running without a problem: http://i.stack.imgur.com/7bFv8.png

I have built a fat jar for running this Java app on the cluster. Before running mvn package, the project layout is:
   
    find .
    
    ./pom.xml
    ./src
    ./src/main
    ./src/main/java
    ./src/main/java/SimpleApp.java


The content of SimpleApp.java is:

     import org.apache.spark.api.java.*;
     import org.apache.spark.api.java.function.Function;
     import org.apache.spark.SparkConf;

     public class SimpleApp {
         public static void main(String[] args) {

             SparkConf conf = new SparkConf()
                 .setMaster("spark://10.35.23.13:7077")
                 .setAppName("My app")
                 .set("spark.executor.memory", "1g");

             JavaSparkContext sc = new JavaSparkContext(conf);
             String logFile = "/home/ubuntu/spark-0.9.1/test_data";
             JavaRDD<String> logData = sc.textFile(logFile).cache();

             long numAs = logData.filter(new Function<String, Boolean>() {
                 public Boolean call(String s) { return s.contains("a"); }
             }).count();

             System.out.println("Lines with a: " + numAs);
         }
     }
 
This program only works when the master is set with setMaster("local"). Otherwise I get this error: http://i.stack.imgur.com/doRSn.png

Thanks,
Ibrahim







View this message in context: Re: java.lang.ClassNotFoundException
Sent from the Apache Spark User List mailing list archive at Nabble.com.