spark-user mailing list archives

From İbrahim Rıza HALLAÇ <>
Subject java.lang.ClassNotFoundException
Date Thu, 01 May 2014 22:37:10 GMT

Hello. I followed the "A Standalone App in Java" part of the tutorial.
The Spark standalone cluster looks like it is running without a problem:
I have built a fat jar for running this Java app on the cluster. Before mvn package, the project layout is:

    find .
    ./pom.xml
    ./src
    ./src/main
    ./src/main/java
    ./src/main/java/
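For context, a fat jar is commonly produced with the maven-shade-plugin bound to the package phase. A minimal pom.xml fragment might look like the sketch below (the plugin version shown is an assumption, not taken from the original message):

    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-shade-plugin</artifactId>
          <version>2.2</version>
          <executions>
            <execution>
              <phase>package</phase>
              <goals>
                <goal>shade</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>

With this in place, mvn package emits a single jar containing the application classes together with their dependencies.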

The content of that file is:
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.api.java.function.Function;

    public class SimpleApp {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setMaster("spark://")
                    .setAppName("My app")
                    .set("spark.executor.memory", ...);
            JavaSparkContext sc = new JavaSparkContext(conf);
            String logFile = "/home/ubuntu/spark-0.9.1/test_data";
            JavaRDD<String> logData = sc.textFile(logFile).cache();
            long numAs = logData.filter(new Function<String, Boolean>() {
                public Boolean call(String s) {
                    return s.contains("a");
                }
            }).count();
            System.out.println("Lines with a: " + numAs);
        }
    }

This program only works when the master is set with setMaster("local"). Otherwise I get this error:
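For reference, a ClassNotFoundException on a Spark 0.9 standalone cluster (but not in local mode) usually means the application jar was never shipped to the executor JVMs, so the anonymous Function class cannot be loaded remotely. One common remedy is to pass the fat jar to SparkConf.setJars; a sketch, where the jar path is an assumed example rather than one from the original message:

    SparkConf conf = new SparkConf()
            .setMaster("spark://")   // master URL as in the original post
            .setAppName("My app")
            // Ship the assembled fat jar to the executors so they can load
            // the application classes; the path here is hypothetical.
            .setJars(new String[] {
                "target/simple-app-1.0-jar-with-dependencies.jar"
            });

Alternatively, JavaSparkContext.jarOfClass(SimpleApp.class) can locate the jar that contains the driver class at runtime.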