spark-user mailing list archives

From Laurent Thoulon <laurent.thou...@ldmobile.net>
Subject Packaging a spark job using maven
Date Mon, 12 May 2014 15:41:53 GMT
Hi, 

I'm quite new to Spark (and Scala), but has anyone ever successfully compiled and run a Spark
job using Java and Maven?
Packaging seems to go fine, but when I try to execute the job using

mvn package 
java -Xmx4g -cp target/jobs-1.4.0.0-jar-with-dependencies.jar my.jobs.spark.TestJob 
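
(The "jar-with-dependencies" suffix suggests the pom builds a single fat jar via the maven-assembly-plugin's prebuilt descriptor; the pom itself isn't shown here, so this is only an assumed sketch of that setup:)

<!-- Assumed packaging setup; the original pom is not shown. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- prebuilt descriptor that unpacks every dependency into one jar -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>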

I get the following error:
Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:115) 
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:136) 
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:142) 
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:150) 
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:155) 
at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:197) 
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:136) 
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470) 
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111) 
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104) 
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96) 
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126) 
at org.apache.spark.SparkContext.<init>(SparkContext.scala:139) 
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:47) 
at my.jobs.spark.TestJob.run(TestJob.java:56)
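
(This particular exception is a known symptom of fat-jar merging: several jars, Akka's among them, each ship a reference.conf, and if the assembly keeps only one of them, Akka's akka.version setting is lost. Assuming the build can be switched to the maven-shade-plugin, here is a sketch of a configuration that concatenates the reference.conf files instead of overwriting them:)

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- One known fix for "No configuration setting found for key 'akka.version'":
               append all duplicate reference.conf files rather than keeping only the first,
               so Akka's akka.version setting survives the merge. -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>reference.conf</resource>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>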


Here's the code, right up to line 56 (the line flagged in the trace):

SparkConf conf = new SparkConf()
        .setMaster("local[" + cpus + "]")                       // run locally with 'cpus' worker threads
        .setAppName(this.getClass().getSimpleName())
        .setSparkHome("/data/spark")
        .setJars(JavaSparkContext.jarOfClass(this.getClass()))  // ship the jar containing this class
        .set("spark.default.parallelism", String.valueOf(cpus * 2))
        .set("spark.executor.memory", "4g")
        .set("spark.storage.memoryFraction", "0.6")
        .set("spark.shuffle.memoryFraction", "0.3");
JavaSparkContext sc = new JavaSparkContext(conf);               // line 56, where the exception is thrown
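
(For reference, a hypothetical reconstruction of the surrounding class; the real TestJob isn't shown, so 'cpus' and the trivial job at the end are assumptions. It should run once the packaging issue is resolved:)

package my.jobs.spark;

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function2;

public class TestJob {

    public void run() {
        // Assumption: the original derives 'cpus' elsewhere in the class.
        int cpus = Runtime.getRuntime().availableProcessors();
        SparkConf conf = new SparkConf()
                .setMaster("local[" + cpus + "]")
                .setAppName(this.getClass().getSimpleName());
        JavaSparkContext sc = new JavaSparkContext(conf);
        // Trivial action just to prove the context works end to end.
        JavaRDD<Integer> nums = sc.parallelize(Arrays.asList(1, 2, 3, 4));
        int sum = nums.reduce(new Function2<Integer, Integer, Integer>() {
            public Integer call(Integer a, Integer b) {
                return a + b;
            }
        });
        System.out.println("sum = " + sum);
        sc.stop();
    }

    public static void main(String[] args) {
        new TestJob().run();
    }
}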


Thanks 
Regards, 
Laurent 
