Just a wild guess, but I had to exclude “javax.servlet.servlet-api” from my Hadoop dependencies to run a SparkContext.

In your build.sbt:

"org.apache.hadoop" % "hadoop-common" % “..." exclude("javax.servlet", "servlet-api"),
"org.apache.hadoop" % "hadoop-hdfs" % “..." exclude("javax.servlet", "servlet-api”)

(or whatever Hadoop deps you use)
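As a fuller sketch, a build.sbt dependency list with both exclusions could look like the following (the version numbers here are illustrative placeholders, not recommendations — use whatever versions your project needs):

```scala
// build.sbt — versions below are placeholders
libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core"    % "1.1.0",
  "org.apache.hadoop" %  "hadoop-common" % "2.4.0" exclude("javax.servlet", "servlet-api"),
  "org.apache.hadoop" %  "hadoop-hdfs"   % "2.4.0" exclude("javax.servlet", "servlet-api")
)
```

The point of the exclusion is that Hadoop pulls in an old javax.servlet servlet-api that conflicts with the Jetty/servlet classes Spark already ships with.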

If you're using Maven:

 <exclusions>
  <exclusion>
   <groupId>javax.servlet</groupId>
   <artifactId>servlet-api</artifactId>
  </exclusion>
 </exclusions>
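In full, a hadoop-common dependency with that exclusion in place might look like this (the version is a placeholder):

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>2.4.0</version>
  <exclusions>
    <exclusion>
      <groupId>javax.servlet</groupId>
      <artifactId>servlet-api</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Do the same for hadoop-hdfs or any other Hadoop artifact you depend on.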


On 31.10.2014, at 07:14, sivarani <whitefeathers.rs@gmail.com> wrote:

I tried running it, but it didn't work.

public static final SparkConf batchConf = new SparkConf();
String master = "spark://sivarani:7077";
String spark_home = "/home/sivarani/spark-1.0.2-bin-hadoop2/";
String jar = "/home/sivarani/build/Test.jar";
public static final JavaSparkContext batchSparkContext =
    new JavaSparkContext(master, "SparkTest", spark_home, new String[] { jar });

public static void main(String[] args) {
    runSpark(0, "TestSubmit");
}

public static void runSpark(int crit, String dataFile) {
    JavaRDD<String> logData = batchSparkContext.textFile(dataFile, 10);
    // flatMap -> mapToPair -> reduceByKey (word count)
    JavaPairRDD<String, Integer> counts = logData
        .flatMap(line -> Arrays.asList(line.split(" ")))
        .mapToPair(word -> new Tuple2<>(word, 1))
        .reduceByKey((a, b) -> a + b);
    List<Tuple2<String, Integer>> output1 = counts.collect();
}


This works fine with spark-submit, but when I tried to submit it through code with
LeadBatchProcessing.runSpark(0, "TestSubmit.csv");

I get the following error:

HTTP Status 500 - javax.servlet.ServletException:
org.apache.spark.SparkException: Job aborted due to stage failure: Task
0.0:0 failed 4 times, most recent failure: TID 29 on host 172.18.152.36
failed for unknown reason
Job aborted due to stage failure: Task 0.0:0 failed 4 times, most recent
failure: TID 29 on host 172.18.152.36 failed for unknown reason Driver
stacktrace:



Any advice on this?




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Submiting-Spark-application-through-code-tp17452p17797.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org