spark-user mailing list archives

From Shing Hing Man <mat...@yahoo.com.INVALID>
Subject Re: Spark 1.0.2 Can GroupByTest example be run in Eclipse without change
Date Sun, 07 Sep 2014 08:14:23 GMT
After looking at the source code of SparkConf.scala, I found the following solution.
Just set the following Java system property:

-Dspark.master=local
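
In Eclipse this goes into the Run Configuration under the Arguments tab, in the VM arguments box. If you prefer not to touch the launch configuration, the sketch below (my own illustration, not part of the example itself) sets the same property programmatically before SparkConf is constructed; SparkConf, when built with defaults, picks up any JVM system property whose name starts with "spark.":

// Hypothetical wrapper for running GroupByTest-style code locally from an IDE.
// Setting spark.master before SparkConf is created has the same effect as
// passing -Dspark.master=local on the JVM command line.
import org.apache.spark.{SparkConf, SparkContext}

object LocalGroupByRunner {
  def main(args: Array[String]): Unit = {
    System.setProperty("spark.master", "local")   // same effect as the VM argument
    val sparkConf = new SparkConf().setAppName("GroupBy Test")
    val sc = new SparkContext(sparkConf)          // reads spark.master from the system property
    // ... the body of GroupByTest would run here unchanged ...
    sc.stop()
  }
}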

Shing



On Monday, 1 September 2014, 22:09, Shing Hing Man <matmsh@yahoo.com.INVALID> wrote:
 


Hi, 

I have noticed that the GroupByTest example in
https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/GroupByTest.scala
has been changed to be run using spark-submit.
Previously, I set "local" as the first command line parameter, and this enabled me to run
GroupByTest in Eclipse.
val sc = new SparkContext(args(0), "GroupBy Test",
System.getenv("SPARK_HOME"), SparkContext.jarOfClass(this.getClass).toSeq)
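
For reference, this is roughly how I used to launch it from Eclipse (program arguments; the numeric values are just illustrative defaults):

GroupByTest local 2 1000 1000 2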


In the latest GroupByTest code, I cannot pass in "local" as the first command line parameter:
val sparkConf = new SparkConf().setAppName("GroupBy Test")
var numMappers = if (args.length > 0) args(0).toInt else 2
var numKVPairs = if (args.length > 1) args(1).toInt else 1000
var valSize = if (args.length > 2) args(2).toInt else 1000
var numReducers = if (args.length > 3) args(3).toInt else numMappers
val sc = new SparkContext(sparkConf)
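
For completeness, the only workaround I can see so far is to hard-code the master via SparkConf.setMaster, roughly as sketched below, but that is exactly the code change I would like to avoid:

val sparkConf = new SparkConf()
  .setAppName("GroupBy Test")
  .setMaster("local")   // hard-codes the master; fine in Eclipse, not something I want in the example
val sc = new SparkContext(sparkConf)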


Is there a way to specify "master=local" (maybe in an environment variable), so that I can
run the latest version of GroupByTest in Eclipse without changing the code?

Thanks in advance for your assistance!

Shing 