spark-user mailing list archives

From Wei Tan <w...@us.ibm.com>
Subject Re: best practice: write and debug Spark application in scala-ide and maven
Date Sun, 08 Jun 2014 06:01:57 GMT
Thank you all, Madhu, Gerard and Ryan. All your suggestions work.
Personally, I prefer running Spark locally in Eclipse for debugging
purposes.

Best regards,
Wei

---------------------------------
Wei Tan, PhD
Research Staff Member
IBM T. J. Watson Research Center
http://researcher.ibm.com/person/us-wtan



From:   Madhu <madhu@madhu.com>
To:     user@spark.incubator.apache.org
Date:   06/07/2014 05:21 PM
Subject:        Re: best practice: write and debug Spark application in scala-ide and maven



For debugging, I run locally inside Eclipse without Maven.
I just add the Spark assembly jar to my Eclipse project build path and click
'Run As... Scala Application'.
I have done the same with Java and ScalaTest; it's quick and easy.
I didn't see any third-party jar dependencies in your code, so that should
be sufficient for your example.
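For reference, a minimal sketch of the kind of entry point that 'Run As... Scala Application' would launch, assuming the Spark assembly jar is already on the Eclipse build path (the object name and the word-count logic here are illustrative, not from the original thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal self-contained Spark app that runs entirely inside the IDE.
// setMaster("local[*]") keeps execution in-process, so breakpoints set
// in Eclipse are hit directly, with no cluster or spark-submit step.
object DebugApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("DebugApp")
      .setMaster("local[*]")  // use "local[1]" for single-threaded stepping

    val sc = new SparkContext(conf)

    // A trivial word count, just to have something to step through.
    val counts = sc.parallelize(Seq("a", "b", "a"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .collect()

    counts.foreach(println)
    sc.stop()
  }
}
```

With `local[*]` as the master, the driver and executors share one JVM, which is what makes ordinary IDE breakpoints work without any remote-debug configuration.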



-----
Madhu
https://www.linkedin.com/in/msiddalingaiah
--


