Hi Eric,

Once you have installed IntelliJ IDEA, follow this simple post to get started with Scala. Now that you have an IDE that works with Scala, follow the steps below for Spark:

1. Install the sbt plugin: go to File -> Settings -> Plugins -> Install IntelliJ Plugins, search for sbt and install it.

2. After installing the sbt plugin, restart IntelliJ and start a new Scala sbt project (File -> New Project -> Scala -> SBT).

3. Now open the build.sbt file and add all the dependencies (here I'm adding the Spark 1.1.0 with Hadoop 2.4.0 dependency).
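The screenshot didn't survive, but a minimal build.sbt along these lines should work (the project name and version are placeholders; Spark 1.1.0 was built against Scala 2.10):

```scala
// build.sbt -- minimal sketch; name and version are placeholders
name := "spark-sandbox"

version := "1.0"

scalaVersion := "2.10.4"  // Spark 1.1.0 targets Scala 2.10

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.1.0",
  "org.apache.hadoop" % "hadoop-client" % "2.4.0"
)
```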

4. Now create a new Scala class under src -> main -> scala and type in your code.
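For reference, here is a minimal word-count sketch of the kind of class you'd create here (the object name and input path are placeholders). Setting the master to "local[*]" runs Spark inside the IDE's JVM, which is what makes step 6's debugging possible:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch of a Spark driver runnable from the IDE.
// Object name and file path are placeholders.
object WordCount {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark in this JVM, so the IDE debugger
    // can step through the driver code directly.
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val counts = sc.textFile("input.txt")   // placeholder path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}
```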

5. Right click and hit Run :)

6. Now, to debug: put breakpoints wherever you need execution to pause and go to Run -> Debug (or Shift + F9).
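This covers the local case you asked about: with the master set to "local[*]" everything runs in one JVM, so breakpoints hit directly. If you instead launch with spark-submit, you can attach IntelliJ's remote debugger via the standard JDWP agent option (a sketch; the class name, jar path, and port 5005 are placeholders):

```shell
# Suspend the driver JVM until a debugger attaches on port 5005,
# then connect with IntelliJ's Run -> Edit Configurations -> Remote.
spark-submit \
  --driver-java-options "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005" \
  --class WordCount target/scala-2.10/spark-sandbox_2.10-1.0.jar
```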

Hope it helps.

Best Regards

On Tue, Oct 28, 2014 at 2:33 AM, Eric Tanner <eric.tanner@justenough.com> wrote:
I am a Scala / Spark newbie (attending Paco Nathan's class).  

What I need is some advice on how to set up IntelliJ (or Eclipse) to attach the debugger to the executing process.  I know that this is not feasible if the code is executing within the cluster.  However, if Spark is running locally (on my laptop) I would like to attach the debugger to the Spark program running locally so I can step through it.

Any advice will be helpful.



Eric Tanner
Big Data Developer


15440 Laguna Canyon, Suite 100

Irvine, CA 92618



  +1 (951) 313-9274
  +1 (949) 706-0400
