spark-dev mailing list archives

From Cheng Lian <>
Subject Re: IntelliJ Runtime error
Date Sat, 04 Apr 2015 12:46:27 GMT
In general, I've found it a pain to build and run Spark inside IntelliJ 
IDEA. I guess most people resort to that approach so that they can 
leverage the integrated debugger to step through and/or learn Spark 
internals. A more convenient way I've been using recently is the remote 
debugging feature: by adding driver/executor Java options, you can build 
and start Spark applications/tests/daemons in the normal way and then 
attach the debugger to them. I was using this to debug 
HiveThriftServer2, and it worked perfectly.

Steps to enable remote debugging:

1. Menu "Run / Edit configurations..."
2. Click the "+" button, choose "Remote"
3. Choose "Attach" or "Listen" in "Debugger mode" according to your 
actual needs
4. Copy, edit, and add the Java options suggested in the dialog to 
`--driver-java-options` (there is no `--executor-java-options` flag; for 
executors, pass them via the `spark.executor.extraJavaOptions` 
configuration instead)
5. If you're using attach mode, first start your Spark program, then 
start the remote debugging session in IDEA
6. If you're using listen mode, first start the remote debugging session 
in IDEA, then start your Spark program.
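As a concrete sketch of step 4 in attach mode: the JDWP agent string 
below is the standard one IDEA suggests in the "Remote" run 
configuration dialog (port 5005 is IDEA's default); the class and jar 
names are placeholders for your own application.

```shell
# Attach-mode sketch: the driver JVM opens a debug socket and, because of
# suspend=y, waits for the IDEA debugger to connect on port 5005 before
# running any application code. Class and jar names are hypothetical.
spark-submit \
  --class org.example.MyApp \
  --driver-java-options "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005" \
  target/myapp.jar
```

For listen mode, the dialog suggests a `server=n` variant of the agent 
string pointing at the machine where IDEA is listening; in that case 
start the IDEA session first, as in step 6.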

Hope this can be helpful.


On 4/4/15 12:54 AM, sara mustafa wrote:
> Thank you, it works with me when I changed the dependencies from provided to
> compile.

