spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Building spark submodule source code
Date Mon, 21 Mar 2016 06:23:04 GMT
Have a look at the IntelliJ setup guide:
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-IntelliJ
Once you have that set up, you won't need to recompile everything
each time.

Thanks
Best Regards

On Mon, Mar 21, 2016 at 8:14 AM, Tenghuan He <tenghuanhe@gmail.com> wrote:

> Hi everyone,
>
>     I am trying to add a new method to Spark's RDD. After changing the code
> in RDD.scala, I ran the following command:
>     mvn -pl :spark-core_2.10 -DskipTests clean install
>     The build reports BUILD SUCCESS, but when I start bin\spark-shell, my
> method cannot be found.
>     Do I have to rebuild the whole Spark project, instead of just the
> spark-core submodule, for the changes to take effect?
>     Rebuilding the whole project is too time-consuming; is there a better
> option?
>
>
> Thanks & Best Regards
>
> Tenghuan He
>
>
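For readers hitting the same problem outside an IDE: in Spark 1.x, bin\spark-shell runs against the assembly jar, so installing a rebuilt spark-core into the local Maven repository is not, by itself, enough for the shell to see the change. A rough command-line sketch follows; the module names assume a Scala 2.10 build of that era, so adjust them to your checkout.

```shell
# Rebuild only the changed core module and install it into the
# local Maven repository (the command from the original question).
mvn -pl :spark-core_2.10 -DskipTests clean install

# Then repackage the assembly module, which bundles spark-core into
# the jar that bin/spark-shell actually runs against.
# (Module name is an assumption for a Spark 1.x / Scala 2.10 build.)
mvn -pl :spark-assembly_2.10 -DskipTests package
```

This avoids a full `mvn clean install` at the top level; only the changed module and the assembly are rebuilt.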
