spark-user mailing list archives

From Jakob Odersky <>
Subject Re: Building spark submodule source code
Date Mon, 21 Mar 2016 19:08:57 GMT
Another gotcha to watch out for is the SPARK_* environment variables.
Have you exported SPARK_HOME? If so, 'spark-shell' will use the Spark
installation that the variable points to, regardless of where the
script is called from. That is, if SPARK_HOME points to a release
version of Spark, your code changes will never be picked up by simply
running 'spark-shell'.
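A minimal sketch of the gotcha described above (the release path is hypothetical, purely for illustration):

```shell
# SPARK_HOME pointing at a release install (hypothetical path) would make
# spark-shell use that release instead of your rebuilt spark-core.
export SPARK_HOME="$HOME/spark-1.6.0-bin-hadoop2.6"

# Unset it so bin/spark-shell resolves Spark relative to its own location
# in the source tree, picking up the locally built jars.
unset SPARK_HOME
echo "SPARK_HOME is now: ${SPARK_HOME:-<unset>}"
```

Running bin/spark-shell from the source root after unsetting the variable should make the freshly built classes visible.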

On Sun, Mar 20, 2016 at 11:23 PM, Akhil Das <> wrote:
> Have a look at the IntelliJ setup.
> Once you have that setup ready, you don't have to recompile the whole
> project every time.
> Thanks
> Best Regards
> On Mon, Mar 21, 2016 at 8:14 AM, Tenghuan He <> wrote:
>> Hi everyone,
>>     I am trying to add a new method to spark RDD. After changing the code
>> of RDD.scala and running the following command
>>     mvn -pl :spark-core_2.10 -DskipTests clean install
>>     The build succeeds (BUILD SUCCESS); however, when starting
>> bin/spark-shell, my method cannot be found.
>>     Do I have to rebuild the whole Spark project instead of just the
>> spark-core submodule to make the changes work?
>>     Rebuilding the whole project is too time-consuming; is there any
>> better choice?
>> Thanks & Best Regards
>> Tenghuan He
