spark-dev mailing list archives

From Evan Chan <...@ooyala.com>
Subject Re: Development environments
Date Wed, 09 Oct 2013 07:45:57 GMT
Once you have compiled everything the first time using SBT (assembly will
do that for you), successive runs of assembly are much faster.  I just did
it on my MacBook Pro in about 36 seconds.

Building from IntelliJ or another IDE is wasted effort, because the IDE
writes compiled classes to a different output directory than SBT does.
Maybe there's some way to symlink them.
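One way the symlink idea could look (a sketch under assumptions: the
directory names below are illustrative placeholders, not Spark's actual
layout, and you should check what output path your IDE is configured to
use before trying this):

```shell
# Hedged sketch: let the IDE and sbt share one set of compiled classes
# by symlinking the IDE's output directory to sbt's class output.
# "project-root", "out/production", and the Scala version are assumed
# placeholder names, not verified against Spark's build.
mkdir -p project-root/target/scala-2.9.3/classes    # sbt's class output
rm -rf project-root/out/production                  # IDE's default output dir
mkdir -p project-root/out
ln -s "$(pwd)/project-root/target/scala-2.9.3/classes" \
      project-root/out/production
```

After this, classes the IDE compiles and classes sbt compiles land in
the same directory, so neither tool starts from scratch.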

-Evan



On Tue, Oct 8, 2013 at 6:29 AM, Markus Losoi <markus.losoi@gmail.com> wrote:

> > Hi Markus,
>
> > have a look at the bottom of this wiki page:
>
> > https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark
>
> > IntelliJ IDEA seems to be quite popular (which I am using myself),
> > although Eclipse should work fine, too. There is another sbt plugin for
> > generating Eclipse project files.
>
> The IDE seems to work nicely, but what is the fastest way to build Spark?
> If I make a change to the "core" module and choose "Make Module 'core'"
> from the "Build" menu in IntelliJ IDEA, then the IDE compiles the source
> code. To create the "spark-assembly-0.8.0-incubating-hadoop1.0.4.jar" JAR
> file, I have to run "sbt assembly" on the command line. However, this
> takes an impractically long time (843 s when I last ran it on my
> workstation with an Intel Core 2 Quad Q9400 and 8 GB of RAM). Is there any
> faster way?
>
> Best regards,
> Markus Losoi (markus.losoi@gmail.com)
>
>


--
Evan Chan
Staff Engineer
ev@ooyala.com

<http://www.ooyala.com/>
