spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Build times for Spark
Date Fri, 25 Apr 2014 21:09:38 GMT
You can always increase the sbt memory by setting

export JAVA_OPTS="-Xmx10g"
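A minimal sketch of the suggestion above. One caveat: sbt launcher scripts vary in which environment variable they read (some honor JAVA_OPTS, others SBT_OPTS), so exporting both is a safe bet; the 10g value is just the figure suggested here, not a requirement.

```shell
# Give the sbt JVM a larger heap before invoking the build.
# Some sbt launchers read JAVA_OPTS, others SBT_OPTS, so set both.
export JAVA_OPTS="-Xmx10g"
export SBT_OPTS="-Xmx10g"
# then run the build, e.g.: sbt/sbt assembly
```

These must be exported in the same shell session (or profile) from which you launch sbt, or the launcher will not see them.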

Thanks
Best Regards


On Sat, Apr 26, 2014 at 2:17 AM, Williams, Ken
<Ken.Williams@windlogics.com> wrote:

>  No, I haven't done any configuration for SBT.  Is there somewhere you
> could point me to learn how to do that?
>
>
>
> -Ken
>
>
>
> *From:* Josh Rosen [mailto:rosenville@gmail.com]
> *Sent:* Friday, April 25, 2014 3:27 PM
> *To:* user@spark.apache.org
> *Subject:* Re: Build times for Spark
>
>
>
> Did you configure SBT to use the extra memory?
>
>
>
> On Fri, Apr 25, 2014 at 12:53 PM, Williams, Ken <
> Ken.Williams@windlogics.com> wrote:
>
> I've cloned the GitHub repo and I'm building Spark on a fairly beefy
> machine (24 CPUs, 78 GB of RAM), and it takes quite a long time.
>
>
>
> For instance, today I did a 'git pull' for the first time in a week or
> two, and then doing 'sbt/sbt assembly' took 43 minutes of wallclock time
> (88 minutes of CPU time).  After that, I did 'SPARK_HADOOP_VERSION=2.2.0
> SPARK_YARN=true sbt/sbt assembly' and that took 25 minutes wallclock, 73
> minutes CPU.
>
>
>
> Is that typical?  Or does that indicate some setup problem in my
> environment?
>
>
>
> --
>
> Ken Williams, Senior Research Scientist
>
> *WindLogics*
>
> http://windlogics.com
>
>
>
>
>  ------------------------------
>
>
> CONFIDENTIALITY NOTICE: This e-mail message is for the sole use of the
> intended recipient(s) and may contain confidential and privileged
> information. Any unauthorized review, use, disclosure or distribution of
> any kind is strictly prohibited. If you are not the intended recipient,
> please contact the sender via reply e-mail and destroy all copies of the
> original message. Thank you.
>
>
>
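For reference, the two builds Ken describes can be expressed as a small helper that composes the command line. `spark_build_cmd` is a hypothetical name used only for illustration; the environment variables (`SPARK_HADOOP_VERSION`, `SPARK_YARN`) are the ones quoted in the thread, which applied to Spark's sbt build at the time.

```shell
# Sketch: compose the sbt assembly command described in the thread.
# spark_build_cmd is a hypothetical helper, not part of Spark.
spark_build_cmd() {
  local hadoop_version="$1" yarn="$2"
  if [ -n "$hadoop_version" ]; then
    # Build against a specific Hadoop version, optionally with YARN.
    printf 'SPARK_HADOOP_VERSION=%s SPARK_YARN=%s sbt/sbt assembly\n' \
      "$hadoop_version" "$yarn"
  else
    # Plain assembly build with default settings.
    printf 'sbt/sbt assembly\n'
  fi
}

spark_build_cmd              # plain build
spark_build_cmd 2.2.0 true   # Hadoop 2.2.0 + YARN build
```

Printing the command rather than running it keeps the sketch self-contained; in practice you would run the emitted line from the Spark source directory.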
