Thanks Fred. So far I was only able to build 2.1 with Java 7 and no zinc.
Will try the options you suggest. FYI, building with sbt ends up in an OOM even with Java 7.
I will try and update this thread.

On 6 Oct 2016 8:58 pm, "Fred Reiss" <> wrote:
There's no option to prevent build/mvn from starting the zinc server, but you should be able to prevent the maven build from using the zinc server by changing the <useZincServer> option at line 1935 of the master pom.xml. 
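A quick way to flip that flag without opening an editor is a one-line sed; the sketch below runs against a scratch copy so it is safe to try anywhere (it assumes the pom contains a literal <useZincServer>true</useZincServer> element, as Fred describes — in a real tree you would point the sed at Spark's own pom.xml):

```shell
# Demo on a scratch file; for the real build, target pom.xml in the Spark root.
printf '<useZincServer>true</useZincServer>\n' > /tmp/pom-snippet.xml

# Flip the zinc flag from true to false (-i.bak keeps a backup copy).
sed -i.bak 's|<useZincServer>true</useZincServer>|<useZincServer>false</useZincServer>|' /tmp/pom-snippet.xml

cat /tmp/pom-snippet.xml
```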

Note that the zinc-based compile works on my Ubuntu 16.04 box. You might be able to get zinc-based compiles working by tweaking your settings. A few things to try:
-- Make sure another build hasn't left a second, incompatible copy of zinc squatting on the port that Spark expects to use
-- Try setting the environment variable JAVA_7_HOME to point to an OpenJDK 7 installation. build/mvn runs zinc with Java 7 if that is available.

Note that setting JAVA_7_HOME will break incremental compilation for sbt-based builds. Use that environment variable with restraint.
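Putting the two suggestions together, a pre-build check might look like the sketch below. Both the port number (3030 is a common zinc default — check your build/mvn for the actual value) and the JDK path are assumptions to adjust for your machine:

```shell
# 1. Make sure no stale zinc (or anything else) is holding the port Spark
#    expects to use. Port 3030 is an assumption; verify against build/mvn.
if command -v nc >/dev/null && nc -z localhost 3030; then
  echo "something is already listening on port 3030 -- kill it before building"
fi

# 2. Point build/mvn at an OpenJDK 7 install so it runs zinc under Java 7.
#    The path below is an example; adjust to your system.
export JAVA_7_HOME=/usr/lib/jvm/java-7-openjdk-amd64
```

Remember to unset JAVA_7_HOME again before any sbt-based build, per the caveat above.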


On Thu, Oct 6, 2016 at 2:22 AM, Marco Mistroni <> wrote:

Thanks Fred
The build/mvn will trigger compilation using zinc, and I want to avoid that as every time I have tried it, it runs into errors while compiling spark core. How can I disable zinc by default?

On 5 Oct 2016 10:53 pm, "Fred Reiss" <> wrote:
Actually the memory options *are* required for Java 1.8. Without them the build will fail intermittently. We just updated the documentation with regard to this fact in Spark 2.0.1. Relevant PR is here:

Your best bet as the project transitions from Java 7 to Java 8 is to use the scripts build/mvn and build/sbt, which should be updated on a regular basis with safe JVM options.
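In practice that advice boils down to something like the following, run from the top of a Spark source checkout (the export is the fallback for a bare mvn run, per the options quoted later in this thread; the guard keeps the sketch harmless if run outside a Spark tree):

```shell
# Safe fallback in case a plain mvn is invoked anywhere in the build.
export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"

# Preferred: let the wrapper script pick its own JVM options.
if [ -x build/mvn ]; then
  ./build/mvn -Pyarn -Dscala-2.11 -DskipTests clean package
fi
```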


On Wed, Oct 5, 2016 at 1:40 AM, Marco Mistroni <> wrote:

Thanks Richard. It also says that for Java 1.8 the MAVEN_OPTS are not required... unless I misinterpreted the instructions...

On 5 Oct 2016 9:20 am, "Richard Siebeling" <> wrote:
sorry, now with the link included, see 

On Wed, Oct 5, 2016 at 10:19 AM, Richard Siebeling <> wrote:

did you set the following option: export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"

kind regards,

On Tue, Oct 4, 2016 at 10:21 PM, Marco Mistroni <> wrote:
Hi all
my mvn build of Spark 2.1 using Java 1.8 is running out of memory, with an error saying it cannot allocate enough memory during Maven compilation

The instructions (on the Spark 2.0 page) say that MAVEN_OPTS is not needed for Java 1.8 and, according to my understanding, the Spark build process will add it
during the build via mvn.
Note: I am not using zinc. Rather, I am using my own Maven version (3.3.9), launching this command from the main Spark directory. The same build works when I use Java 1.7 (and MAVEN_OPTS):

mvn -Pyarn -Dscala-2.11 -DskipTests clean package

Could anyone assist?