Thanks Fred for the pointers... so far I was only able to build 2.1 with Java 7 and no zinc.
Will try the options you suggest. FYI, building with sbt ends up in OOM even with Java 7.
I will try and update this thread.
There's no option to prevent build/mvn from starting the zinc server, but you should be able to prevent the Maven build from using the zinc server by changing the <useZincServer> option at line 1935 of the master pom.xml.

Note that the zinc-based compile works on my Ubuntu 16.04 box. You might be able to get zinc-based compiles working by tweaking your settings. A few things to try:

-- Make sure another build hasn't left a second, incompatible copy of zinc squatting on the port that Spark expects to use
-- Try setting the environment variable JAVA_7_HOME to point to an OpenJDK 7 installation. build/mvn runs zinc with Java 7 if that is available.

Note that setting JAVA_7_HOME will break incremental compilation for sbt-based builds. Use that environment variable with restraint.

Fred

On Thu, Oct 6, 2016 at 2:22 AM, Marco Mistroni <email@example.com> wrote:
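Fred's JAVA_7_HOME suggestion can be sketched as follows. The JDK path below is just a typical Ubuntu location, not something from the thread; adjust it to wherever your OpenJDK 7 lives:

```shell
# Point build/mvn at a Java 7 install so it runs zinc under Java 7.
# Example path for Ubuntu; your installation directory may differ.
export JAVA_7_HOME=/usr/lib/jvm/java-7-openjdk-amd64
echo "$JAVA_7_HOME"

# Per Fred's caveat: unset this before sbt-based builds, or incremental
# compilation breaks:
#   unset JAVA_7_HOME
```
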
The build/mvn script will trigger compilation using zinc, and I want to avoid that, as every time I have tried it runs into errors while compiling spark core. How can I disable zinc by default?
Kr

On 5 Oct 2016 10:53 pm, "Fred Reiss" <firstname.lastname@example.org> wrote:

Actually the memory options *are* required for Java 1.8. Without them the build will fail intermittently. We just updated the documentation with regard to this fact in Spark 2.0.1. Relevant PR is here: https://github.com/apache/spark/pull/15005

Your best bet as the project transitions from Java 7 to Java 8 is to use the scripts build/mvn and build/sbt, which should be updated on a regular basis with safe JVM options.

Fred

On Wed, Oct 5, 2016 at 1:40 AM, Marco Mistroni <email@example.com> wrote:
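Fred's advice to prefer the wrapper scripts looks roughly like this in practice. The build flags are the ones Marco uses elsewhere in this thread; the existence check is only so the sketch is safe to run outside a Spark checkout:

```shell
# Prefer the project's wrapper over a system Maven so the build picks up
# the JVM options maintained by the Spark project (flags from the thread).
BUILD_CMD="./build/mvn -Pyarn -Dscala-2.11 -DskipTests clean package"

if [ -x build/mvn ]; then
  $BUILD_CMD
else
  echo "build/mvn not found: run this from the top of a Spark checkout"
fi
```
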
Thanks Richard. It also says that for Java 1.8 the MAVEN_OPTS are not required... unless I misinterpreted the instructions...
Kr

On 5 Oct 2016 9:20 am, "Richard Siebeling" <firstname.lastname@example.org> wrote:

sorry, now with the link included, see http://spark.apache.org/docs/latest/building-spark.html

On Wed, Oct 5, 2016 at 10:19 AM, Richard Siebeling <email@example.com> wrote:

Hi,

did you set the following option: export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"

kind regards,
Richard

On Tue, Oct 4, 2016 at 10:21 PM, Marco Mistroni <firstname.lastname@example.org> wrote:

Hi all,

my mvn build of Spark 2.1 using Java 1.8 is spinning out of memory, with an error saying it cannot allocate enough memory during Maven compilation. The instructions (in the Spark 2.0 page) say that MAVEN_OPTS is not needed for Java 1.8 and, according to my understanding, the Spark build process will add it during the build via mvn.

Note: I am not using zinc. Rather, I am using my own Maven version (3.3.9), launching this command from the main Spark directory. The same build works when I use Java 1.7 (and MAVEN_OPTS):

mvn -Pyarn -Dscala-2.11 -DskipTests clean package

Could anyone assist?

kr
marco
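Richard's MAVEN_OPTS suggestion, combined with the build command from this thread, can be sketched as below. The -Xmx2g value is the one quoted in the thread; tune it to your machine's memory:

```shell
# Give Maven the heap and code-cache headroom recommended in the thread.
export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
echo "$MAVEN_OPTS"

# Then build from the top of the Spark source tree (flags from the thread):
#   mvn -Pyarn -Dscala-2.11 -DskipTests clean package
```
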