spark-user mailing list archives

From 林武康 <vboylin1...@gmail.com>
Subject Re: unable to build spark - sbt/sbt: line 50: killed
Date Sat, 22 Mar 2014 12:03:28 GMT
Building Spark needs a lot of memory; I think you should make -Xmx larger, 2g for example.
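
In the 0.9.0 tree the heap flags are hard-coded in the sbt/sbt launcher itself (the java invocation quoted in the error below), so one way is to edit that line. A sketch only, assuming the flags appear literally in the script, as the error output suggests, and that the VM has enough free RAM for the larger heap:

$ grep -n Xmx sbt/sbt                       # confirm where the heap flag lives
$ sed -i 's/-Xmx1900m/-Xmx2g/' sbt/sbt      # raise the heap to 2g
$ sbt/sbt assembly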

-----Original Message-----
From: "Bharath Bhushan" <manku.timma@outlook.com>
Sent: 2014/3/22 12:50
To: "user@spark.apache.org" <user@spark.apache.org>
Subject: unable to build spark - sbt/sbt: line 50: killed

I am getting the following error when trying to build Spark. I tried various sizes for -Xmx and other memory-related arguments on the java command line, but the assembly command still fails.

$ sbt/sbt assembly
...
[info] Compiling 298 Scala sources and 17 Java sources to /vagrant/spark-0.9.0-incubating-bin-hadoop2/core/target/scala-2.10/classes...
sbt/sbt: line 50: 10202 Killed                  java -Xmx1900m -XX:MaxPermSize=1000m -XX:ReservedCodeCacheSize=256m -jar ${JAR} "$@"
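
A bare "Killed" from the shell, with no Java OutOfMemoryError or stack trace, means the JVM received SIGKILL from outside the process, commonly the Linux OOM killer on a memory-starved VM. A quick check (a sketch, assuming standard Linux tooling inside the Vagrant box):

$ free -m                                              # how much RAM the VM actually has
$ dmesg | grep -i -E 'killed process|out of memory'    # kernel OOM-killer entries, if any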

Versions of software:
Spark: 0.9.0 (hadoop2 binary)
Scala: 2.10.3
Ubuntu: Ubuntu 12.04.4 LTS - Linux vagrant-ubuntu-precise-64 3.2.0-54-generic
Java: 1.6.0_45 (oracle java 6)

I can still use the binaries in bin/, but I was just trying to check whether "sbt/sbt assembly" works.

-- Thanks