spark-user mailing list archives

From Chester Chen <chesterxgc...@yahoo.com>
Subject Re: "sbt/sbt run" command returns a JVM problem
Date Thu, 01 May 2014 13:57:05 GMT
You might want to check the memory settings in sbt itself, which is a shell script that runs a
java command. I don't have a computer at hand, but if you vim or cat sbt/sbt, you should
see the memory settings, and you can change them to fit your needs.
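For reference, the java invocation near the bottom of sbt/sbt usually looks something like
the block below. The exact flags and default values differ between Spark and sbt versions,
so treat these numbers and the $JAR_FILE variable as placeholders rather than what your
copy actually contains:

    # example only -- your sbt/sbt may use different flags, defaults, and variable names
    java \
      -Xms512m -Xmx1024m \
      -XX:MaxPermSize=256m -XX:ReservedCodeCacheSize=128m \
      -jar "$JAR_FILE" "$@"

On a 2G machine, lowering -Xmx (say, to 512m) is usually enough to let the JVM start.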

You might also be able to override the settings via .sbtopts without changing the script, but
google it to be sure.
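For example, assuming your sbt launcher script reads a .sbtopts file from the project
directory (the standard sbt bash script does, but check yours), something like this passes
smaller heap settings to the JVM without touching sbt/sbt at all:

    # .sbtopts -- one option per line; the -J prefix forwards the rest to the JVM
    -J-Xms256m
    -J-Xmx512m
    -J-XX:MaxPermSize=128m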

Chester

Sent from my iPhone

On May 1, 2014, at 6:47 AM, Carter <gyzhen@hotmail.com> wrote:

> Hi, I have a very simple spark program written in Scala:
> /*** testApp.scala ***/
> object testApp {
>  def main(args: Array[String]) {
>    println("Hello! World!")
>  }
> }
> Then I use the following command to compile it:
> $ sbt/sbt package
> The compilation finished successfully and I got a JAR file.
> But when I use this command to run it:
> $ sbt/sbt run
> it returned a JVM error:
> [info] Error occurred during initialization of VM 
> [info] Could not reserve enough space for object heap 
> [error] Error: Could not create the Java Virtual Machine. 
> [error] Error: A fatal exception has occurred. Program will exit. 
> java.lang.RuntimeException: Nonzero exit code returned from runner: 1    
> at scala.sys.package$.error(package.scala:27)
> 
> My machine has 2G memory and runs Ubuntu 11.04. I also tried changing the Java
> parameters (e.g., -Xmx, -Xms, -XX:MaxPermSize, -XX:ReservedCodeCacheSize) in the file
> sbt/sbt, but it looks like none of the changes work. Can anyone help me out with this
> problem? Thank you very much.
> 
> 
> 
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/sbt-sbt-run-command-returns-a-JVM-problem-tp5157.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
