spark-user mailing list archives

From Chester Chen <>
Subject Re: "sbt/sbt run" command returns a JVM problem
Date Thu, 01 May 2014 14:38:58 GMT
Here are the options defined in sbt/sbt:

JAVA_OPTS          environment variable, if unset uses "$java_opts"
SBT_OPTS           environment variable, if unset uses "$default_sbt_opts"
.sbtopts           if this file exists in the current directory, it is
prepended to the runner args
/etc/sbt/sbtopts   if this file exists, it is prepended to the runner args
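For example, a minimal `.sbtopts` in the project directory could look something like the following (the `-J` prefix passes an option through to the JVM in the sbt launcher script Spark ships; the exact values here are illustrative, not recommendations):

```shell
# .sbtopts -- one option per line, prepended to the sbt runner args
# -J passes the option through to the JVM (values illustrative; tune to your machine)
-J-Xms256m
-J-Xmx512m
-J-XX:MaxPermSize=128m
```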

On Thursday, May 1, 2014 7:17 AM, Sean Owen <> wrote:
Here's how I configure SBT, which I think is the usual way:

export SBT_OPTS="-XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256m -Xmx1g"

See if that takes. But your error is that you're already asking for
too much memory for your machine. So maybe you are setting the value
successfully, but it's not valid. How big?
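As a concrete sketch for a machine with around 2 GB of RAM, you could try something like the following before launching sbt (the specific sizes are guesses to be tuned, not known-good values):

```shell
# Request a heap the machine can actually grant
# (values are illustrative for a machine with ~2 GB of RAM)
export SBT_OPTS="-Xmx512m -XX:MaxPermSize=128m -XX:ReservedCodeCacheSize=64m"

# Confirm what the sbt launcher will pick up, then run
echo "$SBT_OPTS"
```

If the JVM still fails to start, lower -Xmx further; "Could not reserve enough space for object heap" means the requested heap exceeds what the OS can hand out.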

On Thu, May 1, 2014 at 2:57 PM, Chester Chen <> wrote:
> You might want to check the memory settings in sbt itself, which is a shell script that
runs a java command. I don't have a computer at hand, but if you vim or cat sbt/sbt, you
should see the memory settings; change them to fit your needs.
> You might also be able to override the settings via .sbtopts without changing the script,
but google it to be sure.
> Chester
> Sent from my iPhone
> On May 1, 2014, at 6:47 AM, Carter <> wrote:
>> Hi, I have a very simple spark program written in Scala:
>> /*** testApp.scala ***/
>> object testApp {
>>  def main(args: Array[String]) {
>>    println("Hello! World!")
>>  }
>> }
>> Then I use the following command to compile it:
>> $ sbt/sbt package
>> The compilation finished successfully and I got a JAR file.
>> But when I use this command to run it:
>> $ sbt/sbt run
>> it returned an error with JVM:
>> [info] Error occurred during initialization of VM
>> [info] Could not reserve enough space for object heap
>> [error] Error: Could not create the Java Virtual Machine.
>> [error] Error: A fatal exception has occurred. Program will exit.
>> java.lang.RuntimeException: Nonzero exit code returned from runner: 1
>> at scala.sys.package$.error(package.scala:27)
>> My machine has 2 GB of memory and runs Ubuntu 11.04. I also tried changing
>> the Java parameters (e.g., -Xmx, -Xms, -XX:MaxPermSize,
>> -XX:ReservedCodeCacheSize) in the file sbt/sbt, but none of the changes
>> seem to work. Can anyone help me out with this problem? Thank you very much.