spark-dev mailing list archives

From CodingCat <...@git.apache.org>
Subject [GitHub] incubator-spark pull request: [SPARK-1090] improvement on spark_sh...
Date Fri, 14 Feb 2014 22:49:47 GMT
Github user CodingCat commented on the pull request:

    https://github.com/apache/incubator-spark/pull/599#discussion_r9767385
  
    Hi @aarondav, I'm just a bit confused. From the code:
    
    private[spark] val executorMemory = conf.getOption("spark.executor.memory")
        .orElse(Option(System.getenv("SPARK_MEM")))
        .map(Utils.memoryStringToMb)
        .getOrElse(512)
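    
    (So, if I read it right, the precedence is spark.executor.memory first, then SPARK_MEM, then the 512 MB default.)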
    
    SPARK_MEM sets the memory used by the executors, which this PR already handles.
    
    What you are proposing is to control the memory used by the driver. I think that is hard
    to achieve, since users may run spark-shell on a machine outside of Spark's control.
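    
    Just to illustrate the point (everything below is a hypothetical sketch, not part of this PR,
    and the property name spark.driver.memory.mb is made up): by the time any conf lookup runs
    inside the driver, its heap has already been fixed by -Xmx when the JVM launched (e.g. by the
    spark-shell script), so a conf setting could only be checked against it, not applied to it.
    
    // Hypothetical, self-contained sketch; not Spark code.
    object DriverMemoryCheck {
      def main(args: Array[String]): Unit = {
        // Pretend this came from a conf lookup like the executorMemory chain above.
        val requestedMb = sys.props.get("spark.driver.memory.mb").map(_.toInt).getOrElse(512)
        // The real heap was fixed by -Xmx at JVM startup and cannot be raised from here.
        val actualMb = (Runtime.getRuntime.maxMemory() / (1024 * 1024)).toInt
        if (actualMb < requestedMb) {
          println(s"Driver heap is $actualMb MB, below the requested $requestedMb MB; " +
            "raising it would require relaunching the JVM with a larger -Xmx.")
        }
      }
    }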

