spark-user mailing list archives

From Sean Owen <so...@cloudera.com>
Subject Re: configuring spark.yarn.driver.memoryOverhead on Spark 1.2.0
Date Mon, 12 Jan 2015 16:23:35 GMT
Isn't the syntax "--conf property=value"?

http://spark.apache.org/docs/latest/configuration.html

Yes, I think setting it after the driver is running is of course too late.
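
E.g. something like this at submit time should do it (the class and
jar names below are just placeholders for your app):

  spark-submit \
    --master yarn-cluster \
    --conf spark.yarn.driver.memoryOverhead=1024 \
    --class com.example.YourApp \
    your-app.jar

Since YARN sizes the AM container before any of your driver code runs,
the overhead has to be supplied on the command line (or in
spark-defaults.conf) rather than via sparkConf.set().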

On Mon, Jan 12, 2015 at 4:01 PM, David McWhorter <mcwhorter@ccri.com> wrote:
> Hi all,
>
> I'm trying to figure out how to set the option
> "spark.yarn.driver.memoryOverhead" on Spark 1.2.0.  I found this helpful
> overview,
> http://apache-spark-user-list.1001560.n3.nabble.com/Stable-spark-streaming-app-td14105.html#a14476,
> which suggests launching with --spark.yarn.driver.memoryOverhead 1024 added
> to the spark-submit command.  However, when I do that I get this error:
> Error: Unrecognized option '--spark.yarn.driver.memoryOverhead'.
> Run with --help for usage help or --verbose for debug output
> I have also tried calling sparkConf.set("spark.yarn.driver.memoryOverhead",
> "1024") on my SparkConf object, but I still see "Will allocate AM
> container, with XXXX MB memory including 384 MB overhead" when launching.
> I'm running in yarn-cluster mode.
>
> Any help or tips would be appreciated.
>
> Thanks,
> David
>
> --
>
> David McWhorter
> Software Engineer
> Commonwealth Computer Research, Inc.
> 1422 Sachem Place, Unit #1
> Charlottesville, VA 22901
> mcwhorter@ccri.com | 434.299.0090x204

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

