spark-user mailing list archives

From: David McWhorter <mcwhor...@ccri.com>
Subject: configuring spark.yarn.driver.memoryOverhead on Spark 1.2.0
Date: Mon, 12 Jan 2015 16:01:26 GMT
Hi all,

I'm trying to figure out how to set the option
"spark.yarn.driver.memoryOverhead" on Spark 1.2.0.  I found this helpful
overview,
http://apache-spark-user-list.1001560.n3.nabble.com/Stable-spark-streaming-app-td14105.html#a14476,
which suggests launching with --spark.yarn.driver.memoryOverhead 1024
added to the spark-submit command.  However, when I do that I get this error:
Error: Unrecognized option '--spark.yarn.driver.memoryOverhead'.
Run with --help for usage help or --verbose for debug output
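
For concreteness, here's a sketch of the command I'm running (the class
name and jar path are placeholders for my actual application):

    spark-submit \
      --master yarn-cluster \
      --class com.example.MyApp \
      --spark.yarn.driver.memoryOverhead 1024 \
      my-app.jar

I haven't yet tried spark-submit's generic --conf flag
(e.g. --conf spark.yarn.driver.memoryOverhead=1024); is that the
intended mechanism for passing this property?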
I have also tried calling
sparkConf.set("spark.yarn.driver.memoryOverhead", "1024") on my SparkConf
object, but I still see "Will allocate AM container, with
XXXX MB memory including 384 MB overhead" in the logs when launching.
I'm running in yarn-cluster mode.
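
For reference, the relevant part of my driver code looks roughly like
this (the app name is a placeholder):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("MyApp")
      // Seems to have no effect on the AM container size in yarn-cluster mode
      .set("spark.yarn.driver.memoryOverhead", "1024")
    val sc = new SparkContext(conf)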

Any help or tips would be appreciated.

Thanks,
David

-- 

David McWhorter
Software Engineer
Commonwealth Computer Research, Inc.
1422 Sachem Place, Unit #1
Charlottesville, VA 22901
mcwhorter@ccri.com | 434.299.0090x204

