spark-user mailing list archives

From Greg Hill <greg.h...@RACKSPACE.COM>
Subject Spark on YARN driver memory allocation bug?
Date Wed, 08 Oct 2014 19:12:51 GMT
So, I think this is a bug, but I wanted to get some feedback before I reported it as such.
With Spark 1.1.0 on YARN, if you specify a --driver-memory value higher than the memory
available on the client machine, Spark errors out because it fails to allocate enough memory.
This happens even in yarn-cluster mode.  Shouldn't it only allocate that memory on the YARN
node that is going to run the driver process, not on the local client machine?
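
For illustration, a submit command along these lines triggers it (the class name and jar are
just placeholders, and the 8g figure only matters in that it exceeds the memory free on the
client machine doing the submitting):

    spark-submit --master yarn-cluster \
        --driver-memory 8g \
        --class com.example.MyApp \
        my-app.jar

In yarn-cluster mode the driver runs inside the YARN application master on the cluster, so I
would expect only that container's request to reflect the 8g, not any allocation on the
submitting client.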

Greg

