spark-user mailing list archives

From Andrew Or <and...@databricks.com>
Subject Re: Spark on YARN driver memory allocation bug?
Date Wed, 08 Oct 2014 20:25:20 GMT
Hi Greg,

It does seem like a bug. What is the particular exception message that you
see?

Andrew

2014-10-08 12:12 GMT-07:00 Greg Hill <greg.hill@rackspace.com>:

>  So, I think this is a bug, but I wanted to get some feedback before I
> reported it as such.  With Spark on YARN 1.1.0, if you specify a
> --driver-memory value higher than the memory available on the client
> machine, Spark errors out because it fails to allocate enough memory.  This
> happens even in yarn-cluster mode.  Shouldn't it only allocate that memory
> on the YARN node that is going to run the driver process, not on the local
> client machine?
>
>  Greg
>
>
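For reference, a minimal sketch of the kind of submission Greg describes (the memory value, class name, and jar path are illustrative, not taken from the thread):

```shell
# yarn-cluster mode: the driver is supposed to run inside a YARN
# container on the cluster, so --driver-memory would be expected to be
# checked against the cluster's container limits, not against the RAM
# of the machine running spark-submit.
spark-submit \
  --master yarn-cluster \
  --driver-memory 32g \
  --class com.example.MyApp \
  my-app.jar
```

The reported behavior is that a submission like this fails on a client machine with less than the requested driver memory, even though in yarn-cluster mode the driver JVM is launched on a YARN node rather than locally.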
