spark-user mailing list archives

From Kevin Burton <bur...@spinn3r.com>
Subject Re: spark ignoring all memory settings and defaulting to 512MB?
Date Thu, 01 Jan 2015 18:20:15 GMT
ok.. we need to get these centralized somewhere, as the documentation for
spark-env.sh sends people far, far off in the wrong direction.  Maybe
remove all the directives in that script in favor of a link to a page that
is more live and can be updated?

Kevin

On Thu, Jan 1, 2015 at 12:43 AM, Sean Owen <sowen@cloudera.com> wrote:

> You don't in general configure Spark with environment variables. They
> exist, but largely for backwards compatibility. Use arguments like
> --executor-memory on spark-submit, which are explained in the docs and the
> help message. It is also possible to set the system properties directly
> with -D if you need to. You don't have to change your app; executor
> memory does not have to be set this way, but it can be.
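> A minimal sketch of that route (class name, jar path, and memory sizes
> are placeholders, not from this thread):

```shell
# Per-job executor memory via spark-submit flags
spark-submit \
  --class com.example.MyApp \
  --executor-memory 4g \
  myapp.jar

# Equivalent, setting the configuration property directly
spark-submit \
  --conf spark.executor.memory=4g \
  --class com.example.MyApp \
  myapp.jar
```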
> On Jan 1, 2015 6:36 AM, "Kevin Burton" <burton@spinn3r.com> wrote:
>
>> This is really weird and I’m surprised no one has found this issue yet.
>>
>> I’ve spent about an hour or more trying to debug this :-(
>>
>> My spark install is ignoring ALL my memory settings.  And of course my
>> job is running out of memory.
>>
>> The default is 512MB so pretty darn small.
>>
>> The worker and master start up and both use 512M
>>
>> This alone is very weird and poor documentation IMO because:
>>
>>  "SPARK_WORKER_MEMORY, to set how much total memory workers have to give
>> executors (e.g. 1000m, 2g)"
>>
>> … so if it’s giving it to executors, AKA the memory executors run with,
>> then it should be SPARK_EXECUTOR_MEMORY…
>>
>> … and the worker daemon actually uses SPARK_DAEMON_MEMORY.
>>
>> but actually I’m right.  It IS SPARK_EXECUTOR_MEMORY… according to
>> bin/spark-class
>>
>> … but, that’s not actually being used :-(
>>
>> that setting is just flat out being ignored and it’s just using 512MB.
>> So all my jobs fail.
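>>
>> For context, the three variables at play in conf/spark-env.sh
>> (illustrative values only, not from this thread; spark-submit flags
>> take precedence):

```shell
# conf/spark-env.sh -- illustrative values, not from this thread
SPARK_DAEMON_MEMORY=1g    # heap for the standalone master/worker daemons
SPARK_WORKER_MEMORY=8g    # total memory a worker can hand out to executors
SPARK_EXECUTOR_MEMORY=4g  # legacy per-executor default read by spark-class
```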
>>
>> … and I added an ‘echo’ to the spark-class script so I could trace what
>> the daemons are actually being run with, but spark-class wasn’t being
>> called at all and nothing is logged for the coarse-grained executor.  I
>> guess it’s just inheriting the JVM opts from its parent and Java is
>> launching the process directly?
>>
>> This is a nightmare :(
>>
>> --
>>
>> Founder/CEO Spinn3r.com
>> Location: *San Francisco, CA*
>> blog: http://burtonator.wordpress.com
>> … or check out my Google+ profile
>> <https://plus.google.com/102718274791889610666/posts>
>> <http://spinn3r.com>
>>
>>


