spark-user mailing list archives

From Akshat Aranya <aara...@gmail.com>
Subject Re: Relation between worker memory and executor memory in standalone mode
Date Wed, 01 Oct 2014 18:33:36 GMT
On Wed, Oct 1, 2014 at 11:00 AM, Boromir Widas <vcsubsvc@gmail.com> wrote:

> 1. Worker memory caps executor memory.
> 2. With the default config, every job gets one executor per worker. This
> executor runs with all cores available to the worker.

By "job", do you mean one SparkContext, or one stage execution within a
program?  Does that also mean that two concurrent jobs will get one
executor each at the same time?
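
For concreteness, here is how I'm reading point 1 as a config sketch; the
master URL, app name, and sizes below are made up:

// Worker side (conf/spark-env.sh): the total the worker may hand out, e.g.
//   SPARK_WORKER_MEMORY=8g
// Driver side: the per-executor request, which has to fit under that cap.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("spark://master:7077")     // hypothetical master URL
  .setAppName("MemoryCapSketch")        // hypothetical app name
  .set("spark.executor.memory", "4g")   // must fit within the worker's 8g

val sc = new SparkContext(conf)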

> On Wed, Oct 1, 2014 at 11:04 AM, Akshat Aranya <aaranya@gmail.com> wrote:
>
>> Hi,
>>
>> What's the relationship between Spark worker and executor memory settings
>> in standalone mode?  Do they work independently or does the worker cap
>> executor memory?
>>
>> Also, is the number of concurrent executors per worker capped by the
>> number of CPU cores configured for the worker?
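
And on the cores question, here is the mental model I'm working from, in
config form; the values, master URL, and app name are made up, and I may
well be wrong about the semantics:

// Worker side (conf/spark-env.sh), hypothetical values:
//   SPARK_WORKER_CORES=8     # cores this worker offers to executors
//   SPARK_WORKER_MEMORY=16g  # total memory this worker offers to executors
//
// Driver side: as I understand it, an app grabs all available cores unless
// spark.cores.max caps it, so capping is what would let two apps run at once.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("spark://master:7077")     // hypothetical master URL
  .setAppName("AppA")                   // hypothetical app name
  .set("spark.cores.max", "4")          // take at most 4 cores cluster-wide
  .set("spark.executor.memory", "4g")   // per-executor request, capped by worker

val sc = new SparkContext(conf)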
