spark-user mailing list archives

From Connor Zanin <cnnr...@udel.edu>
Subject Re: How does the # of tasks affect # of threads?
Date Sat, 01 Aug 2015 21:38:20 GMT
1. I believe that the default memory per executor is 512m (from the
documentation).
2. I have increased the memory used by Spark on the workers in my launch
script when submitting the job (--executor-memory 124g); see the sketch
below.
3. The job completes successfully; it is the "road bumps" in the middle
that I am concerned with.
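
For reference, the equivalent setting in code would be something like this
(a sketch; the app name and master URL are placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // Equivalent of passing --executor-memory 124g to spark-submit.
    // The app name and master URL below are placeholders.
    val conf = new SparkConf()
      .setAppName("WordCount")
      .setMaster("spark://master:7077")
      .set("spark.executor.memory", "124g")
    val sc = new SparkContext(conf)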

I would like insight into how Spark handles thread creation.
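
In the meantime, I may log the live thread count from inside a task to see
how many threads each executor is actually running, roughly like this (a
sketch; "rdd" is a placeholder for the word count RDD):

    import java.lang.management.ManagementFactory

    // Runs once per partition on the executors and prints the number of
    // live threads in that executor's JVM at that moment.
    rdd.foreachPartition { _ =>
      val n = ManagementFactory.getThreadMXBean.getThreadCount
      println("live threads in this executor JVM: " + n)
    }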

On Sat, Aug 1, 2015 at 5:33 PM, Fabrice Sznajderman <fabszn@gmail.com>
wrote:

> Hello,
>
> I am not an expert with Spark, but the error thrown by Spark seems to
> indicate that there is not enough memory to launch the job. By default,
> Spark allocates 1GB of memory; maybe you should increase it?
>
> Best regards
>
> Fabrice
>
> On Sat, Aug 1, 2015 at 10:51 PM, Connor Zanin <cnnrznn@udel.edu> wrote:
>
>> Hello,
>>
>> I am having an issue when I run a word count job. I have included the
>> source and log files for reference. The job finishes successfully, but
>> about halfway through I get a java.lang.OutOfMemoryError (could not
>> create native thread), and this leads to the loss of an executor. After
>> some searching, I found out this was a problem with the environment: the
>> OS limits how many threads I can spawn.
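>>
>> For context, the job is essentially the standard word count, along these
>> lines (a sketch, not the attached source; the paths and numPartitions
>> are placeholders):
>>
>>     // Classic word count; the only knob I turn is numPartitions.
>>     val counts = sc.textFile("hdfs:///input/corpus.txt", numPartitions)
>>       .flatMap(_.split("\\s+"))
>>       .map(word => (word, 1))
>>       .reduceByKey(_ + _)
>>     counts.saveAsTextFile("hdfs:///output/counts")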
>>
>> However, I had thought that Spark only maintains a thread pool equal in
>> size to the number of cores available across the nodes (by default) and
>> schedules tasks dynamically as threads become available. The only Spark
>> parameter I change is the number of partitions in my RDD.
>>
>> My question is, how is Spark deciding how many threads to spawn and when?
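>>
>> (For what it's worth, my understanding was that the per-executor
>> concurrency is bounded by settings like these; a sketch with invented
>> values:)
>>
>>     import org.apache.spark.SparkConf
>>
>>     // Each executor runs at most
>>     // spark.executor.cores / spark.task.cpus tasks concurrently.
>>     val conf = new SparkConf()
>>       .set("spark.executor.cores", "8") // task slots per executor
>>       .set("spark.task.cpus", "1")      // cores claimed by each task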
>>
>> --
>> Regards,
>>
>> Connor Zanin
>> Computer Science
>> University of Delaware


-- 
Regards,

Connor Zanin
Computer Science
University of Delaware
