spark-user mailing list archives

From Sun Rui <sunrise_...@163.com>
Subject Re: how to run local[k] threads on a single core
Date Thu, 04 Aug 2016 06:51:19 GMT
I don’t think it is possible, as Spark does not support thread-to-CPU affinity.
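What local[k] does give you is k concurrent task threads inside a single JVM; Spark just never binds those threads to particular cores. The closest workaround I’m aware of (a sketch, assuming Linux and that pinning the whole driver JVM is acceptable; the class and app names below are illustrative) is to run with local[5] and launch the JVM under an OS affinity tool such as taskset:

    // Minimal sketch using the Spark Scala API; names are illustrative.
    import org.apache.spark.{SparkConf, SparkContext}

    object FiveTasksOneCore {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setMaster("local[5]")          // 5 task slots = 5 concurrent task threads in this JVM
          .setAppName("FiveTasksOneCore")
        val sc = new SparkContext(conf)

        // 5 partitions -> 5 tasks, all eligible to run at the same time
        val doubled = sc.parallelize(1 to 100, 5).map(_ * 2)
        println(s"count = ${doubled.count()}")

        sc.stop()
      }
    }

You would then launch it with something like taskset -c 0 spark-submit --class FiveTasksOneCore --master "local[5]" app.jar (Linux only), so the kernel schedules all of the JVM’s threads, including the 5 task threads, on core 0. Spark itself remains unaware of the pinning.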
> On Aug 4, 2016, at 14:27, sujeet jog <sujeet.jog@gmail.com> wrote:
> 
> Is there a way we can run multiple tasks concurrently on a single core in local mode?
> 
> For example: I have 5 partitions ~ 5 tasks, and only a single core; I want these tasks
> to run concurrently, and to specify that they run on a single core.
> 
> The machine itself has, say, 4 cores, but I want to utilize only 1 of them.
> 
> Is it possible?
> 
> Thanks, 
> Sujeet
> 



---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

