spark-user mailing list archives

From Daniel Darabos <daniel.dara...@lynxanalytics.com>
Subject Re: how to run local[k] threads on a single core
Date Thu, 04 Aug 2016 14:41:28 GMT
You could run the application in a Docker container constrained to one CPU
with --cpuset-cpus (
https://docs.docker.com/engine/reference/run/#/cpuset-constraint).
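On a Linux host you can also pin the driver JVM (and hence all of its local[k] task threads) to one CPU with taskset, without Docker. A minimal sketch; the spark-submit invocation and my-app.jar are placeholders for your own application:

```shell
# Pin the whole Spark process to CPU 0; with --master local[5],
# all 5 task threads still run concurrently, time-sliced on that one core.
taskset -c 0 spark-submit --master "local[5]" my-app.jar

# Equivalent with Docker: constrain the container to CPU 0.
# "my-spark-image" is a placeholder for an image with Spark installed.
docker run --cpuset-cpus="0" my-spark-image \
  spark-submit --master "local[5]" my-app.jar
```

Either way the OS scheduler, not Spark, enforces the affinity, which is why this works even though Spark itself has no affinity setting.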

On Thu, Aug 4, 2016 at 8:51 AM, Sun Rui <sunrise_win@163.com> wrote:

> I don’t think it is possible, as Spark does not support thread-to-CPU affinity.
> > On Aug 4, 2016, at 14:27, sujeet jog <sujeet.jog@gmail.com> wrote:
> >
> > Is there a way we can run multiple tasks concurrently on a single core
> > in local mode?
> >
> > For example: I have 5 partitions ~ 5 tasks and only a single core; I
> > want these tasks to run concurrently, and to specify that they run on
> > a single core.
> >
> > The machine itself has, say, 4 cores, but I want to utilize only 1 of
> > them.
> >
> > Is it possible ?
> >
> > Thanks,
> > Sujeet
> >
>
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>
>
