spark-user mailing list archives

From Yong Zhang <java8...@hotmail.com>
Subject Re: Spark driver CPU usage
Date Wed, 01 Mar 2017 13:24:24 GMT
spark.task.cpus won't control the CPU usage of the driver.


You should check what the CPUs are actually doing on your driver side. But I just want to make
sure you know that full CPU usage on a 4-core Linux box reads as 400%. So 100% really just
means one core is busy.
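For what it's worth, a quick way to sanity-check this on the driver box (a sketch assuming a Linux machine; in client mode the driver shows up as a SparkSubmit JVM process):

```shell
# Full capacity in top/ps terms is 100% per logical core,
# so a 4-core box tops out at 400%.
CORES=$(nproc)
echo "Full capacity: $((CORES * 100))%"

# List the top CPU consumers; in client mode, look for the
# SparkSubmit process that hosts the driver.
ps -eo pcpu,pid,comm --sort=-pcpu | head -n 5
```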


The driver does maintain the application web UI and tracks all kinds of task statistics. So even
for a simple word count program, if the source is huge and generates thousands of tasks, the
driver will be busy.
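If that bookkeeping is the bottleneck, the amount of job/stage/task history the driver retains for the web UI can be capped. A sketch (these are standard Spark properties, but the values and the class/jar names below are placeholders, not recommendations):

```shell
# Cap how much history the driver keeps for the web UI.
spark-submit \
  --conf spark.ui.retainedJobs=100 \
  --conf spark.ui.retainedStages=100 \
  --conf spark.ui.retainedTasks=10000 \
  --class your.app.WordCount \
  your-app.jar
```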


Yong


________________________________
From: Phadnis, Varun <phadnis@sky.optymyze.com>
Sent: Wednesday, March 1, 2017 7:57 AM
To: user@spark.apache.org
Subject: RE: Spark driver CPU usage

Does that configuration parameter affect the CPU usage of the driver? If it does, we have left
that property at its default value of "1", yet we see the same behaviour as before.

-----Original Message-----
From: Rohit Verma [mailto:rohit.verma@rokittech.com]
Sent: 01 March 2017 06:08
To: Phadnis, Varun <phadnis@sky.optymyze.com>
Cc: user@spark.apache.org
Subject: Re: Spark driver CPU usage

Use the conf spark.task.cpus to control the number of CPUs allocated per task.
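For reference, it is set like any other conf when submitting (a sketch; the class and jar names are placeholders). Note that it reserves cores per task on the executors, not on the driver:

```shell
# spark.task.cpus reserves N cores per task when scheduling on executors.
# The default is 1. Class and jar below are placeholders.
spark-submit \
  --conf spark.task.cpus=1 \
  --class your.app.Main \
  your-app.jar
```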

On Mar 1, 2017, at 5:41 PM, Phadnis, Varun <phadnis@sky.optymyze.com> wrote:
>
> Hello,
>
> Is there a way to control CPU usage for driver when running applications in client mode?
>
> Currently we are observing that the driver occupies all the cores. Launching just 3 driver
> instances of the WordCount sample application concurrently on the same machine brings its
> 4-core CPU to 100% usage. Is this expected behaviour?
>
> Thanks.


---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

