spark-user mailing list archives

From "Phadnis, Varun" <phad...@sky.optymyze.com>
Subject RE: Spark driver CPU usage
Date Wed, 01 Mar 2017 12:57:00 GMT
Does that configuration parameter affect the CPU usage of the driver? If it does, we have left
that property at its default value of "1", yet we see the same behaviour as before.
 
-----Original Message-----
From: Rohit Verma [mailto:rohit.verma@rokittech.com] 
Sent: 01 March 2017 06:08
To: Phadnis, Varun <phadnis@sky.optymyze.com>
Cc: user@spark.apache.org
Subject: Re: Spark driver CPU usage

Use the conf spark.task.cpus to control the number of CPUs used per task.
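For reference, a minimal sketch of where that property is set. Note that spark.task.cpus reserves CPUs per task on the executors; it does not cap the driver process itself (which is the behaviour asked about below). The master URL, jar path, and input file are illustrative, not taken from the thread:

```shell
# Set spark.task.cpus at submit time (default is 1).
# This controls how many cores each *executor task* reserves;
# it does not limit the driver JVM.
./bin/spark-submit \
  --master local[4] \
  --conf spark.task.cpus=1 \
  --class org.apache.spark.examples.JavaWordCount \
  examples/jars/spark-examples.jar input.txt
```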

On Mar 1, 2017, at 5:41 PM, Phadnis, Varun <phadnis@sky.optymyze.com> wrote:
> 
> Hello,
>  
> Is there a way to control CPU usage for driver when running applications in client mode?
>  
> Currently we are observing that the driver occupies all the cores. Launching just 3 driver
> instances of the WordCount sample application concurrently on the same machine brings the
> usage of its 4-core CPU to 100%. Is this expected behaviour?
>  
> Thanks.
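One hedged option for the situation described above: in client mode the driver is an ordinary JVM process on the submitting machine, so OS-level CPU affinity can cap it even though Spark's own spark.driver.cores setting only applies in cluster mode. This is a Linux-specific sketch, not a Spark feature; the core IDs, master URL, and jar path are illustrative:

```shell
# Pin the driver JVM to cores 0 and 1 with taskset (Linux only).
# Everything launched by this command, including the client-mode
# driver, inherits the CPU affinity mask.
taskset -c 0,1 ./bin/spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode client \
  --class org.apache.spark.examples.JavaWordCount \
  examples/jars/spark-examples.jar input.txt
```

cgroups would be an alternative where a hard CPU-time quota is preferred over core pinning.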


---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

