spark-user mailing list archives

From Xiang Huo <>
Subject How to set how many CPUs can be used by Spark
Date Mon, 23 Sep 2013 04:48:56 GMT
Hi all,

I am trying to run a Spark program on a server. It is not a cluster, just
a single server. I want to configure my Spark program to use at most 20 CPUs,
because this machine is also shared with other users.

I know I can set local[K] as the value of the Master URL to limit how many
worker threads the program uses. But after I run my program, only one or
two CPUs are actually used. And the program takes a long time to run when
only one or two CPUs are used.
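For reference, a minimal sketch of the local[K] setup described above, assuming the Spark 0.8-era API where the master URL is passed directly to the SparkContext constructor (the object name and job are made up for illustration):

```scala
import org.apache.spark.SparkContext

// Sketch only: "local[20]" asks Spark to run with up to 20 local worker
// threads. Note this caps task parallelism, it does not guarantee 20 busy
// CPUs: if an RDD has fewer partitions than threads, fewer CPUs are used.
object CpuLimitExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local[20]", "CpuLimitExample")

    // A toy job with 20 partitions, enough to occupy all 20 threads.
    val sum = sc.parallelize(1 to 1000000, 20).map(_.toLong).reduce(_ + _)
    println(sum)

    sc.stop()
  }
}
```

One thing worth checking under this assumption: the number of partitions in the RDD being processed, since a job with only one or two partitions will keep only one or two threads busy regardless of the K in local[K].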

Has anyone encountered a similar situation, or does anyone have any suggestions?


Xiang Huo
Department of Computer Science
University of Illinois at Chicago(UIC)
Chicago, Illinois
