spark-user mailing list archives

From Akhil Das <ak...@hacked.work>
Subject Re: Spark not using all the cluster instances in AWS EMR
Date Sun, 19 Jun 2016 01:40:22 GMT
spark.executor.instances is the parameter you are looking for. Read more
here: http://spark.apache.org/docs/latest/running-on-yarn.html
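
For example, a minimal sketch in Scala (the app name and the executor
count/sizing below are placeholders, tune them to your EMR instance types):

    import org.apache.spark.{SparkConf, SparkContext}

    // Request three executors explicitly when running on YARN,
    // roughly one per worker node in a 3-node cluster.
    val conf = new SparkConf()
      .setAppName("MyApp")                   // placeholder name
      .set("spark.executor.instances", "3")  // one executor per node
      .set("spark.executor.cores", "4")      // tune to your instance type
      .set("spark.executor.memory", "4g")    // tune to your instance type
    val sc = new SparkContext(conf)

You can do the same at submit time with spark-submit --num-executors 3,
which on YARN is shorthand for spark.executor.instances.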

On Sun, Jun 19, 2016 at 2:17 AM, Natu Lauchande <nlauchande@gmail.com>
wrote:

> Hi,
>
> I am running some Spark loads and notice that they only use one of the
> machines (instead of the 3 available) in the cluster.
>
> Is there any parameter that can be set to force Spark to use the whole
> cluster?
>
> I am using AWS EMR with YARN.
>
>
> Thanks,
> Natu
>


-- 
Cheers!
