spark-user mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Does spark works on multicore systems?
Date Sun, 09 Nov 2014 17:15:20 GMT
Try adding the following entry to your conf/spark-defaults.conf file:

spark.cores.max 64

Thanks
Best Regards

On Sun, Nov 9, 2014 at 3:50 AM, Blind Faith <person.of.book@gmail.com>
wrote:

> I am a Spark newbie and I use Python (PySpark). I am trying to run a
> program on a 64-core system, but no matter what I do, it always uses one
> core. It doesn't matter whether I run it with "spark-submit --master
> local[64] run.sh" or call x.repartition(64) on an RDD in my code; the
> Spark program always uses only one core. Has anyone had success running
> Spark programs on multicore processors? Could someone provide a very
> simple example that properly runs on all cores of a multicore system?
>
