spark-user mailing list archives

From Marek Wiewiorka <marek.wiewio...@gmail.com>
Subject ---cores option in spark-shell
Date Tue, 03 Jun 2014 15:15:01 GMT
Hi All,
the Spark 1.0.0 documentation states that there is a "--cores" option
one can use to set the number of cores that spark-shell uses on the
cluster:

You can also pass an option --cores <numCores> to control the number of
cores that spark-shell uses on the cluster.

This option does not seem to work for me.
If I run the following command:
./spark-shell --cores 12
I keep getting an error:
bad option: '--cores'

Is there any other way of controlling the total number of cores used by
spark-shell?
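(Editor's note: a hedged sketch of the likely workaround. In Spark 1.0, spark-shell forwards its arguments to spark-submit, which in standalone mode exposes "--total-executor-cores" rather than the old "--cores" flag; alternatively the "spark.cores.max" property can be set in spark-defaults.conf. The cluster URL below is illustrative.)

```shell
# Standalone mode: cap the total cores the shell's application may use
# across the cluster (replace the master URL with your own).
./spark-shell --master spark://master-host:7077 --total-executor-cores 12

# Equivalent configuration-file approach: add to conf/spark-defaults.conf
#   spark.cores.max   12
```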

Thanks,
Marek
