spark-user mailing list archives

From Ted Yu <>
Subject Re: SPARK_WORKER_INSTANCES deprecated
Date Mon, 01 Feb 2016 22:44:44 GMT
As the message (from SparkConf.scala) shows, you shouldn't use SPARK_WORKER_INSTANCES anymore; use one of the two replacements it lists instead.


On Mon, Feb 1, 2016 at 2:19 PM, Lin, Hao <> wrote:

> Can I still use SPARK_WORKER_INSTANCES in conf/? The following is what
> I've got after trying to set this parameter and run spark-shell:
> SPARK_WORKER_INSTANCES was detected (set to '32').
> This is deprecated in Spark 1.0+.
> Please instead use:
> - ./spark-submit with --num-executors to specify the number of executors
> - spark.executor.instances to configure the number of instances in the
> spark config.
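For concreteness, the two replacements named in the deprecation warning can be sketched as below. This is a minimal example, assuming a YARN deployment (--num-executors is YARN-specific); the application jar name and master URL are placeholders, not taken from the thread:

```shell
# Either form replaces the deprecated SPARK_WORKER_INSTANCES=32 env var.

# Option 1: pass --num-executors to spark-submit
./bin/spark-submit \
  --master yarn \
  --num-executors 32 \
  my-app.jar

# Option 2: set spark.executor.instances in the Spark config
./bin/spark-submit \
  --master yarn \
  --conf spark.executor.instances=32 \
  my-app.jar
```

Option 2 can also go in conf/spark-defaults.conf (`spark.executor.instances 32`) so it applies to every submission without a command-line flag.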
