spark-user mailing list archives

From "Lin, Hao" <>
Date Mon, 01 Feb 2016 22:19:08 GMT
Can I still use SPARK_WORKER_INSTANCES in conf/? The following is what I got
after setting this parameter and running spark-shell:

SPARK_WORKER_INSTANCES was detected (set to '32').
This is deprecated in Spark 1.0+.

Please instead use:
- ./spark-submit with --num-executors to specify the number of executors
- spark.executor.instances to configure the number of instances in the spark config.
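Following the warning's own suggestions, the deprecated environment variable can be replaced in either of two ways; a minimal sketch (the application jar name and executor count here are placeholders, not from the original message):

```shell
# Option 1: pass the executor count on the command line.
./bin/spark-submit --num-executors 32 --class com.example.MyApp my-app.jar

# Option 2: set it in the Spark config instead (e.g. conf/spark-defaults.conf):
#   spark.executor.instances  32
# or equivalently on the command line:
./bin/spark-submit --conf spark.executor.instances=32 --class com.example.MyApp my-app.jar
```

Note that `--num-executors` and `spark.executor.instances` count executors per application, whereas SPARK_WORKER_INSTANCES counted worker daemons per machine in standalone mode, so the two are not a one-to-one substitution.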
