It's getting strange now. If I run from the IDE, my executor memory is always set to 6.7G, no matter what value I set in code. I have checked my environment variables, and there is no value of 6.7 or 12.5 anywhere.
Hi Xi Shen, you could set spark.executor.memory in the code itself: `new SparkConf().set("spark.executor.memory", ...)`
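For reference, here is a minimal sketch of setting executor memory programmatically (the app name and master are placeholders; the setting must be applied before the SparkContext is created):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// spark.executor.memory must be set on the conf *before* the
// SparkContext is constructed; changing it afterwards has no effect.
val conf = new SparkConf()
  .setAppName("example-app")          // placeholder app name
  .setMaster("local[*]")              // local mode, as in this thread
  .set("spark.executor.memory", "2g")

val sc = new SparkContext(conf)
```

Note that in local mode the executor runs inside the driver JVM, so the usable heap is governed by the driver's `-Xmx` (or `spark.driver.memory`) rather than `spark.executor.memory`, which may explain seeing values you never configured.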
Or you can pass `--executor-memory 2g` (or `--conf spark.executor.memory=2g`) while submitting the jar.
By default spark.executor.memory is set to 512m. I'm assuming that since you are submitting the job using spark-submit in local mode, it is not able to override the value. Can you try it without spark-submit, as a standalone project?
On Mon, Mar 16, 2015 at 1:52 PM, Xi Shen <firstname.lastname@example.org> wrote:
I set it in code, not by configuration. I submit my jar file to local mode. I am working in my development environment.
On Mon, 16 Mar 2015 18:28 Akhil Das <email@example.com> wrote:
How are you setting it? and how are you submitting the job?
On Mon, Mar 16, 2015 at 12:52 PM, Xi Shen <firstname.lastname@example.org> wrote:
I have set spark.executor.memory to 2048m, and on the UI "Environment" page I can see this value has been set correctly. But on the "Executors" page, I see there is only 1 executor and its memory is 265.4MB, which is a very strange value. Why not 256MB, or just what I set?
What am I missing here?