spark-user mailing list archives

From Yu Wei <yu20...@hotmail.com>
Subject Re: A question about Spark Cluster vs Local Mode
Date Thu, 28 Jul 2016 03:39:16 GMT
If the cluster runs out of memory, it seems that the executor will be restarted by the cluster manager.
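On YARN, one common way to reduce such container kills is to leave extra off-heap headroom for each executor. The following is a sketch only: it assumes a YARN deployment and the Spark 1.x/2.x property name spark.yarn.executor.memoryOverhead, and the class and jar names are placeholders.

    # Sketch, assuming a YARN cluster; class and jar names are placeholders.
    # spark.yarn.executor.memoryOverhead (in MB) adds off-heap headroom on top
    # of --executor-memory, so the NodeManager is less likely to kill and
    # restart the executor container for exceeding its allocation.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --executor-memory 16g \
      --conf spark.yarn.executor.memoryOverhead=2048 \
      --class com.example.MyApp \
      my-app.jar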


Jared, (韦煜)
Software developer
Interested in open source software, big data, Linux

________________________________
From: Ascot Moss <ascot.moss@gmail.com>
Sent: Thursday, July 28, 2016 9:48:13 AM
To: user @spark
Subject: A question about Spark Cluster vs Local Mode

Hi,

If I submit the same job to Spark in cluster mode, does that mean it will run in the cluster's memory pool, and that it will fail if it exhausts the cluster's memory?


--driver-memory 64g \
--executor-memory 16g \
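
For comparison, a full cluster-mode submission built from the options above might look like the sketch below; the master URL, class name, jar, and executor count are placeholders, and --num-executors applies on YARN. In cluster mode the driver itself runs on a cluster node, so --driver-memory is allocated from the cluster manager's resources rather than from the submitting machine.

    # Sketch only: master, class, jar, and executor count are placeholders.
    # In cluster mode both the 64g driver and each 16g executor come out of
    # the cluster manager's memory pool.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --driver-memory 64g \
      --executor-memory 16g \
      --num-executors 8 \
      --class com.example.MyApp \
      my-app.jar

    # Local mode, by contrast, runs driver and executors in a single JVM on
    # the submitting machine, so only that machine's memory is used.
    spark-submit \
      --master "local[*]" \
      --driver-memory 64g \
      --class com.example.MyApp \
      my-app.jar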
Regards