I was wrong here.

I am using a Spark standalone cluster, not YARN or Mesos. Is it possible to track Spark execution memory?
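
One route I am considering, which should also work on standalone since it only talks to the driver's web UI (the host, port, and application id below are placeholders for my setup): the monitoring REST API reports memoryUsed and maxMemory per executor.

import scala.io.Source

object ExecutorMemoryPoll {
  def main(args: Array[String]): Unit = {
    // "driver-host", port 4040, and the application id are placeholders;
    // the endpoint is served by the driver's web UI.
    val appId = "app-20191021174200-0001"
    val url = s"http://driver-host:4040/api/v1/applications/$appId/executors"
    // Each executor entry in the returned JSON includes memoryUsed and maxMemory.
    println(Source.fromURL(url).mkString)
  }
}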

On Mon, Oct 21, 2019 at 5:42 PM Sriram Ganesh <srignsh22@gmail.com> wrote:
I looked into this, and I found it is possible like this:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/status/AppStatusListener.scala#L229

Line 230 is the one that covers executors.

I just want to cross-verify: is that right?
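
For reference, this is the kind of minimal listener I have in mind (a sketch only, not AppStatusListener itself; it relies on the standard SparkListener.onTaskEnd callback and TaskMetrics.peakExecutionMemory, which reports execution memory per task in bytes):

import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Logs per-task peak execution memory together with the executor it ran on.
class ExecutionMemoryListener extends SparkListener {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    val metrics = taskEnd.taskMetrics
    if (metrics != null) {
      println(s"executor=${taskEnd.taskInfo.executorId} " +
        s"task=${taskEnd.taskInfo.taskId} " +
        s"peakExecutionMemory=${metrics.peakExecutionMemory} bytes")
    }
  }
}

It can be registered with spark.sparkContext.addSparkListener(new ExecutionMemoryListener) or via --conf spark.extraListeners=ExecutionMemoryListener (assuming the class is on the driver classpath).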



On Mon, 21 Oct 2019, 17:24 Alonso Isidoro Roman, <alonsoir@gmail.com> wrote:
Take a look at this thread.

On Mon, 21 Oct 2019 at 13:45, Sriram Ganesh (<srignsh22@gmail.com>) wrote:
Hi,

I want to monitor how much memory each executor and task used for a given job. Is there any direct method available that can be used to track this metric?

--
Sriram G
Tech



--
Sriram G
Tech