spark-dev mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: Resource usage of a spark application
Date Sun, 17 May 2015 15:10:34 GMT
You can either pull the high-level information from your resource manager,
or, if you want more control or process-specific information, you can write a
script that pulls the resource usage from the OS. Something like this
<http://www.itsprite.com/linux3-shell-scripts-to-monitor-the-process-resource-in-linux/>
will help.
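As a minimal sketch of that second approach (assuming a POSIX system with `ps`; the backgrounded `sleep` stands in for whichever Spark JVM process you want to watch, and the 1-second interval is arbitrary):

```shell
#!/bin/sh
# Poll CPU%, MEM%, and resident set size (KB) of a process until it exits.
sleep 3 &            # placeholder for the Spark driver/executor PID
PID=$!

while kill -0 "$PID" 2>/dev/null; do
    # the '=' after each keyword suppresses the ps header line
    ps -o %cpu=,%mem=,rss= -p "$PID"
    sleep 1
done
echo "process $PID exited"
```

In practice you would substitute the PID of the driver or executor JVM and log the samples somewhere your users can see them.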

Thanks
Best Regards

On Sun, May 17, 2015 at 6:18 PM, Peter Prettenhofer <
peter.prettenhofer@gmail.com> wrote:

> Hi all,
>
> I'm looking for a way to measure the current memory / CPU usage of a Spark
> application, to give users feedback on how many resources are actually being
> used.
> It seems that the metrics system provides this information to some extent:
> it logs metrics at the application level (number of cores granted) and at
> the JVM level (memory usage).
> Is this the recommended way to gather this kind of information? If so, how
> do I best map a Spark application to the corresponding JVM processes?
>
> If not, should I instead request this information from the resource manager
> (e.g. Mesos/YARN)?
>
> thanks,
>  Peter
>
> --
> Peter Prettenhofer
>
