spark-user mailing list archives

From raman gugnani <ramangugnani....@gmail.com>
Subject Re: Monitor Spark Applications
Date Fri, 13 Sep 2019 05:45:54 GMT
Hi Alex,

Thanks, will check this out.

Can this be done directly, since Spark also exposes its metrics over JMX? My
one doubt here is how to assign fixed JMX ports to the driver and executors.

@Alex,
Is there any difference between fetching the data via JMX and using the banzaicloud jar?
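For reference, fixed JMX ports can usually be assigned through the JVM's standard com.sun.management.jmxremote system properties in Spark's extra Java options. A hedged sketch (port numbers are placeholders; not tested on HDP):

```shell
# Assign a fixed JMX port to the driver (safe: there is only one driver JVM).
# Caveat: a fixed executor port only works with at most one executor per host;
# a second executor on the same host would fail to bind the port.
spark-submit \
  --master yarn --deploy-mode cluster \
  --conf "spark.driver.extraJavaOptions=-Dcom.sun.management.jmxremote \
    -Dcom.sun.management.jmxremote.port=9998 \
    -Dcom.sun.management.jmxremote.authenticate=false \
    -Dcom.sun.management.jmxremote.ssl=false" \
  --conf "spark.executor.extraJavaOptions=-Dcom.sun.management.jmxremote \
    -Dcom.sun.management.jmxremote.port=9999 \
    -Dcom.sun.management.jmxremote.authenticate=false \
    -Dcom.sun.management.jmxremote.ssl=false" \
  your-app.jar
```

An alternative that sidesteps the port question is attaching the Prometheus jmx_exporter as a -javaagent in the same extraJavaOptions, which serves metrics over a local HTTP port instead of remote JMX.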


On Fri, 13 Sep 2019 at 10:47, Alex Landa <landa.alex86@gmail.com> wrote:

> Hi,
> We are starting to use https://github.com/banzaicloud/spark-metrics .
> Keep in mind that their solution targets Spark on K8s; to make it work for
> Spark on YARN you have to copy the dependencies of spark-metrics into the
> Spark jars folder on all the Spark machines (took me a while to figure out).
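For anyone following the thread, that sink is wired up through Spark's standard metrics.properties mechanism. A minimal sketch, assuming the sink class and Pushgateway property names from the banzaicloud README (the host/port is a placeholder; verify the keys against the version you deploy):

```properties
# conf/metrics.properties - route metrics from all instances to the Prometheus sink
*.sink.prometheus.class=com.banzaicloud.spark.metrics.sink.PrometheusSink
# Address of your Prometheus Pushgateway (placeholder)
*.sink.prometheus.pushgateway-address-protocol=http
*.sink.prometheus.pushgateway-address=pushgateway.example.com:9091
# Also expose per-JVM metrics from the driver and executors
driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource
```

On YARN the file can be pointed at with --conf spark.metrics.conf=metrics.properties, or placed in $SPARK_HOME/conf on each machine.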
>
> Thanks,
> Alex
>
> On Fri, Sep 13, 2019 at 7:58 AM raman gugnani <ramangugnani.007@gmail.com>
> wrote:
>
>> Hi Team,
>>
>> I am new to Spark. I am using Spark on Hortonworks Data Platform with
>> Amazon EC2 machines, running in cluster mode on YARN.
>>
>> I need to monitor individual JVMs and other Spark metrics with
>> *prometheus*.
>>
>> Can anyone suggest a solution for this?
>>
>> --
>> Raman Gugnani
>>
>

-- 
Raman Gugnani
