spark-user mailing list archives

From Chetan Khatri <>
Subject Spark Cluster over yarn cluster monitoring
Date Thu, 17 Oct 2019 14:11:54 GMT
Hi Users,

I submit *X* jobs to Yarn via Airflow as part of a workflow for customer
*Y*. I could potentially run the workflow for customer *Z* as well, but
first I need to check how many resources are available on the cluster
before the next customer's jobs start.

Could you please tell me the best way to handle this? Currently, I just
check that availableMB > 100 and then trigger the next Airflow DAG on Yarn:

GET http://rm-http-address:port/ws/v1/cluster/metrics
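For reference, a minimal sketch of that check in Python, using only the
standard library. The `/ws/v1/cluster/metrics` endpoint is the standard
YARN ResourceManager REST API; the RM host/port and the thresholds
(`min_available_mb`, `min_available_vcores`) are placeholders you would
tune for your cluster. The response nests its fields under a top-level
"clusterMetrics" key:

```python
import json
from urllib.request import urlopen

# Placeholder RM address; substitute your ResourceManager host and port.
RM_METRICS_URL = "http://rm-http-address:8088/ws/v1/cluster/metrics"


def has_capacity(metrics: dict,
                 min_available_mb: int = 100,
                 min_available_vcores: int = 1) -> bool:
    """Decide whether the cluster has room for the next customer's jobs.

    `metrics` is the parsed JSON body of GET /ws/v1/cluster/metrics,
    whose payload lives under the "clusterMetrics" key.
    """
    m = metrics["clusterMetrics"]
    return (m["availableMB"] >= min_available_mb
            and m["availableVirtualCores"] >= min_available_vcores)


def fetch_metrics(url: str = RM_METRICS_URL) -> dict:
    """Fetch and parse the ResourceManager cluster metrics."""
    with urlopen(url) as resp:
        return json.load(resp)
```

The gate before triggering the next DAG would then be
`has_capacity(fetch_metrics())`. Checking `availableVirtualCores`
alongside `availableMB` avoids launching jobs when memory is free but
cores are exhausted.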

