spark-user mailing list archives

From Xiao JIANG <jiangxia...@outlook.com>
Subject How to get total CPU consumption for Spark job
Date Fri, 07 Aug 2015 22:06:02 GMT
Hi all,
I was running some Hive and Spark jobs on a Hadoop cluster, and I want to see how Spark improves
not only the elapsed time but also the total CPU consumption.
For Hive, I can get the 'Total MapReduce CPU Time Spent' from the log when a job finishes,
but I couldn't find any CPU stats for Spark jobs in either the Spark log or the web UI. Is there
anywhere I can find the total CPU consumption for my Spark job?
Here is the version info: Spark 1.3.0, Scala 2.10.4, Java 1.7.0_67.
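
To make it concrete, here is a rough sketch of the kind of aggregation I have in mind, using a
SparkListener to sum per-task executorRunTime (the class name is my own invention; note that
executorRunTime is wall-clock task time, not CPU time, so this would only be an approximation):

import java.util.concurrent.atomic.AtomicLong
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Hypothetical listener (my own naming): sums executorRunTime across
// all finished tasks. executorRunTime is the wall-clock time a task
// spent running on an executor, so the total is an approximation of
// CPU consumption, not a true CPU-time counter.
class TotalTaskTimeListener extends SparkListener {
  val totalRunTimeMs = new AtomicLong(0L)

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // taskMetrics can be null for failed tasks, so guard against it
    Option(taskEnd.taskMetrics).foreach { m =>
      totalRunTimeMs.addAndGet(m.executorRunTime)
    }
  }
}

// Register the listener before running the job, then read the counter after:
//   val listener = new TotalTaskTimeListener
//   sc.addSparkListener(listener)
//   ... run the job ...
//   println("Total task run time (ms): " + listener.totalRunTimeMs.get)

But if Spark already reports an aggregate like Hive's counter somewhere, I'd much rather use that.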
Thanks!
Xiao