spark-dev mailing list archives

From abshkmodi <abshkm...@gmail.com>
Subject Read/write metrics for jobs which use S3
Date Wed, 17 Jun 2015 06:00:12 GMT
I mostly use Amazon S3 for reading input data and writing output data in my
Spark jobs. I want to know the number of bytes my job reads from and writes to
S3.

In Hadoop there are FileSystemCounters for this; is there something similar
in Spark? If there is, can you please guide me on how to use it?

I saw that there are some read/write metrics in TaskMetrics.scala. Is there a
way to get these by specifying a DataReadMethod in TaskMetrics.scala?
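For reference, one way these task-level metrics might be aggregated is with a
SparkListener that sums input/output bytes as tasks finish. The sketch below
is only illustrative: ByteCountListener is a made-up name, and it assumes a
Spark API where taskMetrics.inputMetrics.bytesRead and
taskMetrics.outputMetrics.bytesWritten are available directly (some versions
wrap these in Option, and whether the S3 filesystem connector populates them
is a separate question).

    import java.util.concurrent.atomic.AtomicLong
    import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

    // Hypothetical listener that accumulates bytes read/written across all tasks.
    class ByteCountListener extends SparkListener {
      val bytesRead = new AtomicLong(0L)
      val bytesWritten = new AtomicLong(0L)

      override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
        val m = taskEnd.taskMetrics
        if (m != null) {
          bytesRead.addAndGet(m.inputMetrics.bytesRead)         // bytes this task read
          bytesWritten.addAndGet(m.outputMetrics.bytesWritten)  // bytes this task wrote
        }
      }
    }

    // Usage sketch:
    // val listener = new ByteCountListener
    // sc.addSparkListener(listener)
    // ... run the job ...
    // println(s"read=${listener.bytesRead.get}, written=${listener.bytesWritten.get}")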





