spark-dev mailing list archives

From: abshkmodi <>
Subject: Read/write metrics for jobs which use S3
Date: Wed, 17 Jun 2015 06:00:12 GMT

I mostly use Amazon S3 for reading input data and writing output data for my
Spark jobs. I want to know the number of bytes read from and written to S3 by
my job.

In Hadoop there are FileSystemCounters for this; is there something similar
in Spark? If there is, could you please guide me on how to use it?
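
For reference, this is roughly how I get at those numbers in plain Hadoop
today: the FileSystemCounters are fed from per-scheme FileSystem.Statistics
objects, which can also be read directly. A minimal Scala sketch, assuming
hadoop-common is on the classpath and that the S3 connector registers its
statistics under the "s3n" or "s3a" scheme:

    import org.apache.hadoop.fs.FileSystem
    import scala.collection.JavaConverters._

    // FileSystem keeps one Statistics object per scheme; the S3 entry
    // accumulates bytes moved through the connector in this JVM.
    for (stats <- FileSystem.getAllStatistics.asScala) {
      println(s"${stats.getScheme}: read=${stats.getBytesRead} B, " +
        s"written=${stats.getBytesWritten} B")
    }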

I saw there are some read/write metrics in TaskMetrics.scala. Is there a way
to get these by specifying a DataReadMethod in TaskMetrics.scala?
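
In case it helps frame the question, here is the kind of thing I was hoping
to do: aggregate those TaskMetrics counters with a SparkListener. This is
only a sketch against the Spark 1.x API shape (where inputMetrics and
outputMetrics are Options); I have not verified this is the intended way to
consume them:

    import java.util.concurrent.atomic.AtomicLong
    import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

    // Sums bytes read/written across all finished tasks in the job.
    class ByteCountListener extends SparkListener {
      val bytesRead = new AtomicLong(0L)
      val bytesWritten = new AtomicLong(0L)

      override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
        val metrics = taskEnd.taskMetrics
        if (metrics != null) {
          metrics.inputMetrics.foreach(m => bytesRead.addAndGet(m.bytesRead))
          metrics.outputMetrics.foreach(m => bytesWritten.addAndGet(m.bytesWritten))
        }
      }
    }

    // Registered on the SparkContext before the job runs:
    //   sc.addSparkListener(new ByteCountListener)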
