spark-dev mailing list archives

From Jacek Laskowski <>
Subject Repeated FileSourceScanExec.metrics from ColumnarBatchScan.metrics
Date Tue, 22 May 2018 17:17:05 GMT

I'm wondering why the metrics are repeated in FileSourceScanExec.metrics
[1], since FileSourceScanExec is a ColumnarBatchScan [2] and so already
inherits the two metrics numOutputRows and scanTime from
ColumnarBatchScan.metrics [3].

Shouldn't FileSourceScanExec.metrics then be as follows:

  override lazy val metrics = super.metrics ++ Map(
    "numFiles" -> SQLMetrics.createMetric(sparkContext, "number of files"),
    "metadataTime" -> SQLMetrics.createMetric(sparkContext, "metadata time"))
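The idea can be sketched outside Spark with plain Scala maps. This is a minimal, self-contained illustration only: ColumnarScanLike and FileScanLike are hypothetical stand-ins for ColumnarBatchScan and FileSourceScanExec, and the metric values are plain strings rather than SQLMetric objects.

```scala
// Hypothetical stand-in for ColumnarBatchScan: contributes the common metrics.
trait ColumnarScanLike {
  def metrics: Map[String, String] = Map(
    "numOutputRows" -> "number of output rows",
    "scanTime"      -> "scan time")
}

// Hypothetical stand-in for FileSourceScanExec: merging with super.metrics
// inherits the parent's entries instead of redeclaring them.
class FileScanLike extends ColumnarScanLike {
  override lazy val metrics: Map[String, String] = super.metrics ++ Map(
    "numFiles"     -> "number of files",
    "metadataTime" -> "metadata time")
}

object Demo extends App {
  val m = (new FileScanLike).metrics
  // All four metrics are present without the subclass repeating the inherited two.
  println(m.keys.toList.sorted.mkString(", "))
}
```

Since `++` on an immutable Map lets right-hand entries win on key collisions, the subclass could still intentionally redefine an inherited metric, but with this pattern it no longer has to repeat the ones it leaves unchanged.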

I'd like to send a pull request with a fix if no one objects. Anyone?


Jacek Laskowski
Mastering Spark SQL
Spark Structured Streaming
Mastering Kafka Streams
Follow me at
