spark-user mailing list archives

From meetwes <>
Subject Re: Best way to emit custom metrics to Prometheus in spark structured streaming
Date Wed, 04 Nov 2020 05:18:24 GMT
So I tried it again in standalone mode (spark-shell), and the df.observe()
functionality works: sum, count, conditional aggregations using 'when', and
so on all report values as expected. But with spark-on-k8s in cluster mode,
only lit() works as the aggregation column; no other aggregation (count,
sum, etc.) produces a metric.
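
For reference, a minimal sketch of the observe() usage being described, as it would look in spark-shell (the metric names, source format, and column names here are illustrative, not from the original post):

```scala
import org.apache.spark.sql.functions.{count, lit, sum, when}
import org.apache.spark.sql.streaming.StreamingQueryListener
import org.apache.spark.sql.streaming.StreamingQueryListener._
import spark.implicits._

// Hypothetical streaming source for illustration only.
val df = spark.readStream
  .format("rate")
  .load()

// Attach named metrics to the stream. In spark-shell all of these report
// values; in the cluster-mode run described above, only a lit() column did.
val observed = df.observe(
  "my_metrics",
  count(lit(1)).as("row_count"),
  sum($"value").as("value_sum"),
  count(when($"value" % 2 === 0, 1)).as("even_count")
)

// Observed metrics surface through a StreamingQueryListener on each
// progress event; from here they could be pushed to Prometheus.
spark.streams.addListener(new StreamingQueryListener {
  override def onQueryStarted(e: QueryStartedEvent): Unit = ()
  override def onQueryTerminated(e: QueryTerminatedEvent): Unit = ()
  override def onQueryProgress(e: QueryProgressEvent): Unit = {
    // observedMetrics is a java.util.Map[String, Row]
    Option(e.progress.observedMetrics.get("my_metrics")).foreach { row =>
      println(s"row_count=${row.getAs[Long]("row_count")}")
    }
  }
})
```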
