Enrico - thanks for sharing your experience.
I recently got a couple of PRs merged, and my experience was different: I got lots of feedback from several maintainers (thank you very much!).
Can't speak to your PRs specifically, but can give the general advice that pivoting code based on maintainer feedback is probably the easiest way to get stuff merged.
I initially added an add_hours function to org.apache.spark.sql.functions, and it quickly became clear that the maintainers weren't fans; they favored a more general make_interval function instead. I proactively closed my own add_hours PR and pushed make_interval forward.
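To make the contrast concrete, here's a rough sketch of the two approaches (the DataFrame and column names are just illustrative, and the add_hours signature is hypothetical since that PR never landed; make_interval is shown via the SQL expression, whose arguments are years, months, weeks, days, hours, mins, secs):

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, expr}

    val spark = SparkSession.builder().master("local[*]").appName("interval-demo").getOrCreate()
    import spark.implicits._

    // Toy DataFrame with a single timestamp column.
    val df = Seq("2024-01-01 00:00:00").toDF("pickup_ts")
      .withColumn("pickup_ts", col("pickup_ts").cast("timestamp"))

    // The narrow API I originally proposed would have looked roughly like:
    //   df.withColumn("pickup_plus_3h", add_hours(col("pickup_ts"), lit(3)))

    // The general route: one make_interval function covers hours plus every other unit combination.
    val result = df.withColumn(
      "pickup_plus_3h",
      col("pickup_ts") + expr("make_interval(0, 0, 0, 0, 3, 0, 0)")
    )
    result.show(false)

One make_interval covers what would otherwise become a pile of add_hours/add_minutes/add_days-style one-offs, which is exactly the bloat argument.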
In hindsight, add_hours would have been a bad addition to the API, and I'm glad it got rejected. For big, mature projects like Spark, it's more important for maintainers to reject stuff than to add new functionality. Software bloat is the main risk for Spark.
I'm of the opinion that the auto-closing PR feature is working well. Spark maintainers have the difficult job of saying "no" a lot and disappointing people. Auto-closing is a great way to communicate the "no" indirectly, in a way that's more psychologically palatable for both the maintainer and the contributor.