Yes, that's a key concern about the Java dependency: its update cycle is a function of the OS packages and of those who control them, which is often not the end user. I think that's why this has been delayed a while. My general position is that, of course, anyone in that boat can use Spark 2.1.x, which will likely see maintenance releases through the end of the year, even. On the flip side, no (non-paid) support has been available for Java 7 for a while. It wouldn't surprise me if some people are still stuck on Java 7; it would surprise me if they expect to use the latest version of any package at this stage. Taking your CDH example: yes, it's been a couple of years since people have been able to deploy it on Java 8. Spark 2 isn't supported before CDH 5.7 anyway, and its default is Java 8.
Scala 2.10 is a good point that we are dealing with now. It's not really a question of whether it will run -- it's all libraries and bytecode as far as the JVM is concerned, and the JVM will happily load a mix of Java 7 and Java 8 bytecode. It's a question of whether the build for 2.10 will succeed. I believe the answer is 'yes', but I'm following up with some tests there.
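For reference, a sketch of what such a test build looks like, assuming the standard Spark 2.x build scripts (the exact profile/property names can vary between Spark versions, so check the "Building Spark" docs for the branch in question):

```shell
# Switch the POMs over to Scala 2.10 (script ships with the Spark source tree)
./dev/change-scala-version.sh 2.10

# Build against Scala 2.10; -Dscala-2.10 activates the matching Maven profile.
# Skipping tests here just checks that compilation succeeds.
./build/mvn -Pyarn -Dscala-2.10 -DskipTests clean package
```

If that compiles cleanly under the newer JDK, it answers the "will the 2.10 build succeed" question; running the full test suite afterward would confirm runtime behavior too.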