spark-dev mailing list archives

From Sean Owen <sro...@gmail.com>
Subject Do we need to finally update Guava?
Date Sun, 15 Dec 2019 16:08:28 GMT
See for example:

https://github.com/apache/spark/pull/25932#issuecomment-565822573
https://issues.apache.org/jira/browse/SPARK-23897

This is a dicey dependency that we have been reluctant to update, because
a) Hadoop has shipped an old version and b) Guava breaks binary
compatibility every few releases.
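
For concreteness, here is a minimal Java sketch of the kind of breakage
involved (the version details in the comments are from memory, not checked
against release notes): the executor-less Futures.transform overload and
Objects.toStringHelper both exist in the Guava 11-14 range that Hadoop 2
ships, but are gone by Guava 27, so callers have to be rewritten roughly
like this to compile against the new version:

  import com.google.common.base.MoreObjects;
  import com.google.common.util.concurrent.Futures;
  import com.google.common.util.concurrent.ListenableFuture;
  import com.google.common.util.concurrent.ListeningExecutorService;
  import com.google.common.util.concurrent.MoreExecutors;

  import java.util.concurrent.Executors;

  public class GuavaApiDrift {
    public static void main(String[] args) throws Exception {
      ListeningExecutorService pool =
          MoreExecutors.listeningDecorator(Executors.newSingleThreadExecutor());
      ListenableFuture<Integer> f = pool.submit(() -> 21);

      // Old code called Futures.transform(f, fn) against Guava 11-14; the
      // executor-less overload is gone in recent Guava, so an Executor is
      // now required.
      ListenableFuture<Integer> doubled =
          Futures.transform(f, x -> x * 2, MoreExecutors.directExecutor());

      // Old code called Objects.toStringHelper(...); the helper moved to
      // MoreObjects and was later removed from Objects entirely.
      System.out.println(MoreObjects.toStringHelper("result")
          .add("value", doubled.get())
          .toString());

      pool.shutdown();
    }
  }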

But Hadoop itself jumps all the way from Guava 11 to Guava 27 in Hadoop
3.2.1. Time to match that? I haven't assessed how much internal change it
requires. If it's a lot, that makes it hard, because we still need to stay
compatible with Hadoop 2 / Guava 11-14. But staying on an old Guava in
turn blocks updating past Hadoop 3.2.0.
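
One way (purely a sketch of the problem, not what the linked PR proposes)
to keep a single artifact working across that spread is to dispatch
reflectively to whichever overload the Guava on the classpath provides.
The helper below is hypothetical, not existing Spark code:

  import com.google.common.util.concurrent.Futures;
  import com.google.common.util.concurrent.ListenableFuture;

  import java.lang.reflect.Method;
  import java.util.concurrent.Executor;

  // Hypothetical shim: call whichever Futures.transform overload exists.
  final class GuavaShim {
    @SuppressWarnings("unchecked")
    static <I, O> ListenableFuture<O> transform(
        ListenableFuture<I> input,
        com.google.common.base.Function<? super I, ? extends O> fn,
        Executor executor) throws Exception {
      try {
        // Newer Guava: transform(future, function, executor).
        Method m = Futures.class.getMethod("transform",
            ListenableFuture.class, com.google.common.base.Function.class,
            Executor.class);
        return (ListenableFuture<O>) m.invoke(null, input, fn, executor);
      } catch (NoSuchMethodException e) {
        // Older Guava (the 11-14 range Hadoop 2 ships): two-arg overload.
        Method m = Futures.class.getMethod("transform",
            ListenableFuture.class, com.google.common.base.Function.class);
        return (ListenableFuture<O>) m.invoke(null, input, fn);
      }
    }
  }

In practice shading/relocating Guava or requiring a floor version is
probably cleaner than reflection; the sketch is only meant to show why
straddling that version range is painful.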

