spark-dev mailing list archives

From JeffK <>
Subject Spark 2.4.4 with which version of Hadoop?
Date Wed, 11 Dec 2019 17:43:15 GMT

We've been considering using the download package Spark 2.4.4 that's
pre-built for Hadoop 2.7 with Hadoop 2.7.7.

Hadoop 2.7 is often cited as the most stable version to use with Spark.

However, Hadoop 2.7.7 is End Of Life. The most recent Hadoop vulnerabilities
have only been fixed in versions 2.8.5 and above.

We've searched the Spark user forum and have been following discussions
on the development forum, but it is still unclear which version of Hadoop
should be used. Discussions about Spark 3.0.0 currently favor keeping
Hadoop 2.7 as the default; given the known vulnerabilities, this is a
concern.

Which versions of Hadoop 2.x are supported, and which should we be using?
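For what it's worth, Spark's "Building Spark" documentation describes compiling Spark against a chosen Hadoop version instead of relying on a pre-built package; a sketch (the 2.8.5 version number here is illustrative, not a recommendation):

```shell
# Build Spark from source against a specific Hadoop 2.x release,
# per the "Specifying the Hadoop Version" section of Spark's build docs.
# -Phadoop-2.7 selects the Hadoop 2.7+ build profile; hadoop.version
# overrides the exact dependency version pulled in (illustrative here).
./build/mvn -Pyarn -Phadoop-2.7 -Dhadoop.version=2.8.5 -DskipTests clean package
```

The resulting distribution would then bundle the patched Hadoop client libraries rather than the 2.7.x ones shipped with the pre-built download.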


