spark-dev mailing list archives

From Steve Loughran <ste...@cloudera.com.INVALID>
Subject Re: Use Hadoop-3.2 as a default Hadoop profile in 3.0.0?
Date Mon, 04 Nov 2019 15:06:32 GMT
On Mon, Nov 4, 2019 at 12:39 AM Nicholas Chammas <nicholas.chammas@gmail.com>
wrote:

> On Fri, Nov 1, 2019 at 8:41 AM Steve Loughran <stevel@cloudera.com.invalid>
> wrote:
>
>> It would be really good if the spark distributions shipped with later
>> versions of the hadoop artifacts.
>>
>
> I second this. If we need to keep a Hadoop 2.x profile around, why not
> make it Hadoop 2.8 or something newer?
>

go for 2.9
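
For anyone following along: the Hadoop line Spark builds against is chosen
via Maven profiles, and a specific artifact version can be pinned with
-Dhadoop.version. A rough sketch of what the options under discussion look
like at build time (profile names per the Spark 3.x build docs; the 2.9.x
version shown is illustrative, not a recommendation of a specific patch
release):

```shell
# Build against the Hadoop 3.2 line (the proposed default profile):
./build/mvn -Pyarn -Phadoop-3.2 -DskipTests clean package

# Or keep the Hadoop 2.x profile but pin a later branch-2 artifact
# (2.9.2 here is illustrative):
./build/mvn -Pyarn -Phadoop-2.7 -Dhadoop.version=2.9.2 -DskipTests clean package
```
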

>
> Koert Kuipers <koert@tresata.com> wrote:
>
>> given that the latest HDP 2.x is still on Hadoop 2.7, bumping the Hadoop
>> 2 profile to latest would probably be an issue for us.
>
>
> When was the last time HDP 2.x bumped their minor version of Hadoop? Do we
> want to wait for them to bump to Hadoop 2.8 before we do the same?
>

The internal builds of CDH and HDP are not those of ASF 2.7.x. A really
large proportion of the later branch-2 patches are backported. 2.7 was left
behind a long time ago.
