samza-dev mailing list archives

From Jakob Homan <jgho...@gmail.com>
Subject Re: Do we want to provide different hadoop versions for 0.7.0 release?
Date Wed, 25 Jun 2014 20:18:24 GMT
Is Spark doing this by only supporting the intersection of available APIs,
or via some type of munging?  We did the latter in Giraph and it was a
nightmare...


On Wed, Jun 25, 2014 at 11:40 AM, Yan Fang <yanfang724@gmail.com> wrote:

> Hi guys,
>
> I am thinking of this because of Dotan's email (thanks, Dotan). Currently,
> people are using different versions of Hadoop, and they will definitely run
> into problems if their Hadoop cluster is on a different version than the one
> Samza was compiled against. That hurts the user experience, whether the user
> is a veteran or a newbie.
>
> In Spark, they provide a way to select the Hadoop version at compile time,
> in both their latest and previous releases:
> http://spark.apache.org/docs/latest/building-with-maven.html
> http://spark.apache.org/docs/0.9.0/
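>
> For reference, the linked Spark docs show passing the Hadoop version to
> Maven along these lines (the version number is just an example):
>
>   mvn -Dhadoop.version=2.2.0 -DskipTests clean package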
>
> Maybe we should consider this as an add-on for our 0.7.0 release too. Since
> we are already able to switch the Scala version, it should not present any
> technical difficulty. The risk is that Samza has not been extensively tested
> against other Hadoop versions. If that risk is a concern, we can at least
> provide simple instructions to help users build Samza against other Hadoop
> versions.
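>
> For illustration, a minimal sketch of what such a switch could look like in
> our Gradle build (the yarnVersion property name here is hypothetical, by
> analogy with the existing scalaVersion switch; it would need to be wired
> into build.gradle):
>
>   # hypothetical property; assumes build.gradle resolves it to the
>   # hadoop/yarn dependency versions
>   ./gradlew -PyarnVersion=2.4.0 clean build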
>
> What do you think?
>
> Thanks,
>
> Fang, Yan
> yanfang724@gmail.com
> +1 (206) 849-4108
>
