spark-dev mailing list archives

From Patrick Wendell <>
Subject Re: Spark 0.8.0-incubating RC2
Date Sat, 07 Sep 2013 18:24:09 GMT

Thanks a lot for your feedback.

Could you let me know how you ran the Apache RAT tool so I can reproduce this?

My sense is that the best next step is to cut an RC that is built
against the Apache Git repo, includes both `src` and `bin` artifacts,
and has cleaned-up license files. Some inline responses below.

> 1. I only see source artifacts in Patrick's p.a.o URL. I assume the
> pre-built ones will also be published with hash and signed?

Yes, we'll do both src and binary releases, and I'll hash and sign both.
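For reference, a minimal sketch of the checksum-and-sign step. The artifact here is a stand-in file, the digest choice (SHA-512) is illustrative, and the signing command is shown commented out since it needs the release engineer's secret key:

```shell
# Checksum (and, with a keyring, sign) a release artifact.
# The artifact below is a placeholder created just for this sketch.
rm -rf /tmp/sign-demo && mkdir -p /tmp/sign-demo && cd /tmp/sign-demo
ARTIFACT=spark-0.8.0-incubating.tgz
printf 'placeholder payload' > "$ARTIFACT"

# Publish a checksum file next to the artifact
sha512sum "$ARTIFACT" > "$ARTIFACT.sha512"
cat "$ARTIFACT.sha512"

# Detached ASCII-armored signature (requires the RE's secret key):
# gpg --armor --detach-sign "$ARTIFACT"   # writes $ARTIFACT.asc
```

Publishing the checksum and the detached `.asc` signature alongside each artifact lets downstream verifiers check integrity and provenance independently.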

> 2. For every ASF release, we need a designated release engineer (RE)
> who will drive the release process: determining which bugs to
> include, making sure all files have the right ASF header (running the
> Maven RAT plugin check), creating the release branch, updating the
> version for the next development cycle, and creating and correctly
> signing the release artifacts. I assume this would be Matei or Patrick?

Yes, this might be me for this release because I've got the keys
correctly set up. I'll chat with Matei when he's back.

> 3. The proposed 0.8.0-RC2 source artifact's signature and hash both
> look good. However, it was generated against the github mesos:spark
> repo.
>     Reminder that when we send the release proposal to
> general@incubator.a.o we need to generate RC builds from the ASF git
> repo with the right tagged branch.

We will take care of this for the next RC.
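A sketch of what cutting the next RC from the ASF repo would look like. The local repo below is a throwaway stand-in, and the clone URL and tag name are illustrative, not the actual ones for this release:

```shell
# Real flow would start from the ASF repo, roughly:
#   git clone https://git-wip-us.apache.org/repos/asf/incubator-spark.git
# Here we use a throwaway local repo so the tagging step is runnable.
rm -rf /tmp/rc-tag-demo && mkdir -p /tmp/rc-tag-demo && cd /tmp/rc-tag-demo
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "release snapshot"

# Tag the RC commit (add -s to GPG-sign the tag when a key is configured)
git tag v0.8.0-incubating-rc3
git tag -l
```

Building the RC from a tag in the ASF repo (rather than a GitHub mirror) is what makes the vote on general@incubator.a.o auditable against canonical history.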

> 4. I ran the RAT check on the source artifact and found that many
> source files do not have the ASF license header.
>  For example, some in the repl directory have this:
> /* NSC -- new Scala compiler
>  * Copyright 2005-2011 LAMP/EPFL
>  * @author Paul Phillips
>  */
> Not sure if we need to add the ASF header to it, since we have
> technically put it under an apache package.
> Scala source files under mllib are also missing ASF headers.

See comment above.
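As a rough illustration of what the RAT check flags (this is a simplification, not the actual RAT tool): source files lacking the standard "Licensed to the Apache Software Foundation" header line. File names and contents below are made up for the sketch:

```shell
# Simplified stand-in for the RAT header check: list source files that
# are missing the ASF header line. Files here are illustrative only.
rm -rf /tmp/rat-demo && mkdir -p /tmp/rat-demo/src && cd /tmp/rat-demo
cat > src/WithHeader.scala <<'EOF'
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 */
EOF
cat > src/NoHeader.scala <<'EOF'
/* NSC -- new Scala compiler
 * Copyright 2005-2011 LAMP/EPFL
 */
EOF

# grep -L lists files with NO match, i.e. files missing the header
grep -rL "Licensed to the Apache Software Foundation" src
```

Files carrying third-party notices (like the LAMP/EPFL one above) are exactly the cases RAT surfaces for a human decision rather than a blind header rewrite.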

> 5. Add the public key of the RE to
> (@Chris do we still need
> to create a KEYS file in the Spark git repo?)

This is now finished for me :)
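For the record, the conventional way ASF projects append a release engineer's key to a KEYS file; KEYID is a placeholder, and the commands are shown commented out since they need a local GnuPG keyring:

```shell
# Conventional ASF KEYS-file append; KEYID stands in for the RE's
# key fingerprint. Requires a local GnuPG keyring, so shown commented.
# (gpg --list-sigs KEYID && gpg --armor --export KEYID) >> KEYS
# Then commit KEYS so voters can import the key and verify signatures.
```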
