spark-dev mailing list archives

From "Hari Shreedharan" <hshreedha...@cloudera.com>
Subject Re: [VOTE] Release Apache Spark 1.3.1
Date Mon, 06 Apr 2015 18:43:09 GMT
Ah, OK. It was missing from the list of JIRAs. So +1.

Thanks, Hari

On Mon, Apr 6, 2015 at 11:36 AM, Patrick Wendell <pwendell@gmail.com> wrote:

> I believe TD just forgot to set the fix version on the JIRA. There is
> a fix for this in 1.3:
> https://github.com/apache/spark/commit/03e263f5b527cf574f4ffcd5cd886f7723e3756e
> - Patrick
> On Mon, Apr 6, 2015 at 2:31 PM, Mark Hamstra <mark@clearstorydata.com> wrote:
>> Is that correct, or is the JIRA just out of sync, since TD's PR was merged?
>> https://github.com/apache/spark/pull/5008
>>
>> On Mon, Apr 6, 2015 at 11:10 AM, Hari Shreedharan
>> <hshreedharan@cloudera.com> wrote:
>>>
>>> It does not look like https://issues.apache.org/jira/browse/SPARK-6222
>>> made it. It was targeted towards this release.
>>>
>>> Thanks, Hari
>>>
>>> On Mon, Apr 6, 2015 at 11:04 AM, York, Brennon
>>> <Brennon.York@capitalone.com> wrote:
>>>
>>> > +1 (non-binding)
>>> > Tested GraphX, build infrastructure, & core test suite on OSX 10.9 w/
>>> > Java 1.7/1.8
>>> > On 4/6/15, 5:21 AM, "Sean Owen" <sowen@cloudera.com> wrote:
>>> >>SPARK-6673 is not, in the end, relevant for 1.3.x I believe; we just
>>> >>resolved it for 1.4 anyway. False alarm there.
>>> >>
>>> >>I back-ported SPARK-6205 into the 1.3 branch for next time. We'll pick
>>> >>it up if there's another RC, but by itself is not something that needs
>>> >>a new RC. (I will give the same treatment to branch 1.2 if needed in
>>> >>light of the 1.2.2 release.)
>>> >>
>>> >>I applied the simple change in SPARK-6205 in order to continue
>>> >>executing tests and all was well. I still see a few failures in Hive
>>> >>tests:
>>> >>
>>> >>- show_create_table_serde *** FAILED ***
>>> >>- show_tblproperties *** FAILED ***
>>> >>- udf_std *** FAILED ***
>>> >>- udf_stddev *** FAILED ***
>>> >>
>>> >>with ...
>>> >>
>>> >>mvn -Phadoop-2.4 -Pyarn -Phive -Phive-0.13.1 -Dhadoop.version=2.6.0
>>> >>-DskipTests clean package; mvn -Phadoop-2.4 -Pyarn -Phive
>>> >>-Phive-0.13.1 -Dhadoop.version=2.6.0 test
>>> >>
>>> >>... but these are not regressions from 1.3.0.
>>> >>
>>> >>+1 from me at this point on the current artifacts.
>>> >>
>>> >>> On Sun, Apr 5, 2015 at 9:24 AM, Sean Owen <sowen@cloudera.com> wrote:
>>> >>> Signatures and hashes are good.
>>> >>> LICENSE, NOTICE still check out.
>>> >>> Compiles for a Hadoop 2.6 + YARN + Hive profile.
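[Editor's note: the signature and hash checks described above can be sketched roughly as follows. The `gpg` commands are shown as comments because they need the actual RC artifacts and Patrick's signing key; the checksum half is demonstrated end to end with a local stand-in file, whose name is illustrative.]

```shell
# Sketch of verifying a release candidate's signature and digest.
# A real check would fetch the files from
# http://people.apache.org/~pwendell/spark-1.3.1-rc1/ and the key from
# https://people.apache.org/keys/committer/pwendell.asc, then run:
#
#   gpg --import pwendell.asc
#   gpg --verify spark-1.3.1.tgz.asc spark-1.3.1.tgz
#
# The digest check works the same way with any file; demo with a local one:
echo "release artifact" > demo-artifact.tgz
sha512sum demo-artifact.tgz > demo-artifact.tgz.sha512
sha512sum -c demo-artifact.tgz.sha512   # prints "demo-artifact.tgz: OK"
```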
>>> >>>
>>> >>> I still see the UISeleniumSuite test failure observed in 1.3.0, which
>>> >>> is minor and already fixed. I don't know why I didn't back-port it:
>>> >>> https://issues.apache.org/jira/browse/SPARK-6205
>>> >>>
>>> >>> If we roll another, let's get this easy fix in, but it is only an
>>> >>> issue with tests.
>>> >>>
>>> >>>
>>> >>> On JIRA, I checked open issues with Fix Version = 1.3.0 or 1.3.1 and
>>> >>> all look legitimate (e.g. reopened or in progress)
>>> >>>
>>> >>>
>>> >>> There is 1 open Blocker for 1.3.1 per Andrew:
>>> >>> https://issues.apache.org/jira/browse/SPARK-6673 spark-shell.cmd can't
>>> >>> start even when spark was built in Windows
>>> >>>
>>> >>> I believe this can be resolved quickly but as a matter of hygiene
>>> >>> should be fixed or demoted before release.
>>> >>>
>>> >>>
>>> >>> FYI there are 16 Critical issues marked for 1.3.0 / 1.3.1; worth
>>> >>> examining before release to see how critical they are:
>>> >>>
>>> >>> SPARK-6701,Flaky test: o.a.s.deploy.yarn.YarnClusterSuite Python application,,Open,4/3/15
>>> >>> SPARK-6484,"Ganglia metrics xml reporter doesn't escape correctly",Josh Rosen,Open,3/24/15
>>> >>> SPARK-6270,Standalone Master hangs when streaming job completes,,Open,3/11/15
>>> >>> SPARK-6209,ExecutorClassLoader can leak connections after failing to load classes from the REPL class server,Josh Rosen,In Progress,4/2/15
>>> >>> SPARK-5113,Audit and document use of hostnames and IP addresses in Spark,,Open,3/24/15
>>> >>> SPARK-5098,Number of running tasks become negative after tasks lost,,Open,1/14/15
>>> >>> SPARK-4925,Publish Spark SQL hive-thriftserver maven artifact,Patrick Wendell,Reopened,3/23/15
>>> >>> SPARK-4922,Support dynamic allocation for coarse-grained Mesos,,Open,3/31/15
>>> >>> SPARK-4888,"Spark EC2 doesn't mount local disks for i2.8xlarge instances",,Open,1/27/15
>>> >>> SPARK-4879,Missing output partitions after job completes with speculative execution,Josh Rosen,Open,3/5/15
>>> >>> SPARK-4751,Support dynamic allocation for standalone mode,Andrew Or,Open,12/22/14
>>> >>> SPARK-4454,Race condition in DAGScheduler,Josh Rosen,Reopened,2/18/15
>>> >>> SPARK-4452,Shuffle data structures can starve others on the same thread for memory,Tianshuo Deng,Open,1/24/15
>>> >>> SPARK-4352,Incorporate locality preferences in dynamic allocation requests,,Open,1/26/15
>>> >>> SPARK-4227,Document external shuffle service,,Open,3/23/15
>>> >>> SPARK-3650,Triangle Count handles reverse edges incorrectly,,Open,2/23/15
>>> >>>
>>> >>> On Sun, Apr 5, 2015 at 1:09 AM, Patrick Wendell <pwendell@gmail.com> wrote:
>>> >>>> Please vote on releasing the following candidate as Apache Spark version 1.3.1!
>>> >>>>
>>> >>>> The tag to be voted on is v1.3.1-rc1 (commit 0dcb5d9f):
>>> >>>>
>>> >>>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=0dcb5d9f31b713ed90bcec63ebc4e530cbb69851
>>> >>>>
>>> >>>> The list of fixes present in this release can be found at:
>>> >>>> http://bit.ly/1C2nVPY
>>> >>>>
>>> >>>> The release files, including signatures, digests, etc. can be found at:
>>> >>>> http://people.apache.org/~pwendell/spark-1.3.1-rc1/
>>> >>>>
>>> >>>> Release artifacts are signed with the following key:
>>> >>>> https://people.apache.org/keys/committer/pwendell.asc
>>> >>>>
>>> >>>> The staging repository for this release can be found at:
>>> >>>>
>>> >>>> https://repository.apache.org/content/repositories/orgapachespark-1080
>>> >>>>
>>> >>>> The documentation corresponding to this release can be found at:
>>> >>>> http://people.apache.org/~pwendell/spark-1.3.1-rc1-docs/
>>> >>>>
>>> >>>> Please vote on releasing this package as Apache Spark 1.3.1!
>>> >>>>
>>> >>>> The vote is open until Wednesday, April 08, at 01:10 UTC and passes
>>> >>>> if a majority of at least 3 +1 PMC votes are cast.
>>> >>>>
>>> >>>> [ ] +1 Release this package as Apache Spark 1.3.1
>>> >>>> [ ] -1 Do not release this package because ...
>>> >>>>
>>> >>>> To learn more about Apache Spark, please see
>>> >>>> http://spark.apache.org/
>>> >>>>
>>> >>>> - Patrick
>>> >>>>
>>> >>>> ---------------------------------------------------------------------
>>> >>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> >>>> For additional commands, e-mail: dev-help@spark.apache.org
>>> >>>>
>>> >>
>>> >>