spark-dev mailing list archives

From Josh Rosen <>
Subject Re: [VOTE] Release Apache Spark 1.2.0 (RC1)
Date Mon, 01 Dec 2014 19:18:16 GMT
Hi everyone,

There’s an open bug report related to Spark standalone which could be a potential release
blocker (pending investigation / a bug fix). The issue appears to be non-deterministic and
only affects long-running Spark standalone deployments, so it may be hard to reproduce.
I’m going to work on a patch that adds additional logging in order to help with debugging.

I just wanted to give an early heads-up about this issue and to get more eyes on it in
case anyone else has run into it or wants to help with debugging.

- Josh

On November 28, 2014 at 9:18:09 PM, Patrick Wendell wrote:

Please vote on releasing the following candidate as Apache Spark version 1.2.0!  

The tag to be voted on is v1.2.0-rc1 (commit 1056e9ec1):;a=commit;h=1056e9ec13203d0c51564265e94d77a054498fdb

The release files, including signatures, digests, etc. can be found at:  

Release artifacts are signed with the following key:  

The staging repository for this release can be found at:  

The documentation corresponding to this release can be found at:  

Please vote on releasing this package as Apache Spark 1.2.0!  

The vote is open until Tuesday, December 02, at 05:15 UTC and passes  
if a majority of at least 3 +1 PMC votes are cast.  

[ ] +1 Release this package as Apache Spark 1.2.0  
[ ] -1 Do not release this package because ...  

To learn more about Apache Spark, please see  

== What justifies a -1 vote for this release? ==  
This vote is happening very late into the QA period compared with  
previous votes, so -1 votes should only occur for significant  
regressions from 1.0.2. Bugs already present in 1.1.X, minor  
regressions, or bugs related to new features will not block this  
release.  
== What default changes should I be aware of? ==  
1. The default value of "spark.shuffle.blockTransferService" has been  
changed to "netty".  
--> Old behavior can be restored by setting it to "nio".  

2. The default value of "spark.shuffle.manager" has been changed to "sort".  
--> Old behavior can be restored by setting "spark.shuffle.manager" to "hash".  
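
For anyone who wants to compare against the 1.1.x behavior during testing, both
settings can be reverted together. A minimal sketch of a conf/spark-defaults.conf
fragment (the property names are the ones listed above; the file location assumes a
standard Spark distribution layout):

```
# Revert the 1.2.0 default shuffle settings to their 1.1.x values
spark.shuffle.blockTransferService  nio
spark.shuffle.manager               hash
```

The same pair of properties can also be passed per-application via --conf on
spark-submit, which avoids changing the cluster-wide defaults.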

== Other notes ==  
Because this vote is occurring over a weekend, I will likely extend  
the vote if this RC survives until the end of the vote period.  

- Patrick  

