spark-dev mailing list archives

From "Guoqiang Li" <wi...@qq.com>
Subject Re: [VOTE] Release Apache Spark 1.4.0 (RC3)
Date Mon, 01 Jun 2015 02:04:44 GMT
+1 (non-binding)




------------------ Original ------------------
From: "Sandy Ryza" <sandy.ryza@cloudera.com>
Date: Mon, Jun 1, 2015 07:34 AM
To: "Krishna Sankar" <ksankar42@gmail.com>
Cc: "Patrick Wendell" <pwendell@gmail.com>; "dev@spark.apache.org" <dev@spark.apache.org>

Subject: Re: [VOTE] Release Apache Spark 1.4.0 (RC3)



+1 (non-binding)

Launched against a pseudo-distributed YARN cluster running Hadoop 2.6.0 and ran some jobs.


-Sandy


On Sat, May 30, 2015 at 3:44 PM, Krishna Sankar <ksankar42@gmail.com> wrote:
+1 (non-binding, of course)


1. Compiled on OS X 10.10 (Yosemite) OK. Total time: 17:07 min
     mvn clean package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4 -Dhadoop.version=2.6.0 -DskipTests
2. Tested pyspark, MLlib - ran the suites and compared results with 1.3.1
2.1. statistics (min,max,mean,Pearson,Spearman) OK
2.2. Linear/Ridge/Lasso Regression OK
2.3. Decision Tree, Naive Bayes OK
2.4. KMeans OK
       Center And Scale OK
2.5. RDD operations OK
      State of the Union texts - MapReduce, filter, sortByKey (word count)
2.6. Recommendation (Movielens medium dataset ~1 M ratings) OK
       Model evaluation/optimization (rank, numIter, lambda) with itertools OK
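For reference, the (rank, numIter, lambda) sweep mentioned in 2.6 can be set up with itertools.product. The parameter values below are hypothetical placeholders, not the ones used in the test:

```python
from itertools import product

# Hypothetical hyperparameter values for the ALS grid sweep;
# the actual values used in the test above are not given in this email.
ranks = [8, 12]
num_iters = [10, 20]
lambdas = [0.01, 0.1]

# product() yields every (rank, numIter, lambda) combination, each of
# which would then be trained and scored against a validation set.
grid = list(product(ranks, num_iters, lambdas))
print(len(grid))  # 2 * 2 * 2 = 8 combinations
```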
3. Scala - MLlib
3.1. statistics (min,max,mean,Pearson,Spearman) OK
3.2. LinearRegressionWithSGD OK
3.3. Decision Tree OK
3.4. KMeans OK
3.5. Recommendation (Movielens medium dataset ~1 M ratings) OK
3.6. saveAsParquetFile OK
3.7. Read and verify the 3.6 save (above) - sqlContext.parquetFile, registerTempTable, sql OK
3.8. result = sqlContext.sql("SELECT OrderDetails.OrderID, ShipCountry, UnitPrice, Qty, Discount FROM Orders INNER JOIN OrderDetails ON Orders.OrderID = OrderDetails.OrderID") OK
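The Pearson/Spearman checks in 2.1 and 3.1 can be sanity-checked outside Spark. This is a minimal pure-Python sketch (a stand-in for MLlib's correlation routines, with no tie handling in the rank step), useful for eyeballing expected values:

```python
import math
from statistics import mean

# Pure-Python reference for the Pearson/Spearman checks above;
# a stand-in for MLlib's correlation routines, not their implementation.
def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def spearman(xs, ys):
    # Spearman is Pearson applied to the ranks (ties not handled here).
    def ranks(vs):
        order = sorted(range(len(vs)), key=lambda i: vs[i])
        r = [0] * len(vs)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    return pearson(ranks(xs), ranks(ys))

print(pearson([1, 2, 3, 4], [2, 4, 6, 8]))    # 1.0 for a perfect linear fit
print(spearman([1, 2, 3, 4], [1, 4, 9, 16]))  # 1.0 for any monotone relation
```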
4.0. Spark SQL from Python OK
4.1. result = sqlContext.sql("SELECT * from people WHERE State = 'WA'") OK
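The SQL-from-Python check in 4.1 can be mimicked without a Spark cluster; this sketch uses the stdlib sqlite3 module as a stand-in for sqlContext.sql, with a hypothetical people table (not the dataset used in the test):

```python
import sqlite3

# sqlite3 stand-in for the sqlContext.sql check in 4.1; the table
# contents are hypothetical, not the data used in the test above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, State TEXT)")
conn.executemany("INSERT INTO people VALUES (?, ?)",
                 [("Alice", "WA"), ("Bob", "CA"), ("Carol", "WA")])

# Same WHERE-clause shape as the Spark SQL query in 4.1.
result = conn.execute("SELECT * FROM people WHERE State = 'WA'").fetchall()
print(result)  # [('Alice', 'WA'), ('Carol', 'WA')]
```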


Cheers
<k/>


On Fri, May 29, 2015 at 4:40 PM, Patrick Wendell <pwendell@gmail.com> wrote:
Please vote on releasing the following candidate as Apache Spark version 1.4.0!
 
 The tag to be voted on is v1.4.0-rc3 (commit dd109a8):
 https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=dd109a8746ec07c7c83995890fc2c0cd7a693730
 
 The release files, including signatures, digests, etc. can be found at:
 http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc3-bin/
 
 Release artifacts are signed with the following key:
 https://people.apache.org/keys/committer/pwendell.asc
 
 The staging repository for this release can be found at:
 [published as version: 1.4.0]
 https://repository.apache.org/content/repositories/orgapachespark-1109/
 [published as version: 1.4.0-rc3]
 https://repository.apache.org/content/repositories/orgapachespark-1110/
 
 The documentation corresponding to this release can be found at:
 http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc3-docs/
 
 Please vote on releasing this package as Apache Spark 1.4.0!
 
 The vote is open until Tuesday, June 02, at 00:32 UTC and passes
 if a majority of at least 3 +1 PMC votes are cast.
 
 [ ] +1 Release this package as Apache Spark 1.4.0
 [ ] -1 Do not release this package because ...
 
 To learn more about Apache Spark, please see
 http://spark.apache.org/
 
 == What has changed since RC1 ==
 Below is a list of bug fixes that went into this RC:
 http://s.apache.org/vN
 
 == How can I help test this release? ==
 If you are a Spark user, you can help us test this release by
 taking a Spark 1.3 workload, running it on this release candidate,
 and reporting any regressions.
 
 == What justifies a -1 vote for this release? ==
 This vote is happening towards the end of the 1.4 QA period,
 so -1 votes should only occur for significant regressions from 1.3.1.
 Bugs already present in 1.3.X, minor regressions, or bugs related
 to new features will not block this release.
 
 ---------------------------------------------------------------------
 To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
 For additional commands, e-mail: dev-help@spark.apache.org