+1

Been testing cluster mode and client mode with Mesos on a 6-node cluster.

Everything works so far.

Tim

On Jun 4, 2015, at 5:47 PM, Andrew Or <andrew@databricks.com> wrote:

+1 (binding)

Ran the same tests I did for RC3:

Tested the standalone cluster mode REST submission gateway - submit / status / kill
Tested simple applications on YARN client / cluster modes with and without --jars
Tested python applications on YARN client / cluster modes with and without --py-files*
Tested dynamic allocation on YARN client / cluster modes**

All good. A couple of known issues:

* SPARK-8017: YARN cluster python --py-files not working - not a blocker (new feature)
** SPARK-8088: noisy output when min executors is set - not a blocker (output can be disabled)
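
For anyone wanting to reproduce the dynamic allocation test above, a minimal configuration might look like the following spark-defaults.conf fragment (the executor counts here are illustrative, not the exact values used in the test):

```
# Enable dynamic allocation; the external shuffle service is required for it.
spark.dynamicAllocation.enabled        true
spark.shuffle.service.enabled          true
# Illustrative bounds on the executor count.
spark.dynamicAllocation.minExecutors   1
spark.dynamicAllocation.maxExecutors   10
```

Note that, per SPARK-8088 above, setting min executors produces noisy output in this RC; that output can be disabled.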

2015-06-04 13:35 GMT-07:00 Matei Zaharia <matei.zaharia@gmail.com>:
+1

Tested on Mac OS X

> On Jun 4, 2015, at 1:09 PM, Patrick Wendell <pwendell@gmail.com> wrote:
>
> I will give +1 as well.
>
> On Wed, Jun 3, 2015 at 11:59 PM, Reynold Xin <rxin@databricks.com> wrote:
>> Let me give you the 1st
>>
>> +1
>>
>>
>>
>> On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell <pwendell@gmail.com> wrote:
>>>
>>> Hey all - a tiny nit from the last e-mail: the tag is v1.4.0-rc4. The
>>> exact commit and all other information is correct. (Thanks to Shivaram,
>>> who pointed this out.)
>>>
>>> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pwendell@gmail.com>
>>> wrote:
>>>> Please vote on releasing the following candidate as Apache Spark version
>>>> 1.4.0!
>>>>
>>>> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>>>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=22596c534a38cfdda91aef18aa9037ab101e4251
>>>>
>>>> The release files, including signatures, digests, etc. can be found at:
>>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>>>>
>>>> Release artifacts are signed with the following key:
>>>> https://people.apache.org/keys/committer/pwendell.asc
>>>>
>>>> The staging repository for this release can be found at:
>>>> [published as version: 1.4.0]
>>>> https://repository.apache.org/content/repositories/orgapachespark-1111/
>>>> [published as version: 1.4.0-rc4]
>>>> https://repository.apache.org/content/repositories/orgapachespark-1112/
>>>>
>>>> The documentation corresponding to this release can be found at:
>>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>>>>
>>>> Please vote on releasing this package as Apache Spark 1.4.0!
>>>>
>>>> The vote is open until Saturday, June 06, at 05:00 UTC and passes
>>>> if a majority of at least 3 +1 PMC votes are cast.
>>>>
>>>> [ ] +1 Release this package as Apache Spark 1.4.0
>>>> [ ] -1 Do not release this package because ...
>>>>
>>>> To learn more about Apache Spark, please see
>>>> http://spark.apache.org/
>>>>
>>>> == What has changed since RC3 ==
>>>> In addition to many smaller fixes, three blocker issues were fixed:
>>>> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>>>> metadataHive get constructed too early
>>>> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
>>>> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>>>>
>>>> == How can I help test this release? ==
>>>> If you are a Spark user, you can help us test this release by
>>>> taking a Spark 1.3 workload and running on this release candidate,
>>>> then reporting any regressions.
>>>>
>>>> == What justifies a -1 vote for this release? ==
>>>> This vote is happening towards the end of the 1.4 QA period,
>>>> so -1 votes should only occur for significant regressions from 1.3.1.
>>>> Bugs already present in 1.3.X, minor regressions, or bugs related
>>>> to new features will not block this release.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>
>>
>

