Although I'm not sure how valuable Cygwin support is, at the very least the release notes should mention that Cygwin is intentionally unsupported as of 1.4.0.

From the changeset description, though, it looks like removing the support was not intended by the author.
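
For anyone who still needs to launch Spark from Cygwin in the meantime, the kind of classpath translation Sean mentions would be roughly along these lines (an untested sketch; the LAUNCH_CLASSPATH variable name is only illustrative and may not match what 1.4.0's bin/spark-class actually uses):

    # Untested sketch: detect Cygwin and convert the classpath to Windows
    # form before handing it to java, e.g. near the end of bin/spark-class.
    case "$(uname)" in
      CYGWIN*) RUNNING_ON_CYGWIN=true ;;
      *)       RUNNING_ON_CYGWIN=false ;;
    esac

    if [ "$RUNNING_ON_CYGWIN" = true ]; then
      # cygpath -wp rewrites a Unix-style path list into a Windows-style one
      LAUNCH_CLASSPATH="$(cygpath -wp "$LAUNCH_CLASSPATH")"
      SPARK_HOME="$(cygpath -w "$SPARK_HOME")"
    fi

I haven't verified this against 1.4.0 myself, so it may well need adjusting, but it gives an idea of how small the change would be.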

Thanks
Proust




From:        Sachin Naik <sachin.u.naik@gmail.com>
To:        Sean Owen <sowen@cloudera.com>
Cc:        Steve Loughran <stevel@hortonworks.com>, Proust GZ Feng/China/IBM@IBMCN, "user@spark.apache.org" <user@spark.apache.org>
Date:        07/29/2015 05:05 AM
Subject:        Re: NO Cygwin Support in bin/spark-class in Spark 1.4.0




I agree with Sean - using VirtualBox on Windows with a Linux VM is a lot easier than trying to work around the Cygwin oddities. A lot of functionality might not work in Cygwin, and you will end up having to back-patch things. Unless there is a compelling reason, Cygwin support does not seem necessary.


@sachinnaik from iPhone


On Jul 28, 2015, at 1:25 PM, Sean Owen <sowen@cloudera.com> wrote:

> That's for the Windows interpreter rather than bash-running Cygwin. I
> don't know that it's worth doing a lot of legwork for Cygwin, but if it's
> really just a few lines of classpath translation in one script, it seems
> reasonable.
>
> On Tue, Jul 28, 2015 at 9:13 PM, Steve Loughran <stevel@hortonworks.com> wrote:
>>
>> There's a spark-submit.cmd file for Windows. Does that work?
>>
>> On 27 Jul 2015, at 21:19, Proust GZ Feng <pfeng@cn.ibm.com> wrote:
>>
>> Hi, Spark Users
>>
>> Looks like Spark 1.4.0 cannot work with Cygwin due to the removal of Cygwin
>> support in bin/spark-class.
>>
>> The changeset is
>>
>> https://github.com/apache/spark/commit/517975d89d40a77c7186f488547eed11f79c1e97#diff-fdf4d3e600042c63ffa17b692c4372a3
>>
>> The changeset said "Add a library for launching Spark jobs
>> programmatically", but how can it be used from Cygwin?
>> I'm wondering whether any solutions are available to make it work on Windows.
>>
>>
>> Thanks
>> Proust
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org