spark-user mailing list archives

From Patrick Wendell <pwend...@gmail.com>
Subject Re: State of spark on scala 2.10
Date Sun, 05 Jan 2014 05:03:45 GMT
I meant you'll need to build your own version of Spark. Typically we
do this by launching an existing AMI, building a new version of Spark
on it, and copying it to the slaves.

- Patrick

On Sat, Jan 4, 2014 at 8:44 PM, Patrick Wendell <pwendell@gmail.com> wrote:
> You'll have to build your own. Also, just to give you a heads-up, there
> are some packaging differences in master (some bin/ scripts moved to
> sbin/).
>
> On Sat, Jan 4, 2014 at 8:14 PM, Aureliano Buendia <buendia360@gmail.com> wrote:
>> Good to know the next release will be on Scala 2.10.
>>
>> Meanwhile, does using the master branch mean that I will have to build my
>> own AMI when launching an EC2 cluster? Also, is there a Maven repo with
>> nightly binary builds available for Spark?
>>
>>
>> On Sun, Jan 5, 2014 at 3:56 AM, Aaron Davidson <ilikerps@gmail.com> wrote:
>>>
>>> Scala 2.10.3 support was recently merged into master (#259). The master
>>> branch is probably not as stable as 0.8.1, but things "should" work.
>>>
>>> The 2.10 branch should be deleted; the only issue is that there are some
>>> outstanding PRs against it that haven't been moved to master.
>>>
>>>
>>> On Sat, Jan 4, 2014 at 7:11 PM, Aureliano Buendia <buendia360@gmail.com>
>>> wrote:
>>>>
>>>> Hi,
>>>>
>>>> I was going to give https://github.com/scala/pickling a try on Spark to
>>>> see how it compares with Kryo. Unfortunately, it only works with Scala
>>>> 2.10.3.
>>>>
>>>> - Is there a timeline for Spark to work with Scala 2.10?
>>>>
>>>> - Is the 2.10 branch as stable as 2.9?
>>>>
>>>> - What's blocking Spark from working with 2.10?
>>>
>>>
>>
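
For reference, here is a minimal sketch of the Kryo setup that a scala/pickling
comparison would be measured against, using the spark.serializer and
spark.kryo.registrator properties and the KryoRegistrator hook documented for
Spark 0.8.x. The Point case class and the MyRegistrator/KryoExample names are
made up for illustration; on the master branch of the time the same settings
were moving to the new SparkConf API, so the exact wiring may differ.

    import com.esotericsoftware.kryo.Kryo
    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._
    import org.apache.spark.serializer.KryoRegistrator
    import org.apache.spark.storage.StorageLevel

    // Hypothetical record type used only for illustration.
    case class Point(x: Double, y: Double)

    // Registering classes up front lets Kryo write compact IDs
    // instead of full class names for every record.
    class MyRegistrator extends KryoRegistrator {
      override def registerClasses(kryo: Kryo) {
        kryo.register(classOf[Point])
      }
    }

    object KryoExample {
      def main(args: Array[String]) {
        // Spark 0.8.x-style configuration via system properties,
        // set before the SparkContext is created.
        System.setProperty("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
        System.setProperty("spark.kryo.registrator", "MyRegistrator")

        val sc = new SparkContext("local", "kryo-example")

        // A serialized storage level forces the records through the
        // configured serializer, so the choice of Kryo actually matters here.
        val points = sc.parallelize(1 to 1000)
          .map(i => Point(i, i * 2))
          .persist(StorageLevel.MEMORY_ONLY_SER)

        println(points.map(_.x).sum())
        sc.stop()
      }
    }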
