spark-user mailing list archives

From: Sai Prasanna <ansaiprasa...@gmail.com>
Subject: Re: persist @ disk-only failing
Date: Mon, 19 May 2014 17:04:31 GMT
OK, thanks!


On Mon, May 19, 2014 at 10:09 PM, Matei Zaharia <matei.zaharia@gmail.com> wrote:

> This is the patch for it: https://github.com/apache/spark/pull/50/. It
> might be possible to backport it to 0.8.
>
> Matei
>
> On May 19, 2014, at 2:04 AM, Sai Prasanna <ansaiprasanna@gmail.com> wrote:
>
> Matei, I am using 0.8.1!
>
> But is there a way to bypass the cache without moving to 0.9.1?
>
>
> On Mon, May 19, 2014 at 1:31 PM, Matei Zaharia <matei.zaharia@gmail.com> wrote:
>
>> What version is this with? We used to build each whole partition in memory
>> before writing it out, but this was fixed a while back (in 0.9.1, though it
>> may also be in 0.9.0).
>>
>> Matei
>>
>> On May 19, 2014, at 12:41 AM, Sai Prasanna <ansaiprasanna@gmail.com>
>> wrote:
>>
>> > Hi all,
>> >
>> > When I set the persist level to DISK_ONLY, Spark still tries to use
>> > memory and caches.
>> > Any reason?
>> > Do I need to override some parameter elsewhere?
>> >
>> > Thanks!
>>
>>
>
>
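
For readers of the archive, a minimal Scala sketch of the usage under discussion; the master URL, application name, and input path below are placeholders, not details from the thread:

    import org.apache.spark.SparkContext
    import org.apache.spark.storage.StorageLevel

    // Placeholder master URL and application name.
    val sc = new SparkContext("local[2]", "DiskOnlyPersistExample")

    // Placeholder input path.
    val lines = sc.textFile("hdfs:///path/to/input")

    // DISK_ONLY requests that persisted partitions be kept on disk only,
    // never in memory. Note: on 0.8.x each partition was still built fully
    // in memory before being written out, which is the behavior the patch
    // linked above fixed.
    lines.persist(StorageLevel.DISK_ONLY)

    // The first action materializes and persists the partitions.
    println(lines.count())

    sc.stop()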
