mahout-user mailing list archives

From Tamas Jambor <jambo...@gmail.com>
Subject Re: out of memory
Date Fri, 18 Jun 2010 13:10:49 GMT
It's the standard Netflix dataset, around 100M ratings. It seems to be 
working OK with around 10GB of memory; if I remember correctly, it could 
be run using around 3-4GB.
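
For reference, a minimal sketch to confirm how much heap the JVM actually
received (assuming the -Xmx flag, e.g. -Xmx10g, is passed to the java
process that loads the data):

public class HeapCheck {
  public static void main(String[] args) {
    // maxMemory() reports the heap ceiling (-Xmx) this JVM will use.
    long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
    System.out.println("Max heap: " + maxMb + " MB");
  }
}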


On 18/06/2010 13:49, Sean Owen wrote:
> How big is your input?
> I am not sure this necessarily scales to tens of millions of data points, no.
> A distributed implementation is being created which could be more appropriate.
>
> On Fri, Jun 18, 2010 at 11:33 AM, Tamas Jambor <jamborta@gmail.com> wrote:
>
>> hi,
>>
>> I am trying to run an SVD recommender with the Netflix dataset. First I am
>> trying to read the data into a GenericDataModel object, but for some reason I
>> always run out of memory, even if I assign quite a lot of memory to the task
>> (e.g. 10GB). Is there a more efficient way to run this task?
>>
>> thanks,
>> Tamas
>>
>>
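
Regarding the GenericDataModel question above, a minimal sketch of the kind
of setup being described. Caveats: the SVDRecommender constructor has changed
between Mahout releases, and the Factorizer-based form shown here (with
ALSWRFactorizer) comes from later 0.x versions, so treat the exact signatures
as assumptions; ratings.csv, the feature count, lambda, and iteration count
are illustrative, not values from this thread. Note that FileDataModel, like
GenericDataModel, still holds all ratings in memory, so the heap has to
cover the full dataset either way.

import java.io.File;

import org.apache.mahout.cf.taste.impl.model.file.FileDataModel;
import org.apache.mahout.cf.taste.impl.recommender.svd.ALSWRFactorizer;
import org.apache.mahout.cf.taste.impl.recommender.svd.SVDRecommender;
import org.apache.mahout.cf.taste.model.DataModel;
import org.apache.mahout.cf.taste.recommender.RecommendedItem;

public class NetflixSVDSketch {
  public static void main(String[] args) throws Exception {
    // Assumed format: one "userID,itemID,rating" triple per line.
    DataModel model = new FileDataModel(new File("ratings.csv"));

    // 20 features, lambda 0.05, 10 iterations: illustrative values only.
    SVDRecommender recommender =
        new SVDRecommender(model, new ALSWRFactorizer(model, 20, 0.05, 10));

    // Top-5 recommendations for a sample user ID.
    for (RecommendedItem item : recommender.recommend(1L, 5)) {
      System.out.println(item);
    }
  }
}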
