mahout-user mailing list archives

From Ted Dunning <ted.dunn...@gmail.com>
Subject Re: Theory behind LogisticRegression in Mahout
Date Thu, 22 May 2014 21:16:25 GMT

Correct link. 

Simulated annealing is still used, but this isn't a particularly good application of it. 
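
For anyone wanting to see what an annealing schedule amounts to in practice, here is a
minimal, self-contained sketch in plain Java. It is not Mahout's OnlineLogisticRegression
code and the constants are made up; the point is only that the SGD step size decays over
time according to a schedule like eta_t = eta0 / (1 + t)^decay.

// Minimal sketch (not Mahout's API): SGD for logistic regression with an
// annealed learning rate. All values below are illustrative only.
public class AnnealedSgdSketch {
  public static void main(String[] args) {
    double[] w = new double[3];           // weights, bias at index 0
    double eta0 = 0.5;                    // initial learning rate (assumed value)
    double decay = 0.75;                  // decay exponent (assumed value)

    // Toy data: x[i] = {1, feature1, feature2}, y[i] in {0, 1}
    double[][] x = {{1, 0.2, 1.1}, {1, 1.5, -0.3}, {1, -0.7, 0.8}, {1, 2.0, 0.1}};
    int[] y = {1, 0, 1, 0};

    for (int t = 0; t < 1000; t++) {
      int i = t % x.length;
      double eta = eta0 / Math.pow(1 + t, decay);   // annealed step size
      double z = 0;
      for (int j = 0; j < w.length; j++) z += w[j] * x[i][j];
      double p = 1.0 / (1.0 + Math.exp(-z));        // logistic prediction
      double g = p - y[i];                          // gradient of log-loss wrt z
      for (int j = 0; j < w.length; j++) w[j] -= eta * g * x[i][j];
    }
    System.out.println(java.util.Arrays.toString(w));
  }
}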

Sent from my iPhone

> On May 22, 2014, at 13:25, Dmitriy Lyubimov <dlieu.7@gmail.com> wrote:
> 
> I think it is actually a mix.
> 
> Yes, SGD, but there's also online validation of hyperparameters via
> recorded-step search. I hope my citation is correct; I'm starting to forget
> things. [1]
> 
> AFAIK the simulated annealing approach has been abandoned in favor of [1].
> 
> [1]
> http://www.researchgate.net/publication/1916718_Recorded_Step_Directional_Mutation_for_Faster_Convergence
> 
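
For intuition about the recorded-step idea in [1], here is a toy sketch (plain Java,
made-up objective and constants, not Mahout's actual implementation): each candidate
remembers the step that produced it, and when that step improved the score, the next
mutation is biased in the same direction, so the search speeds up along consistently
good directions. In AdaptiveLogisticRegression the score would be the held-out
performance of a learner trained with the candidate hyperparameters.

// Toy sketch of recorded-step directional mutation for hyperparameter search.
import java.util.Random;

public class RecordedStepSketch {
  public static void main(String[] args) {
    Random rnd = new Random(42);
    double[] params = {0.5, 0.75};               // e.g. learning rate and decay exponent
    double[] step = new double[params.length];   // last accepted step (the "record")
    double bestScore = score(params);

    for (int iter = 0; iter < 200; iter++) {
      double[] candidate = params.clone();
      double[] trialStep = new double[params.length];
      for (int j = 0; j < params.length; j++) {
        // mutation = fraction of the recorded step + fresh Gaussian noise
        trialStep[j] = 0.5 * step[j] + 0.1 * rnd.nextGaussian();
        candidate[j] += trialStep[j];
      }
      double s = score(candidate);
      if (s > bestScore) {                       // keep improvements, record the step
        params = candidate;
        step = trialStep;
        bestScore = s;
      }
    }
    System.out.printf("best params = [%.3f, %.3f], score = %.4f%n",
        params[0], params[1], bestScore);
  }

  // Stand-in objective; in practice this would be held-out AUC or log-likelihood.
  static double score(double[] p) {
    return -((p[0] - 0.1) * (p[0] - 0.1) + (p[1] - 0.9) * (p[1] - 0.9));
  }
}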
> 
>> On Wed, May 21, 2014 at 11:44 PM, Peng Zhang <pzhang.xjtu@gmail.com> wrote:
>> 
>> Namit,
>> 
>> I think the theory behind Mahout’s logistic regression is stochastic
>> gradient descent, rather than maximum likelihood.
>> 
>> Best Regards,
>> Peng Zhang
>> 
>> 
>> 
>> On May 22, 2014, at 2:29 PM, namit maheshwari <namitmaheshwari7@gmail.com>
>> wrote:
>> 
>>> Hello Everyone,
>>> 
>>> Could anyone please let me know the algorithm behind
>>> LogisticRegression in Mahout? Also, AdaptiveLogisticRegression mentions an
>>> *annealing* schedule.
>>> 
>>> I would be grateful if someone could guide me towards the theory behind
>>> it.
>>> 
>>> Thanks
>>> Namit
>> 
>> 
