mahout-user mailing list archives

From: Koobas <koo...@gmail.com>
Subject: Re: alternating least squares
Date: Wed, 09 Jan 2013 00:25:46 GMT
On Tue, Jan 8, 2013 at 7:18 PM, Ted Dunning <ted.dunning@gmail.com> wrote:

> But is it actually QR of Y?
>
>
Ted,
This is my understanding:
In the process of solving the least squares problem,
you end up inverting a small square matrix to get (Y' * Y)^-1.
How that is computed is an implementation detail.
Since the matrix is square, one could do LU factorization, a.k.a. Gaussian
elimination.
However, since we are talking here about solving a 100x100 problem,
one might as well do it with QR factorization which, unlike LU, is stable
"no matter what".



> On Tue, Jan 8, 2013 at 3:41 PM, Sean Owen <srowen@gmail.com> wrote:
>
> > There's definitely a QR decomposition in there for me since solving A
> > = X Y' for X is X = A Y (Y' * Y)^-1 and you need some means to
> > compute the inverse of that (small) matrix.
> >
> > On Tue, Jan 8, 2013 at 5:27 PM, Ted Dunning <ted.dunning@gmail.com> wrote:
> > > This particular part of the algorithm can be seen as similar to a least
> > > squares problem that might normally be solved by QR.  I don't think that
> > > the updates are quite the same, however.
> > >
> > > On Tue, Jan 8, 2013 at 3:10 PM, Sebastian Schelter <ssc@apache.org> wrote:
> > >
> > >> This factorization is iteratively refined. In each iteration, ALS first
> > >> fixes the item-feature vectors and solves a least-squares problem for
> > >> each user and then fixes the user-feature vectors and solves a
> > >> least-squares problem for each item.
> > >>
> >
>
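
To tie Sean's formula and Sebastian's description together, here is a minimal
in-memory sketch of the alternation using the same Mahout math classes. It
treats the ratings matrix as dense and leaves out the regularization term and
the handling of missing entries that a real implementation needs, so read it
as an illustration of the loop structure only; again, all names besides the
org.apache.mahout.math classes are made up for the example.

  import org.apache.mahout.math.DenseMatrix;
  import org.apache.mahout.math.Matrix;
  import org.apache.mahout.math.QRDecomposition;

  public class AlsSketch {

    /** Solves (F' * F) * x = F' * b via QR on the small k x k system. */
    static Matrix solve(Matrix f, Matrix b) {
      Matrix ftf = f.transpose().times(f);
      Matrix ftb = f.transpose().times(b);
      return new QRDecomposition(ftf).solve(ftb);
    }

    /** Copies row `row` of `a` into a (columnSize x 1) column matrix. */
    static Matrix rowAsColumn(Matrix a, int row) {
      Matrix col = new DenseMatrix(a.columnSize(), 1);
      for (int j = 0; j < a.columnSize(); j++) {
        col.set(j, 0, a.get(row, j));
      }
      return col;
    }

    /** Copies column `column` of `a` into a (rowSize x 1) column matrix. */
    static Matrix columnAsColumn(Matrix a, int column) {
      Matrix col = new DenseMatrix(a.rowSize(), 1);
      for (int i = 0; i < a.rowSize(); i++) {
        col.set(i, 0, a.get(i, column));
      }
      return col;
    }

    /** Writes the (k x 1) solution `v` into row `row` of `target`. */
    static void setRow(Matrix target, int row, Matrix v) {
      for (int j = 0; j < target.columnSize(); j++) {
        target.set(row, j, v.get(j, 0));
      }
    }

    /**
     * Iteratively refines A ~ X * Y', where A is (numUsers x numItems),
     * X is (numUsers x k) and Y is (numItems x k): fix Y and solve a
     * least-squares problem for each user's row of X, then fix X and
     * solve one for each item's row of Y.
     */
    static void als(Matrix a, Matrix x, Matrix y, int numIterations) {
      for (int it = 0; it < numIterations; it++) {
        for (int u = 0; u < a.rowSize(); u++) {
          setRow(x, u, solve(y, rowAsColumn(a, u)));
        }
        for (int i = 0; i < a.columnSize(); i++) {
          setRow(y, i, solve(x, columnAsColumn(a, i)));
        }
      }
    }
  }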
