On Tue, Jan 8, 2013 at 6:41 PM, Sean Owen <srowen@gmail.com> wrote:
> There's definitely a QR decomposition in there for me since solving A
> = X Y' for X is X = A Y (Y' Y)^-1 and you need some means to
> compute the inverse of that (small) matrix.
>
>
Sean,
I think I got it.
1) A Y is a handful of sparse matrix-vector products,
2) Y' Y is a dense matrix-matrix product of a "flat" matrix and a "tall"
matrix, producing a small square matrix,
3) inverting that matrix is not a big deal, since it is small.
Great!
Thanks!
It just was not immediately obvious to me at first look.
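For anyone else reading along, steps 1)-3) can be sketched in NumPy. This is only an illustration with dense random matrices (the real computation would use sparse products, and the dimensions here are made up); note that solving the small k x k system is preferable to forming the inverse explicitly:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 50, 40, 5            # users, items, latent features (arbitrary)

A = rng.random((m, n))         # ratings matrix (dense here for simplicity)
Y = rng.random((n, k))         # item-feature matrix ("tall")

# 1) A Y: in practice a handful of sparse matrix-vector products
AY = A @ Y                     # m x k

# 2) Y' Y: dense product yielding a small k x k Gram matrix
G = Y.T @ Y                    # k x k

# 3) "inverting" the small matrix: solve X (Y'Y) = A Y rather than
#    explicitly forming (Y'Y)^-1
X = np.linalg.solve(G.T, AY.T).T   # m x k, equals A Y (Y'Y)^-1
```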
Now, about the transition from ratings to 1s and 0s:
is this simply to handle implicit feedback,
or is it for some other reason?
> On Tue, Jan 8, 2013 at 5:27 PM, Ted Dunning <ted.dunning@gmail.com> wrote:
> > This particular part of the algorithm can be seen as similar to a least
> > squares problem that might normally be solved by QR. I don't think that
> > the updates are quite the same, however.
> >
> > On Tue, Jan 8, 2013 at 3:10 PM, Sebastian Schelter <ssc@apache.org>
> wrote:
> >
> >> This factorization is iteratively refined. In each iteration, ALS first
> >> fixes the item-feature vectors and solves a least-squares problem for
> >> each user and then fixes the user-feature vectors and solves a
> >> least-squares problem for each item.
> >>
>
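
The alternation Sebastian describes could be sketched like this; it assumes a dense ratings matrix and adds L2 regularization (lambda), neither of which the quoted text specifies, so treat it as a toy illustration rather than what Mahout actually does:

```python
import numpy as np

def als(A, k=8, n_iters=20, lam=0.1, seed=0):
    """Toy dense ALS for A ~ X Y' with L2 regularization (assumption)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    X = rng.random((m, k))
    Y = rng.random((n, k))
    reg = lam * np.eye(k)
    for _ in range(n_iters):
        # fix Y, solve the least-squares problems for the users:
        # X = A Y (Y'Y + lam I)^-1
        X = np.linalg.solve((Y.T @ Y + reg).T, (A @ Y).T).T
        # fix X, solve the least-squares problems for the items:
        # Y = A' X (X'X + lam I)^-1
        Y = np.linalg.solve((X.T @ X + reg).T, (A.T @ X).T).T
    return X, Y

A = np.random.default_rng(1).random((30, 20))
X, Y = als(A)
err = np.linalg.norm(A - X @ Y.T) / np.linalg.norm(A)
```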
