mahout-user mailing list archives

From Sean Owen <>
Subject Re: Using SVD-conditioned matrix
Date Sun, 16 Sep 2012 09:34:21 GMT
This is the same discussion that's been going on here about "fold-in".

If the decomposition is A ~= Ak = Uk * Sk * Vk', then you can get an
expression for Uk * sqrt(Sk) by multiplying on the right by
right-inverses. You want to take off Vk', and "half" of Sk, meaning its
square root. So we're really working with Ak = (Uk * sqrt(Sk)) * (sqrt(Sk) * Vk')

A right-inverse of Vk' is Vk, since Vk's columns are orthonormal
(Vk' * Vk = I). The inverse of a diagonal matrix is just the diagonal
matrix of its reciprocals; call the inverse of sqrt(Sk) 1/sqrt(Sk).
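Both facts are easy to check numerically. A quick numpy sketch (my illustration, not from the original message):

```python
import numpy as np

# Vk (the first k right singular vectors, as columns) has orthonormal
# columns, so it is a right-inverse of Vk':
A = np.arange(24, dtype=float).reshape(6, 4) + np.eye(6, 4)
_, _, Vt = np.linalg.svd(A, full_matrices=False)
Vk = Vt[:3].T                            # keep k = 3 components
assert np.allclose(Vk.T @ Vk, np.eye(3))  # Vk' * Vk = I

# The inverse of a diagonal matrix is the diagonal of reciprocals:
d = np.array([4.0, 9.0, 16.0])
sqrt_S = np.diag(np.sqrt(d))
assert np.allclose(np.linalg.inv(sqrt_S), np.diag(1.0 / np.sqrt(d)))
```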

So Uk * sqrt(Sk) = Ak * Vk * 1/sqrt(Sk)

This is how you project a row of Ak into the reduced space; the same
multiplication by Vk * 1/sqrt(Sk) folds in a new row vector. Something
entirely similar goes for columns:

sqrt(Sk) * Vk' = 1/sqrt(Sk) * Uk' * Ak
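Putting the two projections together, here is a minimal numpy sketch of the derivation above (variable names are my own, not from Mahout):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 5))
k = 3

# Truncated SVD: A ~= Ak = Uk * Sk * Vk'
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Uk, sk, Vk = U[:, :k], s[:k], Vt[:k].T
Ak = Uk @ np.diag(sk) @ Vk.T

inv_sqrt_Sk = np.diag(1.0 / np.sqrt(sk))

# Row projection: Ak * Vk * 1/sqrt(Sk) = Uk * sqrt(Sk)
rows = Ak @ Vk @ inv_sqrt_Sk
assert np.allclose(rows, Uk @ np.diag(np.sqrt(sk)))

# Column projection: 1/sqrt(Sk) * Uk' * Ak = sqrt(Sk) * Vk'
cols = inv_sqrt_Sk @ Uk.T @ Ak
assert np.allclose(cols, np.diag(np.sqrt(sk)) @ Vk.T)

# Fold-in: a new row vector from the original space projects the same way
a_new = rng.standard_normal(5)
a_proj = a_new @ Vk @ inv_sqrt_Sk
```

The last line is the point of the thread: once Vk and Sk are fixed from a (possibly sampled) decomposition, new data is conditioned with a single matrix multiply, without redecomposing the whole dataset.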


On Sun, Sep 16, 2012 at 4:33 AM, Lance Norskog <> wrote:
> If you condition a vector set with the zero-the-small-singular-values
> trick, how do you project a vector from original space to the
> conditioned space? This would let you "condition" new data from a
> homogeneous dataset.
> It would be useful in the Mahout context. For example, with SSVD you
> can use the technique to get better vector clustering. You can create
> the "conditioning projection" from a sampled dataset, instead of
> decomposing and recomposing the whole dataset.
> Also asked on stack overflow:
> --
> Lance Norskog
