This is the same discussion that's been going on here about "fold-in".
If the decomposition is A ~= Ak = Uk * Sk * Vk', then you can get an
expression for just Uk by multiplying on the right by right-inverses.
You want to take off V', and "half" of S, meaning its square root. So
we're really working with Ak ~= (Uk * sqrt(Sk)) * (sqrt(Sk) * Vk')
The right-inverse of Vk' is Vk, since its rows are orthonormal. The
inverse of a diagonal matrix is just the diagonal matrix of its
reciprocals. Call the inverse of sqrt(Sk) 1/sqrt(Sk).
So Uk * sqrt(Sk) = Ak * Vk * 1/sqrt(Sk)
This is how you project a row of Ak. The same reasoning gives the
projection for columns:
sqrt(Sk) * Vk' = 1/sqrt(Sk) * Uk' * Ak
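A quick NumPy sketch (not from the thread; the variable names and the small random matrix are mine) checking both identities on a truncated SVD:

```python
# Verify: Uk * sqrt(Sk) == Ak * Vk * 1/sqrt(Sk)  (row projection)
#         sqrt(Sk) * Vk' == 1/sqrt(Sk) * Uk' * Ak  (column projection)
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Uk, sk, Vkt = U[:, :k], s[:k], Vt[:k, :]   # rank-k truncation
Ak = Uk @ np.diag(sk) @ Vkt                # the rank-k approximation

sqrt_Sk = np.diag(np.sqrt(sk))
inv_sqrt_Sk = np.diag(1.0 / np.sqrt(sk))

# Row projection: multiply Ak on the right by Vk and 1/sqrt(Sk)
print(np.allclose(Uk @ sqrt_Sk, Ak @ Vkt.T @ inv_sqrt_Sk))   # True

# Column projection: multiply Ak on the left by 1/sqrt(Sk) and Uk'
print(np.allclose(sqrt_Sk @ Vkt, inv_sqrt_Sk @ Uk.T @ Ak))   # True
```

The same right-multiplication by Vk * 1/sqrt(Sk) is what projects a *new* row (one not in Ak) into the reduced space.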
Sean
On Sun, Sep 16, 2012 at 4:33 AM, Lance Norskog <goksron@gmail.com> wrote:
> If you condition a vector set with the zero-the-small-singular-values
> trick, how do you project a vector from original space to the
> conditioned space? This would let you "condition" new data from a
> homogeneous dataset.
>
> It would be useful in the Mahout context. For example, with SSVD you
> can use the technique to get better vector clustering. You can create
> the "conditioning projection" from a sampled dataset, instead of
> decomposing and recomposing the whole dataset.
>
> Also asked on stack overflow:
> http://stackoverflow.com/questions/12444231/svdmatrixconditioninghowtoprojectfromoriginalspacetoconditionedspac
>
> --
> Lance Norskog
