Given two side matrices $P$ and $Q$, extract (find) the diagonal scaling matrix $\Sigma$ of a singular value decomposition
I have an application in which I have already approximated a given matrix $R$ of size $(m,n)$ as the product of two matrices: $\hat{R} = PQ^T$, where $P$ has size $(m,k)$, $Q$ has size $(n,k)$, and therefore $Q^T$ has size $(k,n)$. I would now like to use these two matrices to find, as efficiently as possible, a proper singular value decomposition, which as you know has three factors.
What gives me great hope is that Simon Funk said here (http://sifter.org/~simon/journal/20061211.html) that "The end result, it's worth noting, is exactly an SVD if the training set perfectly covers the matrix. Call it what you will when it doesn't. (If you're wondering where the diagonal scaling matrix is, it gets arbitrarily rolled in to the two side matrices, but could be trivially extracted if needed.)"
Can someone describe in detail the trivial extraction process he mentions, which I could use to find that third matrix $\Sigma$ in the familiar SVD equation $\hat{R} = U\Sigma V^T$?
Never mind FunkSVD itself, as I am not currently using that algorithm, but I do have a pretty well-estimated pair of matrices $P$ and $Q$ as my starting point; I obtained them with gradient descent (a machine-learning fit).
I am required NOT to run an SVD from scratch; instead I must do something very efficient to "trivially extract" the $\Sigma$ matrix from the "two side matrices", which Mr. Funk says is possible.
Thanks for any contributions!
matrix-decomposition
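For reference, the standard way to perform this extraction (this is the usual "small SVD of the core" trick, not spelled out in Funk's post) is to take thin QR factorizations $P = U_P R_P$ and $Q = U_Q R_Q$, then SVD the small $k \times k$ core $R_P R_Q^T = U_M \Sigma V_M^T$, giving $\hat{R} = (U_P U_M)\,\Sigma\,(U_Q V_M)^T$. The cost is $O((m+n)k^2 + k^3)$, far cheaper than a full SVD of $\hat{R}$ when $k \ll m, n$. A minimal sketch in NumPy, with the matrix sizes chosen arbitrarily for illustration:

```python
import numpy as np

# Illustrative sizes: P is (m, k), Q is (n, k), so Rhat = P @ Q.T is (m, n)
rng = np.random.default_rng(0)
m, n, k = 6, 5, 3
P = rng.standard_normal((m, k))
Q = rng.standard_normal((n, k))

# Thin QR factorizations: P = Up @ Rp, Q = Uq @ Rq,
# where Up and Uq have orthonormal columns and Rp, Rq are (k, k).
Up, Rp = np.linalg.qr(P)
Uq, Rq = np.linalg.qr(Q)

# Rhat = Up @ (Rp @ Rq.T) @ Uq.T, so an SVD of the small (k, k)
# core Rp @ Rq.T yields the singular values of Rhat directly.
Um, s, Vmt = np.linalg.svd(Rp @ Rq.T)

U = Up @ Um          # (m, k), orthonormal columns
Vt = Vmt @ Uq.T      # (k, n), orthonormal rows
Sigma = np.diag(s)   # (k, k) diagonal scaling matrix

# Sanity check: the three factors reproduce the original product
assert np.allclose(U @ Sigma @ Vt, P @ Q.T)
```

Note this gives an exact SVD of $\hat{R} = PQ^T$ (not of the original $R$), which matches the question's requirement of working only from the two side matrices.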
edited Aug 2 at 21:12
asked Aug 2 at 19:30
Geoffrey Anderson
1011