Given two side matrices $P$ and $Q$, extract (find) the diagonal scaling matrix $\Sigma$ of a singular value decomposition

I have an application where I have already approximated a given matrix $R$ of size $(m, n)$ by the product of two matrices: $\hat{R} = P Q^T$, where $P$ has size $(m, k)$, $Q$ has size $(n, k)$, and hence $Q^T$ has size $(k, n)$. I now want to use these two matrices to find, as efficiently as possible, a proper singular value decomposition, which, as you know, has three factors.
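To make the shapes concrete, here is a minimal NumPy sketch of the setup (the sizes and random entries are placeholders; my real $P$ and $Q$ come from training):

```python
import numpy as np

# Placeholder sizes; in my application m and n are large and k is small.
m, n, k = 1000, 800, 20
rng = np.random.default_rng(0)
P = rng.standard_normal((m, k))  # left side matrix, size (m, k)
Q = rng.standard_normal((n, k))  # right side matrix, size (n, k)

Rhat = P @ Q.T                   # rank-k approximation of R, size (m, n)
```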



What gives me great hope is that Simon Funk said here (http://sifter.org/~simon/journal/20061211.html) that "The end result, it's worth noting, is exactly an SVD if the training set perfectly covers the matrix. Call it what you will when it doesn't. (If you're wondering where the diagonal scaling matrix is, it gets arbitrarily rolled in to the two side matrices, but could be trivially extracted if needed.)"



Can someone describe and detail the trivial extraction process he mentioned, which I can use to find that third matrix $\Sigma$ in the famous SVD equation $\hat{R} = U \Sigma V^T$?



Never mind FunkSVD; I am not currently using that algorithm, but I do have a well-estimated pair of matrices $P$ and $Q$ as my starting point. I already obtained $P$ and $Q$ by gradient descent (machine learning).



I am required NOT to run an SVD from scratch; instead I must do something very efficient to "trivially extract" the $\Sigma$ matrix when given the "two side matrices", which Mr. Funk said is possible.
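My current guess, which I would appreciate having confirmed or corrected, is that the extraction orthonormalizes $P$ and $Q$ with thin QR factorizations and then takes the SVD of the small $k \times k$ core, something like this sketch (continuing the setup above):

```python
# Thin QR: P = U_P @ R_P and Q = U_Q @ R_Q, where U_P and U_Q have
# orthonormal columns and R_P, R_Q are (k, k) upper triangular.
U_P, R_P = np.linalg.qr(P)
U_Q, R_Q = np.linalg.qr(Q)

# SVD of the small (k, k) core matrix; this costs O(k^3), not O(m*n*k).
U_c, sigma, Vt_c = np.linalg.svd(R_P @ R_Q.T)

# Reassemble: P @ Q.T = (U_P @ U_c) @ diag(sigma) @ (U_Q @ Vt_c.T).T
U = U_P @ U_c           # (m, k), orthonormal columns
V = U_Q @ Vt_c.T        # (n, k), orthonormal columns
Sigma = np.diag(sigma)  # the diagonal scaling matrix I am after

# Sanity check: the three factors should reproduce P @ Q.T.
assert np.allclose(U @ Sigma @ V.T, P @ Q.T)
```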



Thanks for any contributions!







edited Aug 2 at 21:12
asked Aug 2 at 19:30 by Geoffrey Anderson
























