Conditions for Matrix to be Product of Near-Identity Matrices

For $\epsilon > 0$, let $M_\epsilon$ be the family of $n \times n$ real matrices $A$ such that $\|A - I_n\| < \epsilon$, where $\|\cdot\|$ is the standard operator norm. If $\epsilon$ is chosen sufficiently small, then every finite product of members of $M_\epsilon$ has positive determinant (i.e., is orientation-preserving). Is this the only requirement for an $n \times n$ matrix to be expressible as such a product? If so, that would imply that any non-singular $n \times n$ matrix can be expressed as a product of $n \times n$ matrices that each change only one coordinate (as this is clearly the case for any matrix in $M_\epsilon$), which is what I'm trying to prove.
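For concreteness, here is a minimal MATLAB sketch of the easy direction, under an assumption that goes beyond the question: that $A$ admits a real logarithm, which the construction below guarantees. If $A = e^X$ for a real matrix $X$, then $A = (e^{X/k})^k$, and each factor $e^{X/k}$ tends to $I_n$ as $k$ grows, so $A$ is a finite product of members of $M_\epsilon$ for any $\epsilon > 0$:

    % Factor A into k identical near-identity factors via expm/logm.
    % det(A) = exp(trace(X)) > 0, so A is orientation-preserving.
    n = 3; k = 100;
    X = randn(n);
    A = expm(X);            % A has the real logarithm X by construction
    F = expm(X/k);          % one near-identity factor
    norm(F - eye(n))        % O(1/k), so F is in M_epsilon for large k
    norm(F^k - A)           % the k-fold product recovers A (up to roundoff)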







asked Aug 5 at 23:19 by Davey




















1 Answer

















Let $A \in \mathbb{R}^{n \times n}$.



Now if $\|A - I_n\| < \epsilon$, then $A$ can be expressed as a product of two nearly orthogonal matrices.
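One hedged way to make this claim concrete (assuming the intended factorization is the polar decomposition $A = QP$, computed here via the SVD): when $A$ is near $I_n$, both polar factors are near $I_n$ as well.

    % Polar decomposition A = Q*P: Q orthogonal, P symmetric positive definite.
    n = 3; eps0 = 1e-2;
    A = eye(n) + eps0*randn(n);           % ||A - I_n|| is small
    [U, S, V] = svd(A);
    Q = U*V';                             % orthogonal polar factor
    P = V*S*V';                           % symmetric positive definite factor
    norm(Q*P - A)                         % ~ machine precision
    norm(Q - eye(n)), norm(P - eye(n))    % both factors are near the identity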



An orthogonal matrix $Q$ satisfies $QQ^T = Q^TQ = I_n$, and each column of $Q$ is a unit vector. So if we build an orthogonal matrix and alter it slightly, we can manipulate the bound on $\epsilon$, like the following:



    n = 3;
    A = rand(n, n);            % random matrix
    [Q, R] = qr(A);            % QR factorization: Q is orthogonal
    I = eye(n);
    err = norm(Q*Q' - I);      % ~ machine precision, since Q is orthogonal


This is zero up to machine precision. Now, for instance, scale one column:



    epsilon = 5;

    Q1 = epsilon*Q(:,3);            % scale the third column by epsilon
    Q1 = [Q(:,1), Q(:,2), Q1];      % reassemble with the scaled column
    err1 = norm(Q1*Q1' - I)

    err1 =

       24.0000


From the matrix norm properties, this is slightly less than $25 = \epsilon^2$, as expected. This follows from the submultiplicative inequality



$$\|AB\| \leq \|A\|\,\|B\|$$



and the homogeneity property
$$\|cA\| = |c|\,\|A\|.$$
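As a quick numerical sanity check of both properties (a minimal sketch with arbitrary random inputs):

    A = randn(3); B = randn(3); c = -2.5;
    norm(A*B) <= norm(A)*norm(B)               % submultiplicativity: prints 1 (true)
    abs(norm(c*A) - abs(c)*norm(A)) < 1e-12    % homogeneity holds with equality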



To see the bound tighten, change epsilon to $1$:



    n = 3;
    A = rand(n, n);
    [Q, R] = qr(A);
    I = eye(n);
    err = norm(Q*Q' - I);

    epsilon = 1;

    Q1 = epsilon*Q(:,3);            % scaling by 1 leaves the column unchanged
    Q1 = [Q(:,1), Q(:,2), Q1];
    err1 = norm(Q1*Q1' - I)

    err1 =

       5.1650e-16


    Q1 = epsilon*Q(3,3);            % scale a single entry instead of a column
    Q2 = Q;
    Q2(3,3) = Q1;
    err1 = norm(Q2*Q2' - I)


Note that since scaling by $1$ doesn't modify anything, the result will be close to machine precision, consistent with our choice of $\epsilon$.



In retrospect, that was roundabout. A cleaner construction: create a rank-one matrix from the outer product of two vectors and subtract it from $I_n$,



$$A = I_n - vu^T,$$



where $(vu^T)_{11} = \epsilon$ and $(vu^T)_{ij} = 0$ for every other entry.



So we build a zero matrix, place $\epsilon$ in one entry, and subtract the result from the identity. Then



$$\|A - I_n\| = \epsilon,$$



$$\|I_n - vu^T - I_n\| = \|vu^T\| = \epsilon.$$



We can demonstrate this as follows:



    n = 3;
    I = eye(n);
    Z = zeros(n);
    epsilon = 1e-3;
    Z(1,1) = epsilon;       % the rank-one perturbation vu^T
    A = I - Z;

    error = norm(A - I)

    error =

        0.0010


So one simply constructs such an $\epsilon$-perturbation and makes it as small as desired.
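As a hedged sketch connecting this back to the question's goal (my construction, not stated in the answer above): a matrix that changes only one coordinate, $E = I_n + \delta\, e_i e_j^T$ with $i \neq j$, satisfies $\|E - I_n\| = |\delta|$, so it lies in $M_\epsilon$ whenever $|\delta| < \epsilon$:

    % An elementary matrix changes only coordinate i of its input.
    n = 3; delta = 1e-3; i = 2; j = 3;
    E = eye(n);
    E(i, j) = E(i, j) + delta;    % (E*x)(i) = x(i) + delta*x(j); others unchanged
    norm(E - eye(n))              % equals |delta| in the operator norm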






answered Aug 6 at 1:02 by RHowe, edited Aug 6 at 4:08
• I'm afraid I don't see how I could use this to answer my question. If an orientation-preserving matrix $A$ is really far from the identity, how does its QR decomposition help me express it as a product of matrices close to the identity?
  – Davey
  Aug 6 at 18:01











• I don't understand your concern. The QR decomposition is Gram–Schmidt; I produced an orthogonal matrix $Q$ and demonstrated that with both code and math. There are also two parts: in one of them I create a diagonal perturbation like you describe. If you place $\epsilon$ in the first entry of a zero matrix and subtract that from $I_n$, the difference has norm $\epsilon$.
  – RHowe
  Aug 7 at 0:01










• Your constraint involves a quantity called $\epsilon$; typically it is taken very small, but it can be any number. Furthermore, the determinant of an orthogonal matrix is $\pm 1$.
  – RHowe
  Aug 7 at 0:15










