Conditions for Matrix to be Product of Near-Identity Matrices
For $\epsilon > 0$, let $M_\epsilon$ be the family of $n \times n$ real matrices $A$ such that $\|A - I_n\| < \epsilon$, where $\|\cdot\|$ is the standard operator norm. If $\epsilon$ is chosen sufficiently small, then all finite products of members of $M_\epsilon$ have positive determinant (i.e., they are orientation-preserving). Is this the only requirement for an $n \times n$ matrix to be expressible as such a product? If so, that would imply the result that any non-singular $n \times n$ matrix can be expressed as a product of $n \times n$ matrices that each change only one coordinate (as this is clearly the case for any matrix in $M_\epsilon$), which is what I'm trying to prove.
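To make the determinant claim concrete, here is a minimal numerical sketch in MATLAB (the choice $\epsilon = 0.1$, the number of factors, and the random perturbation scheme are illustrative assumptions, not part of the question): each factor $I + E$ with $\|E\| < 1$ has positive determinant, so any finite product does too, even when the product itself is far from the identity.
n = 3; epsilon = 0.1;
P = eye(n);
for k = 1:50
    E = rand(n) - 0.5;                    % random direction
    E = epsilon * rand() * E / norm(E);   % rescale so that ||E|| < epsilon
    P = P * (eye(n) + E);                 % each factor lies in M_epsilon
end
det(P)   % always positive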
matrices linear-transformations matrix-decomposition orientation
asked Aug 5 at 23:19 – Davey
1 Answer
Let $A \in \mathbb{R}^{n \times n}$.
Now if $\|A - I_n\| < \epsilon$, then I can express $A$ as a product of two nearly orthogonal matrices. An orthogonal matrix satisfies $QQ^T = Q^TQ = I_n$, and each of its columns is a unit vector. So if we build an orthogonal matrix and alter it slightly, we can manipulate the bound $\epsilon$, like the following:
n = 3;
A = rand(n,n);           % random test matrix
[Q,R] = qr(A);           % QR decomposition: Q is orthogonal
I = eye(n);
err = norm(Q*Q' - I);    % zero up to machine precision
Now this is zero, up to machine precision. For instance:
epsilon = 5;
Q1 = epsilon*Q(:,3);          % scale the third column by epsilon
Q1 = [Q(:,1), Q(:,2), Q1];    % reassemble with the scaled column
err1 = norm(Q1*Q1' - I)
err1 =
24.0000
From the matrix norm bounds this is slightly less than $25$, as I expected. (In fact $Q_1Q_1^T = I + (\epsilon^2 - 1)\,q_3q_3^T$, so the error is exactly $\epsilon^2 - 1 = 24$.) This comes from the operator norm properties
$$ \|AB\| \leq \|A\|\,\|B\| $$
and
$$ \|cA\| = |c|\,\|A\|. $$
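As a quick sanity check of these two norm properties (a minimal sketch with random test matrices of my own choosing, unrelated to the construction above):
% Numerically verify submultiplicativity and homogeneity of the operator norm
A = rand(3); B = rand(3); c = -2.5;
norm(A*B) <= norm(A)*norm(B)              % returns 1 (true)
abs(norm(c*A) - abs(c)*norm(A)) < 1e-12   % returns 1 (true)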
To illustrate how the bound shrinks, change epsilon to $1$:
n = 3;
A = rand(n,n);
[Q,R] = qr(A);
I = eye(n);
err = norm(Q*Q' - I);
epsilon = 1;
Q1 = epsilon*Q(:,3);          % scaling by 1 changes nothing
Q1 = [Q(:,1), Q(:,2), Q1];
err1 = norm(Q1*Q1' - I)
err1 =
5.1650e-16
Q1 = epsilon*Q(3,3);    % scale a single entry instead of a column
Q2 = Q;
Q2(3,3) = Q1;
err1 = norm(Q2*Q2' - I)
Note that since scaling by $1$ doesn't modify anything, the error will be close to machine precision, which plays the role of our $\epsilon$.
In retrospect that was rather roundabout. Instead, we can create a matrix from the outer product of two vectors and subtract it from $I_n$:
$$ A = I_n - vu^T. $$
Let $(vu^T)_{ij} = 0$ for all $(i,j) \neq (1,1)$ and $(vu^T)_{11} = \epsilon$. So we create a matrix that is zero except for an $\epsilon$ in the $(1,1)$ entry and subtract it from the identity. Then
$$ \|A - I_n\| = \|I_n - vu^T - I_n\| = \|vu^T\| = \epsilon. $$
We can demonstrate this as follows:
n = 3;
I = eye(n);
Z = zeros(n);
epsilon = 1e-3;
Z(1,1) = epsilon;       % rank-one perturbation with norm epsilon
A = I - Z;
error = norm(A - I)
error =
0.0010
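For completeness, here is the same construction written literally as the outer product $vu^T$ from the formula above (a sketch; taking $v = \epsilon e_1$ and $u = e_1$ is one assumed way to realize the stated entries):
n = 3;
v = zeros(n,1); v(1) = 1e-3;   % epsilon goes in the first entry of v
u = zeros(n,1); u(1) = 1;      % u is the first standard basis vector
A = eye(n) - v*u';             % A = I_n - v*u^T
norm(A - eye(n))               % equals epsilon = 1.0000e-03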
So you simply pick an $\epsilon$ and make it as small as you like.
answered Aug 6 at 1:02, edited Aug 6 at 4:08 – RHowe
I'm afraid I don't see how I could use this to answer my question. If an orientation-preserving matrix $A$ is really far away from the identity, how does its QR decomposition help me get it as a product of matrices close to the identity? – Davey, Aug 6 at 18:01
I don't understand your concern. The QR decomposition is Gram–Schmidt; I produced a matrix $Q$ which is orthogonal, and I've demonstrated that with both code and math. Also, there are two parts: one where I create a diagonal perturbation like you describe. If you put an $\epsilon$ in the first entry of a zero matrix and subtract that from $I_n$, you get $\epsilon$ for the norm of the difference. – RHowe, Aug 7 at 0:01
Your constraint is to make something called $\epsilon$; typically people take it very small, but it can be any number. Furthermore, the determinant of an orthogonal matrix is $\pm 1$. – RHowe, Aug 7 at 0:15