Are orthonormal matrices rotations?
If we take an orthonormal matrix in $\mathbb{R}^{2\times 2}$, we know it has to be of the form
$$A = \begin{pmatrix} a & b \\ -b & a \end{pmatrix}$$
such that $$a^2+b^2=1$$
(or the columns could be multiplied by $-1$, but this would make no difference for the following). Since it has these restrictions, we can define $\vartheta$ such that $a=\cos\vartheta$, $b=\sin\vartheta$, and we see that $A$ is a rotation matrix. If the multiplication by $-1$ is applied, the only difference is that the direction of the rotation changes, but it is still a rotation.
I was wondering if this still holds in higher dimensions, i.e. if we have an orthonormal matrix in $\mathbb{R}^{n\times n}$, whether it can be written as
$$A = \sum_{i=1}^{d\leq n} R_i,$$
where the $R_i$ are rotations about some axis. I am not necessarily interested in the decomposition itself, only in whether something is known about this and where I could read up on it. Intuitively I would say that such a decomposition does not exist, or if it exists it will not be of the form I suggested above, but I remain unsure.
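The two-dimensional case described above can be checked numerically. A minimal sketch (using NumPy, and the standard convention that a rotation by $\vartheta$ has columns $(\cos\vartheta, \sin\vartheta)$ and $(-\sin\vartheta, \cos\vartheta)$): draw a random orthogonal $2\times 2$ matrix with determinant $+1$, recover the angle from its first column, and confirm it is exactly a rotation matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
# Random 2x2 orthogonal matrix: the Q factor of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.standard_normal((2, 2)))
# Flip a column if needed so that det(Q) = +1 (a proper rotation).
if np.linalg.det(Q) < 0:
    Q[:, 1] *= -1

# Recover the angle from the first column (a, -b per the form above,
# written here in the standard cos/sin convention).
theta = np.arctan2(Q[1, 0], Q[0, 0])
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(Q, R)  # Q really is a rotation by theta
```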
linear-algebra rotations orthonormal
Do you mean $a^2+b^2=1$?
– Arthur
2 days ago
Yes, I will rectify this, thanks :)
– User123456789
2 days ago
With the sum of $R_i$, do you actually mean the composition of the $R_i$, i.e. the product?
– Jonas
2 days ago
See en.wikipedia.org/wiki/Plane_of_rotation#Higher_dimensions
– joriki
2 days ago
In $\mathbb{R}^4$ you can choose two orthogonal two-dimensional subspaces and perform an ordinary rotation in each. I don't know whether such an orthogonal transformation satisfies your wish for a "construction".
– Ethan Bolker
2 days ago
edited 2 days ago
asked 2 days ago
User123456789
2 Answers
answered 2 days ago by xbh (accepted)
In the $n$-dimensional case, it can be shown that every such orthogonal matrix $\boldsymbol A$ is similar to a block-diagonal matrix
$$
\begin{bmatrix}
\boldsymbol R_1 & & & & & \\
& \boldsymbol R_2 & & & & \\
& & \boldsymbol R_3 & & & \\
& & & \ddots & & \\
& & & & \boldsymbol R_k & \\
& & & & & \boldsymbol I_{n-2k}
\end{bmatrix}
$$
when $\det(\boldsymbol A) = 1$, or
$$
\begin{bmatrix}
\boldsymbol R_1 & & & & & \\
& \boldsymbol R_2 & & & & \\
& & \ddots & & & \\
& & & \boldsymbol R_k & & \\
& & & & \boldsymbol I_{n-2k-1} & \\
& & & & & -1
\end{bmatrix}
$$
when $\det(\boldsymbol A) = -1$. Here
$$
\boldsymbol R_j =
\begin{bmatrix}
\cos(\varphi_j) & -\sin(\varphi_j) \\ \sin(\varphi_j) & \cos(\varphi_j)
\end{bmatrix} \quad [j = 1, \ldots, k],
$$
and $\boldsymbol I_m$ is the $m \times m$ identity matrix.
Hence such a decomposition exists.
Reference: *Linear Algebra Done Wrong*, Sergei Treil [available online].
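This block structure can be illustrated numerically. A sketch using NumPy/SciPy: because an orthogonal matrix is normal, its real Schur form is block diagonal, with $2\times 2$ blocks playing the role of the $\boldsymbol R_j$ and $1\times 1$ blocks equal to $\pm 1$.

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
n = 5
# Random n x n orthogonal matrix: the Q factor of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Real Schur form: Q = U T U^T with U orthogonal.  Because Q is
# normal, the quasi-triangular factor T is in fact block diagonal.
T, U = schur(Q, output='real')

assert np.allclose(U @ T @ U.T, Q)
# Entries outside the diagonal 1x1 / 2x2 blocks vanish, so T is
# (numerically) block diagonal as the answer states.
assert np.allclose(np.tril(T, -2), 0)
assert np.allclose(np.triu(T, 2), 0)
```

The similarity transform $U$ here is the coordinate change mentioned in the comments below: in the basis given by the columns of $U$, the action of $Q$ splits into independent plane rotations and $\pm 1$ reflections.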
From the shape of $A$, it seems that this does hold for some orthonormal matrices, but not for any $A$, or am I mistaken?
– User123456789
2 days ago
This holds for a whole class of similar matrices. If you want to decompose a general orthogonal matrix, you can simply change coordinates first; after that, however, the rotation blocks may be deformed.
– xbh
2 days ago
Okay, thanks! For my purposes I may do whatever I want to the matrix, as long as I can keep track of the specific operations, so this helps a lot.
– User123456789
2 days ago
answered 2 days ago by joriki
Since an orthogonal matrix is normal, it is diagonalizable over $\mathbb C$. Since it is unitary, its eigenvalues have magnitude $1$. Since its characteristic polynomial is real, its eigenvalues come in complex conjugate pairs. If you order the eigenvalues so that the pairs are consecutive, the diagonal blocks
$$
\begin{pmatrix} \mathrm e^{\mathrm i\phi} & 0 \\ 0 & \mathrm e^{-\mathrm i\phi} \end{pmatrix}
$$
can be transformed to
$$
\begin{pmatrix} \cos\phi & -\sin\phi \\ \sin\phi & \cos\phi \end{pmatrix}\;.
$$
Thus, an orthogonal transformation can be written as the product (not sum) of reflections and rotations in planes. In three dimensions, specifying a plane of rotation and a rotation axis is equivalent, but only the specification by a plane generalizes to higher dimensions.
An eigenvector with eigenvalue $1$ is invariant under the transformation; an eigenvector with eigenvalue $-1$ is reflected by the transformation; and each pair of eigenvectors with complex conjugate eigenvalues spans a plane of rotation.
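The eigenvalue facts used in this argument are easy to check numerically. A minimal sketch with NumPy: for a random orthogonal matrix, all eigenvalues lie on the unit circle, and the spectrum is closed under complex conjugation.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
# Random orthogonal matrix via QR of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

lam = np.linalg.eigvals(Q)
# Unitary => every eigenvalue has magnitude 1.
assert np.allclose(np.abs(lam), 1)
# Real characteristic polynomial => the eigenvalue multiset equals
# its own complex conjugate (non-real eigenvalues pair up).
assert np.allclose(np.sort_complex(lam), np.sort_complex(np.conj(lam)))
```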