Are orthonormal matrices rotations?

If we take an orthonormal matrix in $\mathbb{R}^{2\times 2}$, we know it has to be of the form
$$A = \begin{pmatrix} a & b\\ -b & a\end{pmatrix}$$
such that $$a^2+b^2=1$$
(or the columns could be multiplied by $-1$, but this makes no difference for the following). Given these restrictions we can define $\vartheta$ such that $a=\cos\vartheta$, $b=\sin\vartheta$, and we see that $A$ is a rotation matrix. If the multiplication by $-1$ is applied, the only difference is that the direction of the rotation changes, but it is still a rotation.

I was wondering whether this still holds in higher dimensions, i.e. whether an orthonormal matrix in $\mathbb{R}^{n\times n}$ can be written as
$$A = \sum_{i=1}^{d} R_i, \qquad d \leq n,$$
where the $R_i$ are rotations around some axis. I am not necessarily interested in the decomposition itself, only in whether something is known about this and where I could read up on it. Intuitively I would say that such a decomposition does not exist, or that if it exists it will not be of the form suggested above, but I cannot settle the question.
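The two-dimensional case is easy to check numerically. The sketch below (assuming NumPy is available; the angle $0.7$ is arbitrary) verifies that the matrix form from the question is orthogonal with determinant $1$:

```python
import numpy as np

theta = 0.7                      # arbitrary angle
a, b = np.cos(theta), np.sin(theta)
A = np.array([[a, b],
              [-b, a]])          # the matrix form from the question

# Orthogonality: A^T A = I, and det A = 1, so A is a rotation
assert np.allclose(A.T @ A, np.eye(2))
assert np.isclose(np.linalg.det(A), 1.0)

# With this sign convention A rotates the plane by -theta
assert np.allclose(A @ [1.0, 0.0], [np.cos(theta), -np.sin(theta)])
```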
























  • Do you mean $a^2+b^2=1$?
    – Arthur, 2 days ago

  • Yes, I will rectify this, thanks :)
    – User123456789, 2 days ago

  • With the sum of $R_i$, do you actually mean the composition of the $R_i$, i.e. the product?
    – Jonas, 2 days ago

  • See en.wikipedia.org/wiki/Plane_of_rotation#Higher_dimensions
    – joriki, 2 days ago

  • In $\mathbb{R}^4$ you can choose two orthogonal two-dimensional subspaces and perform an ordinary rotation in each. I don't know whether such an orthogonal transformation satisfies your wish for a "construction".
    – Ethan Bolker, 2 days ago
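Ethan Bolker's $\mathbb{R}^4$ comment can be sketched numerically (assuming NumPy; the angles are arbitrary): a "double rotation" built from two planar rotations is orthogonal with determinant $1$, yet fixes no axis, since it has no real eigenvalue:

```python
import numpy as np

def rot2(t):
    """2x2 planar rotation by angle t."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s], [s, c]])

# Rotate by different angles in two orthogonal planes of R^4
A = np.zeros((4, 4))
A[:2, :2] = rot2(0.5)
A[2:, 2:] = rot2(1.3)

assert np.allclose(A.T @ A, np.eye(4))      # orthogonal
assert np.isclose(np.linalg.det(A), 1.0)    # orientation-preserving

# No real eigenvalue, hence no fixed axis, unlike rotations in R^3
assert np.all(np.abs(np.linalg.eigvals(A).imag) > 1e-9)
```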














asked 2 days ago by User123456789, edited 2 days ago
2 Answers
Accepted answer (score 2) – xbh
In the $n$-dimensional case, it can be shown that every such orthogonal matrix $\boldsymbol A$ is similar to a block-diagonal matrix
$$
\begin{bmatrix}
\boldsymbol R_1 & & & & & \\
& \boldsymbol R_2 & & & & \\
& & \boldsymbol R_3 & & & \\
& & & \ddots & & \\
& & & & \boldsymbol R_k & \\
& & & & & \boldsymbol I_{n-2k}
\end{bmatrix}
$$
when $\det(\boldsymbol A) = 1$, or
$$
\begin{bmatrix}
\boldsymbol R_1 & & & & & \\
& \boldsymbol R_2 & & & & \\
& & \ddots & & & \\
& & & \boldsymbol R_k & & \\
& & & & \boldsymbol I_{n-2k-1} & \\
& & & & & -1
\end{bmatrix}
$$
when $\det(\boldsymbol A) = -1$. Here
$$
\boldsymbol R_j =
\begin{bmatrix}
\cos(\varphi_j) & -\sin(\varphi_j) \\
\sin(\varphi_j) & \cos(\varphi_j)
\end{bmatrix}
\quad [j = 1, \ldots, k],
$$
and $\boldsymbol I_m$ is the $m \times m$ identity matrix.



Hence such a decomposition exists.



Reference: Sergei Treil, Linear Algebra Done Wrong (available online).
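One direction of this statement is easy to verify numerically. In the sketch below (assuming NumPy; the sizes $n=5$, $k=2$ and the angles are arbitrary), a matrix assembled from the block form and conjugated by a random orthogonal change of basis is again orthogonal, with the same determinant and the same rotation angles:

```python
import numpy as np

def rot2(t):
    """2x2 planar rotation by angle t."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s], [s, c]])

# Block-diagonal form for n = 5, k = 2 (det = +1 case):
# two rotation blocks R_1, R_2 and a trailing identity block I_{n-2k}
T = np.zeros((5, 5))
T[0:2, 0:2] = rot2(0.4)
T[2:4, 2:4] = rot2(2.1)
T[4, 4] = 1.0

# Conjugating by any orthogonal Q gives a general matrix similar to T
Q, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((5, 5)))
A = Q @ T @ Q.T

assert np.allclose(A.T @ A, np.eye(5))    # A is orthogonal
assert np.isclose(np.linalg.det(A), 1.0)  # determinant preserved
# A has the same rotation angles {0, ±0.4, ±2.1} as the block form
assert np.allclose(np.sort(np.angle(np.linalg.eigvals(A))),
                   np.sort(np.angle(np.linalg.eigvals(T))))
```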



























  • From the shape of $A$, it seems that this holds for some orthonormal matrices, but not for every $A$, or am I mistaken?
    – User123456789, 2 days ago

  • This holds for a class of similar matrices. If you want to decompose a general matrix, you can simply change the coordinates; however, the rotations might then be deformed.
    – xbh, 2 days ago

  • Okay, thanks! For my idea I may do whatever I want to the matrix, as long as I can keep track of the specific operations, so this helps a lot.
    – User123456789, 2 days ago
















Answer (score 1) – joriki
Since an orthogonal matrix is normal, it is diagonalizable over $\mathbb{C}$. Since it is unitary, its eigenvalues have magnitude $1$. Since its characteristic polynomial is real, its eigenvalues come in complex conjugate pairs. If you order the eigenvalues such that the pairs are consecutive, the diagonal blocks



$$
\begin{pmatrix}\mathrm e^{\mathrm i\phi}&0\\0&\mathrm e^{-\mathrm i\phi}\end{pmatrix}
$$



can be transformed to



$$
\begin{pmatrix}\cos\phi&-\sin\phi\\\sin\phi&\cos\phi\end{pmatrix}\;.
$$



Thus, an orthogonal transformation can be written as the product (not sum) of reflections and rotations in planes. In three dimensions, specifying a plane of rotation and a rotation axis is equivalent, but only the specification by a plane generalizes to higher dimensions.



An eigenvector with eigenvalue $1$ is invariant under the transformation; an eigenvector with eigenvalue $-1$ is reflected by the transformation; and each pair of eigenvectors with complex conjugate eigenvalues spans a plane of rotation.
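These spectral facts are easy to verify numerically (a sketch assuming NumPy; the size $6$ and the seed are arbitrary):

```python
import numpy as np

# A random orthogonal matrix, via the QR factorization of a Gaussian matrix
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))
assert np.allclose(Q.T @ Q, np.eye(6))

lam = np.linalg.eigvals(Q)
# Unitary: all eigenvalues lie on the unit circle
assert np.allclose(np.abs(lam), 1.0)
# Real characteristic polynomial: the spectrum is closed under conjugation
assert np.allclose(np.sort_complex(lam), np.sort_complex(np.conj(lam)))
```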





