How to prove this matrix inequality?

Given the following two conditions:

(1) $A$ and $B$ are $n\times n$ matrices;

(2) $U$ is a unitary $n\times n$ matrix, i.e., $UU^+ = U^+U = 1$.

How to prove this matrix inequality:
$$\operatorname{tr}\left(AUBU^+\right) \leq \operatorname{tr}\left(AB\right)\;?$$
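Before looking for a proof, it may help to test the claim numerically. A minimal sketch, assuming NumPy is available, with random $A$, $B$ and a random unitary $U$ taken from a QR decomposition:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random complex matrices A and B.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# A random unitary U: the Q factor of a QR decomposition is unitary.
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

lhs = np.trace(A @ U @ B @ U.conj().T)   # tr(A U B U^+)
rhs = np.trace(A @ B)                    # tr(A B)
print(lhs, rhs)  # both traces are complex in general, so "<=" already needs interpretation
```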







asked Jul 18 at 22:35 by Daniel Kim

2 Answers






Here is a proof for real matrices. I don't think it makes sense to prove this for complex matrices, as the trace can then be complex valued and complex numbers cannot be ordered.

By the spectral theorem for normal matrices, a unitary matrix $U$ is diagonalizable and, in particular, all of its eigenvalues have modulus $1$. For the proof, observe the following: let $U$ be unitary and $Uv = \lambda v$. Then

$$\|Uv\|^2 = \langle Uv, Uv\rangle = \langle v, U^*Uv\rangle = \|v\|^2.$$
Also,
$$\|Uv\|^2 = |\lambda|^2\,\|v\|^2.$$

Hence $|\lambda| = 1$, i.e. $\lambda = -1$ or $1$ in the real case.
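As a quick numerical sanity check of this eigenvalue fact, one can verify with NumPy (using a random unitary obtained from a QR decomposition) that all eigenvalues of a unitary matrix have modulus $1$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# The Q factor of a QR decomposition of a random complex matrix is unitary.
Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, _ = np.linalg.qr(Z)

print(np.max(np.abs(U.conj().T @ U - np.eye(n))))  # ~0, i.e. U^* U = I
print(np.abs(np.linalg.eigvals(U)))                # each modulus is ~1
```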



Then, using properties of the trace,

$$\operatorname{tr}(AUBU^*) \leq \operatorname{tr}(AUB)\operatorname{tr}(U^*) = \operatorname{tr}(UBA)\operatorname{tr}(U^*) \leq \operatorname{tr}(U)\operatorname{tr}(U^*)\operatorname{tr}(BA) = \operatorname{tr}(AB),$$

where the last equality comes from the observation that the trace of a matrix is the sum of its eigenvalues, and that for a unitary matrix the eigenvalues of $U$ and $U^*$ are again $-1, 1$ with the same multiplicities.



            EDIT:
            The last assertion still needs work and may need more assumptions.







answered Jul 18 at 23:08 by James Yang (edited Jul 18 at 23:26)











• Trace of an identity is not $1$ except for a very special case. – Algebraic Pavel, Jul 18 at 23:33

• Yes I realized that moments after I answered. – James Yang, Jul 18 at 23:34
















This is only true for Hermitian matrices and does not need to hold otherwise.
For example, if
$$
A = B = \begin{bmatrix}1 & 1\\ 0 & 1\end{bmatrix},
\quad
U = \begin{bmatrix}0 & 1\\ 1 & 0\end{bmatrix},
$$
then
$$
\operatorname{trace}(AUBU^*) = 3 \not\leq 2 = \operatorname{trace}(AB).
$$
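Spelling out the arithmetic: here $U^* = U$, and one computes
$$
UBU^* = \begin{bmatrix}1 & 0\\ 1 & 1\end{bmatrix},
\quad
AUBU^* = \begin{bmatrix}1 & 1\\ 0 & 1\end{bmatrix}\begin{bmatrix}1 & 0\\ 1 & 1\end{bmatrix} = \begin{bmatrix}2 & 1\\ 1 & 1\end{bmatrix},
\quad
AB = \begin{bmatrix}1 & 2\\ 0 & 1\end{bmatrix},
$$
so the two traces are $3$ and $2$ respectively.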






answered Jul 20 at 9:07 by Algebraic Pavel






















                     
