Trace of a matrix by eigendecomposition

Let $A$ denote a matrix defined in terms of another matrix $B$:
\begin{align}
A = (I + \lambda B)^{-1}
\end{align}
where $I$ is the identity matrix and $\lambda$ is a scalar coefficient.



Decomposing $B$ as $USU^T$, where $U^TU=I$:
\begin{align}
A = (I + \lambda USU^T)^{-1} = U(I + \lambda S)^{-1}U^T \quad \text{(eq. 1)}
\end{align}
and the trace of $A$ is given by
\begin{align}
\operatorname{tr}(A) = \operatorname{tr}\left(U(I + \lambda S)^{-1}U^T\right) = \operatorname{tr}\left((I + \lambda S)^{-1}U^TU\right) = \sum_{i=1}^n \frac{1}{1 + \lambda s_{ii}},
\end{align}
where $s_{ii}$ are the eigenvalues of $B$.
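
As a quick numerical sanity check of eq. 1 and the trace formula above, here is a minimal NumPy sketch; the random symmetric $B$, the size $n$, and the value of $\lambda$ are illustrative choices of mine, not part of the question.

```python
import numpy as np

# Minimal sketch (illustrative values): check that
# tr((I + lam*B)^{-1}) equals sum_i 1/(1 + lam*s_i)
# for a symmetric B with eigendecomposition B = U S U^T.
rng = np.random.default_rng(0)
n, lam = 5, 0.7                        # assumed size and coefficient
M = rng.standard_normal((n, n))
B = (M + M.T) / 2                      # symmetric B, so B = U S U^T with U^T U = I

A = np.linalg.inv(np.eye(n) + lam * B)
s = np.linalg.eigvalsh(B)              # eigenvalues s_i of B

print(np.trace(A))                     # left-hand side
print(np.sum(1.0 / (1.0 + lam * s)))   # right-hand side; the two agree numerically
```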



Given a diagonal matrix $X$, we define a new matrix $C$:
\begin{align}
C = (X + \lambda B)^{-1} X
\end{align}
Do the identities in eq. (1) still hold when an arbitrary diagonal matrix takes the place of the identity matrix, so that the trace of $C$ can be written simply as
\begin{align}
\operatorname{tr}(C) = \sum_{i=1}^n \frac{x_{ii}}{x_{ii} + \lambda s_{ii}}\,?
\end{align}







asked Aug 2 at 11:30, edited Aug 2 at 11:53
hatmatrix


  • You're right - it's the trace that's equal. I've corrected.
    – hatmatrix
    Aug 2 at 11:54


1 Answer
If $X$ commutes with $B$, and (possibly not necessarily) if $X$ or $B$ is invertible, then the claim is true for $\lambda$ such that $X+\lambda B$ is invertible, provided that the ordering of $\left(x_{i,i}\right)_{i=1,2,\ldots,n}$ and the ordering of $\left(s_{i,i}\right)_{i=1,2,\ldots,n}$ are compatible. If not, then the claim can fail. Here is a counterexample.



Let $X:=\begin{bmatrix}1&0\\0&2\end{bmatrix}$ and $B:=\begin{bmatrix}0&1\\1&0\end{bmatrix}$. Then, $x_{1,1}=1$, $x_{2,2}=2$, and $\big\{s_{1,1},s_{2,2}\big\}=\{-1,+1\}$. Now, for $\lambda:=1$, we get
$$C=(X+\lambda\,B)^{-1}\,X=\begin{bmatrix}1&1\\1&2\end{bmatrix}^{-1}\begin{bmatrix}1&0\\0&2\end{bmatrix}=\begin{bmatrix}2&-1\\-1&1\end{bmatrix}\begin{bmatrix}1&0\\0&2\end{bmatrix}=\begin{bmatrix}2&-2\\-1&2\end{bmatrix}\,.$$
Thus, $$\operatorname{trace}(C)=4\neq \sum_{i=1}^{2}\frac{x_{i,i}}{x_{i,i}+\lambda\,s_{i,i}}\,,$$
as the only way the right-hand side is defined is when $s_{1,1}=+1$ and $s_{2,2}=-1$, which makes
$$\sum_{i=1}^{2}\frac{x_{i,i}}{x_{i,i}+\lambda\,s_{i,i}}=\frac{1}{1+1}+\frac{2}{2-1}=\frac{5}{2}\,.$$
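
A minimal numerical sketch of this counterexample (NumPy, purely illustrative); the eigenvalue pairing on the right-hand side is chosen explicitly so that no denominator vanishes, as described above.

```python
import numpy as np

# Counterexample check: tr((X + lam*B)^{-1} X) vs. sum_i x_i/(x_i + lam*s_i).
X = np.diag([1.0, 2.0])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
lam = 1.0

C = np.linalg.inv(X + lam * B) @ X
x = np.diag(X)                      # diagonal entries of X: [1, 2]
s = np.linalg.eigvalsh(B)           # eigenvalues of B, ascending: [-1, +1]

trace_C = np.trace(C)               # 4.0
# Pair s_{1,1}=+1 with x_{1,1}=1 and s_{2,2}=-1 with x_{2,2}=2,
# the only pairing for which every denominator is nonzero.
rhs = np.sum(x / (x + lam * s[::-1]))   # 0.5 + 2.0 = 2.5
print(trace_C, rhs)                 # 4.0 != 2.5
```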






answered Aug 2 at 12:01, edited Aug 2 at 12:07
Batominovski


  • Thanks for the example. Is it conceivable that there is an expression as simple as $\sum_i 1/(1+\lambda s_i)$ when the matrix is not the identity matrix?
    – hatmatrix
    Aug 2 at 12:35






  • Not that I know of, and I don't expect that there is anything simple. Inverting a matrix often results in something very nasty.
    – Batominovski
    Aug 2 at 13:07









