Sum of positive definite and symmetric matrix
Let $A$ be a real, symmetric, positive definite $n\times n$ matrix. Let $B$ be a real, symmetric $n\times n$ matrix.
Does there exist an $\epsilon>0$ such that $A+\epsilon B$ is positive definite?
My attempt: According to Wikipedia we can simultaneously diagonalize $A$ and $B$. Since $A$ is positive definite, its eigenvalues are all positive. Call them $\lambda_i$, with $\lambda_{\min} = \min_i \lambda_i$. Let $\rho_i$ be the eigenvalues of $B$. Choose a basis which simultaneously diagonalizes $A$ and $B$. Then $$u^T(A+\epsilon B)u = u^TAu + \epsilon u^TBu = \sum_i(\lambda_i+\epsilon \rho_i)u_i^2.$$ So if $$\epsilon < \frac{\lambda_{\min}}{\max_i|\rho_i|}$$ (assume $B\neq 0$), then $$\lambda_i+\epsilon\rho_i > \lambda_i-\lambda_{\min} \geq 0$$ for all $i$, which implies $A+\epsilon B$ is positive definite.
Is this valid? Is there a basis-independent way to show this?
real-analysis linear-algebra
Can you provide a link to where Wikipedia says A and B can be simultaneously diagonalized?
– John Polcari
Jul 17 at 2:03
en.wikipedia.org/wiki/…
– srp
Jul 17 at 2:07
If you carefully read the entry you point to, you will find that it discusses a form of joint diagonalization which does not lead to the traditional eigenvalues for the two matrices. At least one of the diagonal results is necessarily the identity matrix.
– John Polcari
Jul 17 at 2:21
2 Answers
The sought-for conclusion holds since the eigenvalues of a matrix depend continuously on its entries: $A + \epsilon B$ is continuous in $\epsilon$ and agrees with $A$ for $\epsilon = 0$, so all eigenvalues of the symmetric matrix $A + \epsilon B$ are positive for $\vert \epsilon \vert$ sufficiently small, and $A + \epsilon B$ is positive definite for such $\epsilon$.
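To see the continuity argument numerically, here is a minimal sketch (not part of the original answer; it uses NumPy with randomly generated $A$ and $B$ for illustration) that tracks the smallest eigenvalue of $A + \epsilon B$ as $\epsilon$ shrinks:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)      # symmetric positive definite by construction
C = rng.standard_normal((n, n))
B = (C + C.T) / 2            # symmetric, generally indefinite

# The smallest eigenvalue of A + eps*B depends continuously on eps and
# is positive at eps = 0, so it must remain positive for small |eps|.
for eps in [1.0, 0.1, 0.01, 0.001]:
    lam_min = np.linalg.eigvalsh(A + eps * B).min()
    print(f"eps = {eps:6.3f}: smallest eigenvalue of A + eps*B = {lam_min:+.4f}")
```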
Well what do you know? en.m.wikipedia.org/wiki/… !!!
– Robert Lewis
Jul 17 at 2:18
You can make this line of reasoning work, but you should be careful about what "diagonalized" means in this context. In particular, your wiki page is talking about diagonalization via a congruence rather than a similarity.
Specifically, the theorem is that there exists an invertible matrix $P$ (not necessarily orthogonal) such that both $PAP^T$ and $PBP^T$ are diagonal. Note that the diagonal entries will not generally be the eigenvalues of $A$ and $B$.
Note that we can't necessarily diagonalize $A$ and $B$ simultaneously in the sense of similarity. In fact, we can do so if and only if $AB = BA$.
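As a concrete illustration of diagonalization by congruence (a sketch I am adding, not part of the original answer): if $A = LL^T$ is a Cholesky factorization and $L^{-1}BL^{-T} = QDQ^T$ is an eigendecomposition, then $P = Q^TL^{-1}$ gives $PAP^T = I$ and $PBP^T = D$, consistent with the comment above that one of the two diagonal matrices is necessarily the identity:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)          # symmetric positive definite
C = rng.standard_normal((n, n))
B = (C + C.T) / 2                # symmetric

L = np.linalg.cholesky(A)        # A = L L^T
Linv = np.linalg.inv(L)
K = Linv @ B @ Linv.T            # symmetric, so orthogonally diagonalizable
_, Q = np.linalg.eigh(K)         # K = Q D Q^T with Q orthogonal
P = Q.T @ Linv                   # the congruence transform

print(np.round(P @ A @ P.T, 10))  # identity matrix
print(np.round(P @ B @ P.T, 10))  # diagonal, but not the eigenvalues of B
```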
It is generally true that if $A$ is symmetric and positive definite, then such an $\epsilon$ exists. One proof is as follows:
If $A$ is positive definite, then for some $r > 0$ (e.g. $r = \lambda_{\min}$) we can write
$$
A = rI + (A - rI)
$$
where $I$ denotes the identity matrix and $A - rI$ is positive semidefinite. With that, we have
$$
A + \epsilon B = [rI + \epsilon B] + (A - rI).
$$
Since the sum of a positive definite matrix and a positive semidefinite matrix is positive definite, it suffices to choose an $\epsilon$ such that $rI + \epsilon B$ is positive definite; for instance, any $0 < \epsilon < r/\|B\|_2$ works, since then every eigenvalue of $rI + \epsilon B$ is at least $r - \epsilon\|B\|_2 > 0$.
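A numerical sketch of this bound (my addition, with randomly generated matrices for illustration): taking $\epsilon$ strictly below $r/\|B\|_2$ makes $rI + \epsilon B$, and hence $A + \epsilon B$, positive definite.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)                # symmetric positive definite
C = rng.standard_normal((n, n))
B = (C + C.T) / 2                      # symmetric, nonzero almost surely

r = np.linalg.eigvalsh(A).min()        # r = smallest eigenvalue of A > 0
eps = 0.5 * r / np.linalg.norm(B, 2)   # any eps < r / ||B||_2 works

# rI + eps*B has eigenvalues r + eps*rho_i >= r - eps*||B||_2 > 0,
# and A - rI is positive semidefinite, so the sum is positive definite.
print(np.linalg.eigvalsh(r * np.eye(n) + eps * B).min() > 0)  # True
print(np.linalg.eigvalsh(A + eps * B).min() > 0)              # True
```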
Another approach that I like: let $A^{-1/2}$ denote the positive definite square root of $A^{-1}$. Then $A + \epsilon B$ is positive definite if and only if the matrix
$$
A^{-1/2}(A + \epsilon B)A^{-1/2} = I + \epsilon A^{-1/2}B(A^{-1/2})^T
$$
is positive definite. It therefore suffices to consider the case with $A = I$.
If you choose to diagonalize $I + \epsilon A^{-1/2}B(A^{-1/2})^T$ by diagonalizing the symmetric matrix $A^{-1/2}B(A^{-1/2})^T$, then you are essentially deriving the "simultaneous diagonalizability" of quadratic forms in our specific case.
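And a sketch of this second approach (again my addition, not the answerer's code): compute $A^{-1/2}$ from the eigendecomposition of $A$ and check that $A + \epsilon B$ is positive definite exactly when $I + \epsilon A^{-1/2}B(A^{-1/2})^T$ is.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)            # symmetric positive definite
C = rng.standard_normal((n, n))
B = (C + C.T) / 2                  # symmetric

# Positive definite square root of A^{-1}, via the eigendecomposition of A.
w, V = np.linalg.eigh(A)
A_inv_half = V @ np.diag(1.0 / np.sqrt(w)) @ V.T   # symmetric, its own transpose

K = A_inv_half @ B @ A_inv_half    # = A^{-1/2} B (A^{-1/2})^T
eps = 0.5 / np.linalg.norm(K, 2)   # any eps < 1 / ||K||_2 keeps I + eps*K definite

print(np.linalg.eigvalsh(np.eye(n) + eps * K).min() > 0)  # True
print(np.linalg.eigvalsh(A + eps * B).min() > 0)          # True
```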