Prove that there is only a single eigenvector corresponding to each of the distinct eigenvalues
Prove that if the $n$ eigenvalues of a matrix $A_{n\times n}$ are distinct, then
1) there are $n$ eigenvectors $\bar c_i$, one corresponding to each of those eigenvalues, and
2) $\bar c_1, \bar c_2, \dots, \bar c_n$ are all linearly independent.
Tagged: eigenvalues-eigenvectors
asked Jul 21 at 20:00 by Aditya
1 Answer
I will seek to prove part 1: if the $n$ eigenvalues of a matrix $A_{n\times n}$ are distinct, then there are $n$ eigenvectors $\bar c_i$, one corresponding to each of those eigenvalues.
Definition: the algebraic multiplicity $a(\lambda_i)$ of an eigenvalue $\lambda_i$ is the power to which $(\lambda - \lambda_i)$ divides the characteristic polynomial. So if an eigenvalue $\lambda_j$ repeats $k$ times, then $a(\lambda_j) = k$.
So in the case of distinct eigenvalues $\lambda_1, \ldots, \lambda_n$, we can conclude that each of the $n$ eigenvalues has algebraic multiplicity 1:
$$a(\lambda_i) = 1 \quad \forall\, \lambda_i$$
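This can be checked concretely. A minimal sketch in Python using SymPy (the triangular matrix here is an assumed example, not part of the proof): `eigenvals()` returns each eigenvalue together with its algebraic multiplicity.

```python
# Assumed example: an upper-triangular matrix whose eigenvalues are its
# diagonal entries 2, 3, 5, all distinct. eigenvals() maps each eigenvalue
# to its algebraic multiplicity, i.e. the power of (lambda - lambda_i)
# dividing the characteristic polynomial.
import sympy as sp

A = sp.Matrix([[2, 1, 0],
               [0, 3, 0],
               [0, 0, 5]])

multiplicities = A.eigenvals()   # {eigenvalue: algebraic multiplicity}
print(multiplicities)            # {2: 1, 3: 1, 5: 1}
```

Since the eigenvalues are distinct, every multiplicity comes out as 1, matching the conclusion above.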
Now what will the geometric multiplicities of these eigenvalues be?
Definition: the geometric multiplicity $g(\lambda_i)$ of an eigenvalue is the dimension of the eigenspace $E_{\lambda_i} = N(A - \lambda_i I)$ corresponding to $\lambda_i$, where $N(\cdot)$ denotes the null space.
The eigenspace $E_{\lambda_i}$ is best understood as the vector space spanned by all the eigenvectors that correspond to $\lambda_i$: the collection of all vectors $\bar v$ satisfying $A\bar v = \lambda_i \bar v$ forms the eigenspace.
An eigenspace has dimension greater than zero by definition: $\lambda$ is an eigenvalue of $A$ only if $Ax = \lambda x$ for some $x \neq 0$, and since only the zero vector spans a space of dimension zero, we can conclude $0 < g(\lambda_i)$, i.e. $1 \le g(\lambda_i)$.
A standard argument shows why $g(\lambda_i) \le a(\lambda_i)$: the characteristic polynomial has $(\lambda - \lambda_i)^{g(\lambda_i)}$ at least as a factor. Combining the two bounds,
$$1 \le g(\lambda_i) \le a(\lambda_i)$$
Thus in the case of distinct eigenvalues,
$$1 \le g(\lambda_i) \le 1 \quad \forall\, \lambda_i$$
$$g(\lambda_i) = 1 \quad \forall\, \lambda_i$$
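The geometric multiplicities can be computed directly from the definition $g(\lambda_i) = \dim N(A - \lambda_i I)$. A sketch (the matrix is the same assumed example, chosen for its distinct eigenvalues): SymPy's `nullspace()` returns a basis of the null space, so its length is the dimension.

```python
# Sketch: compute g(lambda_i) = dim N(A - lambda_i I) for each eigenvalue
# by counting nullspace basis vectors. Example matrix is an assumption.
import sympy as sp

A = sp.Matrix([[2, 1, 0],
               [0, 3, 0],
               [0, 0, 5]])   # distinct eigenvalues 2, 3, 5

gs = {lam: len((A - lam * sp.eye(3)).nullspace()) for lam in A.eigenvals()}
print(gs)   # every distinct eigenvalue has geometric multiplicity 1
```

As the derivation predicts, each eigenspace is one-dimensional.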
$g(\lambda_i)$ is also, equivalently, the number of independent eigenvectors associated with $\lambda_i$: to span a vector space (the eigenspace) of dimension $g(\lambda_i)$, we need exactly that many independent eigenvectors.
Thus we have proved that associated with each distinct eigenvalue there is a single independent eigenvector.
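Part 2 (the linear independence of the $n$ eigenvectors) is not proved above, but it can be illustrated numerically: a sketch with NumPy on an assumed example matrix, checking that the matrix whose columns are the eigenvectors has full rank $n$.

```python
# Sketch illustrating part 2 (stated but not proved above): for a matrix
# with distinct eigenvalues, the n eigenvectors are linearly independent,
# so stacking them as columns gives a matrix of full rank n.
# The example matrix is an assumption.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])

eigenvalues, C = np.linalg.eig(A)               # columns of C are eigenvectors
assert len(set(np.round(eigenvalues, 8))) == 3  # eigenvalues are distinct
rank = np.linalg.matrix_rank(C)
print(rank)                                     # full rank: independent columns
```

A full-rank check is only a numerical illustration, of course, not a substitute for the algebraic proof of independence.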
Answering your own question is acceptable here. But that usually happens only if when you first asked you didn't know an answer, and you only found one after quite a while. Here you seem to have posted your answer right after asking. Why? If you are unsure of your proof you can put the proof in your question, flag the place where you have doubts, and use the proof-verification tag. – Ethan Bolker, Jul 21 at 20:04
I only answered part 1; I actually wanted to answer this question (math.stackexchange.com/questions/29371/…) using the method I gave above. That is why I posted half the solution. I hope to develop the answer later and provide a geometric intuition for part 2 of the question. – Aditya, Jul 21 at 20:11
That question is seven years old and already has several really nice answers. – Ethan Bolker, Jul 21 at 20:25
Oh, let me go through them again then. While the solutions are algebraically elegant, I didn't manage to grasp the intuition for why distinct eigenvalues imply independence but not the converse. I wanted to visualise it as something like: since each of the $n$ eigenspaces is restricted to a line, we should be able to say that all $n$ eigenvectors are linearly independent, since together they should span $F^n$? – Aditya, Jul 21 at 20:40
answered Jul 21 at 20:00 by Aditya