Different eigenvalues of the same linear transformation according to different bases
I have a question about linear transformations and eigenvalues.
My question:
Given a linear transformation $T:R^3 \to R^3$, let $E$ be the standard basis of $R^3$ and let $B$ be another basis of $R^3$.
Let's denote $A=[T]^B_B$.
Let's assume that after the Gaussian elimination process on $A$ we get a matrix $M$ with one row of $0$'s, and that we now calculate the eigenvalues of $M$.
- Are the eigenvalues of $M$ also eigenvalues of the transformation $T$?
I think yes, because the eigenvalues don't change when you change the basis, but the correct answer is no. Can someone explain to me why?
By the way, is it correct to say that we must always work only with the standard basis of $R^3$ to find the eigenvalues of $T$? (I.e., the eigenvalues of $T$ are the roots of the characteristic polynomial $P_A = \det(A - \lambda\cdot I)$, where $A=[T]^E_E$ and $E$ is the standard basis.)
Thanks for the help!
linear-algebra eigenvalues-eigenvectors
edited Jul 26 at 11:40 by Bernard
asked Jul 26 at 11:30 by D.Rotnemer
Theo Bendit (Jul 26 at 11:39): Row operations will change the eigenvalues. Otherwise, every invertible matrix would only have the eigenvalue $1$, since they can all be row-reduced to the identity matrix.
Stefan (Jul 26 at 11:40): In Gaussian elimination you change both bases, possibly to different ones (e.g. $M = [T]^C_D$, where $C \neq D$).
1 Answer
For your first question, I want to cite @Theo Bendit's comment and add more generally: Gaussian elimination changes many properties of a matrix if you are not careful with it, including the eigenvalues and, for example, the determinant (at least when performed in the usual, unrestricted way). The point is that a row operation multiplies $A$ on the left by an invertible matrix only, giving $M=EA$, which is not a similarity transformation $EAE^{-1}$, so $M$ and $A$ need not share eigenvalues.
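To make this concrete, here is a quick NumPy sketch (my own illustration, not part of the original answer) showing that a single elementary row operation already changes the spectrum:

```python
# Sketch: one row operation changes the eigenvalues of a matrix.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # eigenvalues of A are 1 and 3
print(np.linalg.eigvals(A))          # ~ [3., 1.] (in some order)

M = A.copy()
M[1] -= 0.5 * M[0]                   # row operation: R2 <- R2 - (1/2) R1
print(np.linalg.eigvals(M))          # ~ [2., 1.5] -- no longer the eigenvalues of A
```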
For your second question: no, the eigenvalues of the operator $T$ do not depend on the basis chosen, as long as you calculate the roots of $p_A$ for $A$ a matrix representing $T$ with respect to a single basis:
Let $A$ and $B$ be similar, that is, let them represent the same endomorphism with respect to different bases, i.e. $B=CAC^{-1}$ for some invertible $C$ (which you might call the change-of-basis matrix). Then
$$B-xI=CAC^{-1}-xI=CAC^{-1}-xCIC^{-1}=C(A-xI)C^{-1}.$$
Thus, as the determinant is multiplicative, we have
$$p_B=\det(B-xI)=\det\bigl(C(A-xI)C^{-1}\bigr)=\det(C)\cdot\det(A-xI)\cdot\det(C^{-1})=\det(C)\cdot\det(A-xI)\cdot\det(C)^{-1}=\det(A-xI)=p_A.$$
The last step follows from the elementary property of determinants of invertible matrices that $\det(C^{-1})=\det(C)^{-1}$.
EDIT: Note that it therefore makes sense to define the characteristic polynomial of an endomorphism, i.e. to define $p_T$, just as it made sense to define the determinant for endomorphisms and not only for matrices.
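As a sanity check of the similarity argument above, here is a short NumPy sketch (again my own addition, using a generic random matrix that is assumed to be invertible) confirming numerically that $B = CAC^{-1}$ has the same eigenvalues as $A$:

```python
# Sketch: a similarity transform B = C A C^{-1} preserves the eigenvalues,
# up to floating-point error.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))        # matrix of T in one basis
C = rng.standard_normal((3, 3))        # generic random matrix; assumed invertible
B = C @ A @ np.linalg.inv(C)           # same operator, expressed in another basis

ev_A = np.sort_complex(np.linalg.eigvals(A))
ev_B = np.sort_complex(np.linalg.eigvals(B))
print(np.allclose(ev_A, ev_B))         # True: similar matrices share their spectrum
```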
edited Jul 26 at 20:41
answered Jul 26 at 20:12 by zzuussee