Difference between equivalent matrices and similar matrices
We say that $A$ and $B$ are equivalent if there are invertible matrices $P$ and $Q$ such that $$A=PBQ^{-1}.$$
We say that they are similar if there is an invertible matrix $P$ such that $$A=PBP^{-1}.$$
Wikipedia says that two matrices are equivalent if they represent the same linear map $f:V\to W$ with respect to two different pairs of bases, whereas they are similar if they represent the same linear map with respect to two chosen bases.
Q1) I don't really understand the subtlety. Could someone explain with an example?
Q2) By the way, is there a criterion to show that two matrices are similar? For equivalence it is enough to prove that they have the same rank, but how would I do it for similarity?
linear-algebra matrices
asked Jul 25 at 15:01, edited Jul 25 at 15:35 · user386627
You missed one important detail: the "similar" case should talk only of linear operators (a.k.a. endomorphisms), i.e., linear maps from a space to itself. That is the key difference here: for these beasts there is only one basis to be chosen. – Marc van Leeuwen, Jul 25 at 15:23
@MarcvanLeeuwen: Does it mean that if two endomorphisms are equivalent, then they are similar? – user386627, Jul 25 at 15:34
@user386627: Of course not. See the example in my answer. – Surb, Jul 25 at 15:43
No, it is the other way around: if they are similar, then certainly they are equivalent. But if you are using a (square) matrix to encode a linear operator, then there is not much point in considering the relation of equivalence at all; similarity is what you should care about then. – Marc van Leeuwen, Jul 25 at 15:45
2 Answers
Two matrices $A$ and $B$ are equivalent if there are two bases $\mathcal B,\mathcal B'$ of $V$, two bases $\mathcal F,\mathcal F'$ of $W$, and a linear map $f:V\longrightarrow W$ such that $$(f)_{\mathcal F,\mathcal B}=A\quad\text{and}\quad (f)_{\mathcal F',\mathcal B'}=B.$$
Two matrices $A$ and $B$ are similar if there are two bases $\mathcal B,\mathcal B'$ of $V$ and an endomorphism $f:V\longrightarrow V$ such that $$A=(f)_{\mathcal B,\mathcal B}\quad\text{and}\quad B=(f)_{\mathcal B',\mathcal B'}.$$
For example, in $\mathbb R^3$ a rotation of angle $\theta$ around the axis $Ox$ and a rotation of angle $\theta$ around the axis $Oy$ are similar. A rotation and a homothety (a scaling), on the other hand, are equivalent but of course not similar.
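To make this concrete, here is a small sympy sketch of that example. It is not part of the original answer: the particular change-of-basis matrix $P$ and the scaling factor $2$ for the homothety are my own choices, used only for illustration.

```python
import sympy as sp

t = sp.symbols('theta')
c, s = sp.cos(t), sp.sin(t)

# Rotations by theta about the x-axis and about the y-axis of R^3.
Rx = sp.Matrix([[1, 0, 0], [0, c, -s], [0, s, c]])
Ry = sp.Matrix([[c, 0, s], [0, 1, 0], [-s, 0, c]])

# A change of basis that exchanges the roles of the two axes:
# e1 -> e2, e2 -> e1, e3 -> -e3.  Conjugating Ry by it gives Rx,
# so the two rotations are the same map seen in another basis, i.e. similar.
P = sp.Matrix([[0, 1, 0], [1, 0, 0], [0, 0, -1]])
assert (P * Ry * P.inv() - Rx).applyfunc(sp.simplify) == sp.zeros(3, 3)

# A rotation and a homothety (here: scaling by 2) are both invertible, so they
# have the same rank and are therefore equivalent -- but their traces differ
# (1 + 2*cos(theta) is at most 3, never 6), so they cannot be similar.
H = 2 * sp.eye(3)
assert sp.simplify(Rx.det()) != 0 and H.det() != 0
assert sp.simplify(Rx.trace()) != H.trace()
```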
answered Jul 25 at 15:14, edited Jul 25 at 15:42 · Surb
Equivalence of matrices: If you have a linear map $f\colon V\to W$ with transformation matrix $A\in\mathbb{R}^{m\times n}$, you can always choose bases $\mathcal{B}$ of $V$ and $\mathcal{C}$ of $W$ such that the transformation matrix $F$ of $f$ for this choice has nonzero entries only on the diagonal, and all entries are $0$ or $1$. You first choose a basis $b_1,\ldots,b_l$ of the kernel of $f$ and then expand it to a basis $b_1,\ldots,b_n$ of $V$. The images of the vectors $b_{l+1},\ldots,b_n$ are linearly independent in $W$; if you expand them to a basis $c_1,\ldots,c_m$ of $W$, you get the transformation matrix $F$.
This matrix $F$ is essentially the reduced row echelon form of $A$, pushed to its normal form by column operations as well. You obtain $F$ from $A$ via $A=PFQ^{-1}$, where $P$ and $Q$ are the change-of-basis matrices determined by $\mathcal{C}$ and $\mathcal{B}$. In fact, multiplying by invertible matrices from the left and right is nothing other than Gaussian elimination, since you can write these matrices as products of elementary matrices, which correspond to primitive elimination steps.
Since the shape of $F$ depends only on the dimension of the kernel (equivalently, on the rank), this is the only thing you have to check.
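As a concrete illustration of this construction, here is a sympy sketch that builds such bases for a made-up $3\times 4$ matrix and checks that $F=P^{-1}AQ$ has the promised $0/1$ diagonal shape. The example matrix and the greedy basis-extension loops are mine, not part of the answer.

```python
import sympy as sp

# A made-up 3x4 matrix of rank 2, standing in for the transformation matrix of f.
A = sp.Matrix([[1, 2, 0, 1],
               [0, 1, 1, 1],
               [1, 3, 1, 2]])
m, n = A.shape
r = A.rank()

# Basis of the domain: first r vectors complementing the kernel, then a kernel basis
# (kernel vectors last, so that the 1s end up on the diagonal of F).
kernel = A.nullspace()
span, complement = list(kernel), []
for i in range(n):
    e = sp.eye(n).col(i)
    if sp.Matrix.hstack(*(span + [e])).rank() > len(span):
        span.append(e)
        complement.append(e)
Q = sp.Matrix.hstack(*(complement + kernel))      # change of basis on the domain

# Basis of the codomain: the images of the complement vectors, extended to a full basis.
images = [A * v for v in complement]
span_w = list(images)
for i in range(m):
    if len(span_w) == m:
        break
    e = sp.eye(m).col(i)
    if sp.Matrix.hstack(*(span_w + [e])).rank() > len(span_w):
        span_w.append(e)
P = sp.Matrix.hstack(*span_w)                     # change of basis on the codomain

F = P.inv() * A * Q                               # the 0/1 "diagonal" normal form
assert F == sp.diag(sp.eye(r), sp.zeros(m - r, n - r))
assert A == P * F * Q.inv()                       # the equivalence A = P F Q^{-1}
print(F)
```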
Similarity of matrices: From the definition alone you can see that $A$ and $B$ now have to be square matrices, so it is sensible to think of $A$ as corresponding to an endomorphism $f\colon V\to V$. Multiplying by the same matrix on the left and on the right means that you perform the same change of basis on both sides, once in each direction; we are choosing only one basis, used for both the domain and the codomain. This restricts our freedom of action, but it also preserves more properties of the matrix $A$. Where the transformations above only preserved $\operatorname{rank}(A)$, we now get $\det(A)=\det(B)$, $\operatorname{trace}(A)=\operatorname{trace}(B)$, and the eigenvalues of $A$ and $B$ coincide. To check whether two matrices are similar, you can calculate their respective Jordan normal forms: every square matrix is similar to its Jordan normal form, and the JNF is unique up to the order of the blocks. So two matrices are similar if and only if they have the same JNF.
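In sympy this check is short; a sketch, using the matrices $A$ and $B$ from the worked example below (sympy's `jordan_form` returns a pair $(P, J)$ with $A = PJP^{-1}$):

```python
import sympy as sp

A = sp.Matrix([[2, 1, 1], [0, 2, 2], [0, 0, 2]])   # matrix A from the example below
B = sp.Matrix([[2, 1, 0], [0, 2, 1], [0, 0, 2]])   # matrix B from the example below

# Cheap necessary conditions: similar matrices share trace, determinant and
# characteristic polynomial (hence eigenvalues).
print(A.trace() == B.trace(), A.det() == B.det(), A.charpoly() == B.charpoly())

# Decisive test: equal Jordan normal forms.  Here each matrix has a single
# Jordan block, so block ordering is not an issue.
PA, JA = A.jordan_form()
PB, JB = B.jordan_form()
print(JA == JB)                      # True, so A and B are similar

# The two Jordan bases even give an explicit similarity A = P B P^{-1}.
P = PA * PB.inv()
assert A == P * B * P.inv()
```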
Example: Let's look at the linear endomorphism $f\colon \mathcal{P}_2 \to \mathcal{P}_2$, $p(X) \mapsto p(X)+p(X+1)$, on the space of polynomials with real coefficients of degree $\leq 2$. With respect to the monomial basis $\mathcal{M}=\{1,X,X^2\}$ of $\mathcal{P}_2$, we obtain the transformation matrix
$$A=M_{\mathcal{M}}^{\mathcal{M}}(f) = \begin{pmatrix}2&1&1\\0&2&2\\0&0&2\end{pmatrix}.$$
Let's find another basis such that the transformation matrix is in echelon form. We set $\mathcal{C}=\{2,\ 2X+1,\ 2X^2+2X+1\}$. This is obviously a basis of $\mathcal{P}_2$, so we can look at the corresponding transformation matrix (domain basis $\mathcal{M}$, codomain basis $\mathcal{C}$):
$$F=M_{\mathcal{M}}^{\mathcal{C}}(f) = \begin{pmatrix}1&0&0\\0&1&0\\0&0&1\end{pmatrix}.$$
$F$ and $A$ are equivalent: take
$$Q=I_3\ \text{(the identity matrix)}, \qquad P=\frac{1}{4}\begin{pmatrix}2&-1&0\\0&2&-2\\0&0&2\end{pmatrix}\ \text{(which is } A^{-1}\text{)},$$
so that $F=PAQ^{-1}$, i.e. $A=P^{-1}FQ$.
Now set $\mathcal{B}=\{2,\ 2X+1,\ X^2\}$. We obtain the corresponding transformation matrix
$$B=M_{\mathcal{B}}^{\mathcal{B}}(f) = \begin{pmatrix}2&1&0\\0&2&1\\0&0&2\end{pmatrix},$$
which is the Jordan normal form of $A$ (or of $f$). $B$ and $A$ are equivalent (both have rank $3$) and similar. But $F$ and $A$ are not similar: $\operatorname{trace}(A)=6$, whereas $\operatorname{trace}(F)=3$.
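For completeness, here is a sketch that recomputes these three transformation matrices with sympy. The helpers `coords` and `matrix_of` are my own (they simply spell out "coordinates with respect to a basis") and are not part of the original answer.

```python
import sympy as sp

X = sp.symbols('X')
f = lambda p: sp.expand(sp.sympify(p) + sp.sympify(p).subs(X, X + 1))   # p(X) |-> p(X) + p(X+1)

def coords(p, basis):
    """Coordinate column of the polynomial p in the given basis of P_2."""
    a = sp.symbols('a0:3')
    residual = sp.sympify(p) - sum(ai * b for ai, b in zip(a, basis))
    sol = sp.solve(sp.Poly(residual, X).all_coeffs(), a, dict=True)[0]
    return sp.Matrix([sol.get(ai, 0) for ai in a])

def matrix_of(f, dom, cod):
    """Transformation matrix of f with domain basis `dom` and codomain basis `cod`."""
    return sp.Matrix.hstack(*[coords(f(b), cod) for b in dom])

M = [1, X, X**2]                      # monomial basis
C = [2, 2*X + 1, 2*X**2 + 2*X + 1]
B = [2, 2*X + 1, X**2]

print(matrix_of(f, M, M))   # A: [[2, 1, 1], [0, 2, 2], [0, 0, 2]]
print(matrix_of(f, M, C))   # F: the identity matrix
print(matrix_of(f, B, B))   # B: [[2, 1, 0], [0, 2, 1], [0, 0, 2]], the JNF of A
```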
answered Jul 27 at 7:59 · Babelfish