Understanding the linear transformation given by matrix multiplication
I am working on the following problem. I found it in an old qualifying exam, but I'm not sure of its original source. It asks:
Let $A$ be an $n \times n$ matrix with entries in $\mathbb{R}$. Let $\phi_A \colon \mathbb{R}^n \to \mathbb{R}^n$ be the linear transformation defined by $\phi_A(v) = A \cdot v$ for each column vector $v \in \mathbb{R}^n$. Set $W = \{\, v \in \mathbb{R}^n \mid \phi_A(v) = v \,\}$, and assume that $\dim \ker \phi_A + \dim W = n$.
- Give the minimal polynomial for $A$.
- Describe all possible Jordan canonical forms for $A$.
- Prove that if $A$ is a symmetric matrix, then $W$ is orthogonal to $\ker \phi_A$. Assume that $\mathbb{R}^n$ is endowed with its standard inner product.
I don't understand the difference between saying that $A$ is the matrix $[T]_\beta$ of some transformation $T$ with respect to a basis $\beta$ of $\mathbb{R}^n$ and saying that $\phi_A$ is the linear transformation defined by $\phi_A(v) = Av$.
In any case, here are the rest of my thoughts. Given the restriction on the dimensions of $\ker \phi_A$ and $W$, it appears that $A$ is an idempotent matrix: it either fixes or annihilates each vector in a suitable basis of $\mathbb{R}^n$. It follows that the minimal polynomial is $M_A(x) = x(x-1)$ (assuming both subspaces are nontrivial; if $\ker \phi_A = 0$ it is $x - 1$, and if $W = 0$ it is $x$).
Since the minimal polynomial is a product of distinct linear factors, $A$ is diagonalizable. Thus if $r = \dim \ker \phi_A = \dim \operatorname{null} A$, there are $r$ zeros and $n - r$ ones on the diagonal. With both eigenvalues present, $r$ ranges over $1, \dots, n-1$, so there are a total of $n - 1$ such diagonal matrices, up to permutation of the basis.
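The idempotence claim is easy to sanity-check numerically. Below is a minimal sketch (assuming NumPy; the size $n = 4$ and the particular similarity transform are illustrative, not from the problem) that builds a matrix similar to $\mathrm{diag}(1,1,0,0)$ and verifies the dimension condition, $A^2 = A$, and that $x(x-1)$ annihilates $A$:

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
B = rng.standard_normal((n, n))          # random basis (almost surely invertible)
D = np.diag([1.0, 1.0, 0.0, 0.0])        # two 1's and two 0's on the diagonal
A = B @ D @ np.linalg.inv(B)             # idempotent, but not symmetric in general

dim_ker = n - np.linalg.matrix_rank(A)               # dim ker(phi_A)
dim_W = n - np.linalg.matrix_rank(A - np.eye(n))     # dim W (the 1-eigenspace)
assert dim_ker + dim_W == n
assert np.allclose(A @ A, A)                         # A^2 = A
assert np.allclose(A @ (A - np.eye(n)), 0)           # x(x-1) annihilates A
```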
For the last part, if $A$ is a real symmetric matrix, then it is self-adjoint. For any $v \in W$ and $u \in \operatorname{null} A$ we have
$$(u,v) = (u, Av) = (A^* u, v) = (Au, v) = (0, v) = 0,$$
where the first equality uses $Av = v$, the second uses self-adjointness ($A^* = A$), and the last follows by bilinearity of the standard inner product.
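For a concrete illustration of the symmetric case, one can build an orthogonal projection $A = QQ^{\mathsf T}$ from a matrix $Q$ with orthonormal columns (a standard construction; the dimensions and random seed below are arbitrary choices of mine) and check that a vector fixed by $A$ is orthogonal to one annihilated by $A$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
Q, _ = np.linalg.qr(rng.standard_normal((n, 2)))  # orthonormal basis of a 2-dim subspace
A = Q @ Q.T                                       # symmetric idempotent: orthogonal projection

v = Q[:, 0]                 # v lies in W, since A v = v
u = rng.standard_normal(n)
u = u - A @ u               # (I - A)u lies in ker(A)

assert np.allclose(A @ v, v)      # v is fixed by A
assert np.allclose(A @ u, 0)      # u is annihilated by A
assert abs(u @ v) < 1e-10         # W is orthogonal to ker(A)
```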
linear-algebra linear-transformations jordan-normal-form idempotents
asked Aug 1 at 18:11 by misogrumpy
edited Aug 1 at 18:25 by Bernard
1 Answer
For your first question: if you define a linear map by $\phi(v) = \phi_A(v) = Av$ for $v \in \mathbb{R}^n$, then you essentially define $\phi$ to be the linear map whose matrix representation in the standard basis is $A$.
To see this, just note that $\phi(e_i)$ is the $i$-th column of $A$, where $e_i$ is the $i$-th standard basis vector.
This $\phi$ is then uniquely determined. You can of course define some $\psi$ as the map represented by $A$ relative to some other basis $\beta$. This $\psi$ will also be uniquely determined, but in general it need have nothing in common with $\phi$.
Note further that the set $W$, the set of fixed points of $\phi$, is precisely the eigenspace for the eigenvalue $1$. With the assumption that $\dim(\ker(\phi)) + \dim(W) = n$ we can infer that
$$
\mathbb{R}^n = \ker(\phi) \oplus W.
$$
To see this, note that $\ker(\phi) \cap W = \{\mathbf{0}\}$: if $v \in \ker(\phi) \cap W$, then $\mathbf{0} = \phi(v) = v$. The rest follows from the dimension formula for sums of subspaces:
$$
\dim(U_1 + U_2) = \dim(U_1) + \dim(U_2) - \dim(U_1 \cap U_2).
$$
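The dimension formula itself can be verified numerically for random subspaces by representing each subspace as the column span of a matrix and computing dimensions as ranks (a sketch assuming NumPy; the ambient dimension and subspace sizes are arbitrary):

```python
import numpy as np

# Check dim(U1 + U2) = dim(U1) + dim(U2) - dim(U1 ∩ U2) for random subspaces of R^5.
rng = np.random.default_rng(3)
n = 5
U1 = rng.standard_normal((n, 3))   # columns span a 3-dim subspace (almost surely)
U2 = rng.standard_normal((n, 3))

rank = np.linalg.matrix_rank
# dim(U1 + U2): rank of the concatenated spanning sets.
dim_sum = rank(np.hstack([U1, U2]))
# dim(U1 ∩ U2): a vector in the intersection satisfies U1 @ a = U2 @ b,
# i.e. (a, b) lies in the null space of [U1 | -U2]; since U1 and U2 have full
# column rank, (a, b) -> U1 @ a maps that null space bijectively onto U1 ∩ U2.
M = np.hstack([U1, -U2])
dim_inter = M.shape[1] - rank(M)
assert dim_sum == rank(U1) + rank(U2) - dim_inter
```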
Note that this in particular means that we can diagonalize $\phi$ with values $0$ and $1$: choose a basis $(b_1, \dots, b_r)$ for $\ker(\phi)$ and a basis $(b'_1, \dots, b'_{n-r})$ for $W$; together they form a basis $(b_1, \dots, b_r, b'_1, \dots, b'_{n-r})$ of $\mathbb{R}^n$, since $\ker(\phi)$ and $W$ are in direct sum equal to $\mathbb{R}^n$. The matrix of $\phi$ in this choice of basis is
$$
\mathrm{diag}(0, \dots, 0, 1, \dots, 1)
$$
and is indeed idempotent; thus we identify (as we could have earlier) $\phi$ as a projection of $\mathbb{R}^n$.
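To make the change of basis concrete, the following sketch (NumPy-based; the helper `null_basis` and the example matrix are mine, not part of the answer) assembles a basis from $\ker(\phi)$ followed by $W$ and checks that conjugating by it produces exactly $\mathrm{diag}(0, \dots, 0, 1, \dots, 1)$:

```python
import numpy as np

def null_basis(M, tol=1e-10):
    """Orthonormal basis of ker(M), computed via the SVD."""
    _, s, Vt = np.linalg.svd(M)
    rank = int((s > tol).sum())
    return Vt[rank:].T

rng = np.random.default_rng(2)
n = 4
B = rng.standard_normal((n, n))
A = B @ np.diag([0.0, 0.0, 1.0, 1.0]) @ np.linalg.inv(B)  # an idempotent example

K = null_basis(A)                 # basis (b_1, ..., b_r) of ker(phi)
Wb = null_basis(A - np.eye(n))    # basis (b'_1, ..., b'_{n-r}) of W
S = np.hstack([K, Wb])            # change-of-basis matrix: ker basis first
D = np.linalg.inv(S) @ A @ S
assert np.allclose(D, np.diag([0.0, 0.0, 1.0, 1.0]), atol=1e-8)
```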
Consequently the minimal polynomial is $X(X-1)$, as you rightly remarked. I just wanted to provide a different viewpoint on how to arrive at the minimal polynomial, via diagonalization and direct sums, as it fits perfectly into this projection scenario. Note that, up to permutation of the basis, this diagonal matrix is its Jordan normal form.
Note that a non-trivial projection always has $X(X-1)$ as its minimal polynomial and may thus always be diagonalized with such a matrix representation.
Your last remark concerning the orthogonality of $W$ and $\ker(\phi)$ is perfectly fine.
Awesome. What you said about $\phi$ being represented by $A$ with respect to the standard basis makes sense. Would it be wrong to say that $\ker \phi = \operatorname{null} A$? Then, since $\mathbb{R}^n = \ker \phi \oplus W$, the eigenspaces corresponding to the eigenvalues $0$ and $1$ together yield a basis of eigenvectors (as opposed to generalized eigenvectors), so the matrix is diagonalizable. The note about the intersection being trivial is important! I should have realized to account for that. Thanks again!
– misogrumpy
Aug 1 at 23:29
No, $\ker \phi$ and $\operatorname{null} \phi$ are just two different ways of writing the same space.
– zzuussee
Aug 2 at 8:14
answered Aug 1 at 19:07 by zzuussee
edited Aug 1 at 19:15