Geometrical visualization of Tensors
My question is about tensors. I have recently spent some time studying the various definitions of tensors and some tensor calculus. What I am still missing is an intuitive way to represent tensors, and I really need one to understand subjects like General Relativity. So I would like to present a way of visualizing tensors that I found on the internet, and ask you to check whether it is universally valid, or simply wrong or imprecise. Please keep in mind that I am not a mathematician but a physicist, so forgive me if I am not rigorous.
I will begin with what I have understood about vectors and their representation.
A vector $\vec{v}$ is an absolute object: it does not depend on anything except, possibly, time, and it can be pictured as an arrow in space (a 3D vector, for example). When you choose a basis $\vec{e}_1, \dots, \vec{e}_n$ you can represent the vector $\vec{v}$ in two different ways (which coincide if the basis is orthonormal).
The first representation is obtained by counting how many of the basis vectors you have to add to obtain $\vec{v}$ (parallelogram rule); the coefficients of this sum are written with upper indices as $v^1, \dots, v^n$ (the contravariant components).
The second representation is obtained by taking the orthogonal projections of $\vec{v}$ onto the basis vectors; these projections are written with lower indices as $v_1, \dots, v_n$ (the covariant components).
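A minimal numerical sketch of these two representations (using a made-up non-orthonormal basis of $\mathbb{R}^2$, purely for illustration):

```python
import numpy as np

# A non-orthonormal basis of R^2 (made-up numbers, purely for illustration)
e1 = np.array([1.0, 0.0])
e2 = np.array([1.0, 1.0])            # not orthogonal to e1
E = np.column_stack([e1, e2])        # basis vectors as columns

v = np.array([2.0, 3.0])             # the "absolute" arrow, in Cartesian coordinates

# Contravariant components: coefficients in v = v^1 e1 + v^2 e2 (parallelogram rule)
v_up = np.linalg.solve(E, v)         # [-1., 3.]

# Covariant components: projections of v onto the basis vectors
v_down = np.array([v @ e1, v @ e2])  # [2., 5.]

# The metric of the basis relates the two: v_mu = g_{mu nu} v^nu
g = E.T @ E                          # g_{mu nu} = e_mu . e_nu
print(np.allclose(g @ v_up, v_down))  # True
```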
Here comes my real question: is it correct to say that a tensor $T$, say a rank-2 tensor, is an absolute object that can be seen as the "union" of two vectors $\vec{v}$ and $\vec{w}$, and that it can be represented in four different ways by a matrix whose elements are products of the components of $\vec{v}$ and $\vec{w}$ in covariant or contravariant form, namely
$$T^{\mu\nu} = v^\mu \cdot w^\nu$$
$$T^{\mu}{}_{\nu} = v^\mu \cdot w_\nu$$
$$T_{\mu}{}^{\nu} = v_\mu \cdot w^\nu$$
$$T_{\mu\nu} = v_\mu \cdot w_\nu$$
so that these four matrices are just four different representations of the same object in a chosen basis, just as the covariant and contravariant representations of $\vec{v}$ are two ways of describing the same arrow in space?
If this "model" is not correct, what are the cases in which it can hold, if there are?
I stress that I am not looking for a formal definition of tensors, just a simple way to represent them without losing any of their properties.
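To make the proposed identification concrete, here is a minimal sketch of the case it describes (a simple, i.e. outer-product, tensor), with a made-up metric $g_{\mu\nu}$; it shows that the four matrices above are indeed related by raising and lowering indices with the metric:

```python
import numpy as np

# Made-up metric and components, purely for illustration
g = np.array([[1.0, 1.0],
              [1.0, 2.0]])             # g_{mu nu} of a non-orthonormal basis

v_up = np.array([-1.0, 3.0])           # v^mu
w_up = np.array([2.0, 1.0])            # w^nu
v_down = g @ v_up                      # v_mu = g_{mu rho} v^rho
w_down = g @ w_up                      # w_nu = g_{nu sigma} w^sigma

T_upup     = np.outer(v_up, w_up)      # T^{mu nu}
T_updown   = np.outer(v_up, w_down)    # T^mu_nu
T_downup   = np.outer(v_down, w_up)    # T_mu^nu
T_downdown = np.outer(v_down, w_down)  # T_{mu nu}

# All four matrices describe the same (simple) tensor: they are related
# by raising/lowering indices with the metric, e.g.
# T_{mu nu} = g_{mu rho} g_{nu sigma} T^{rho sigma}
print(np.allclose(T_downdown, g @ T_upup @ g.T))  # True
print(np.allclose(T_updown, T_upup @ g.T))        # True
print(np.allclose(T_downup, g @ T_upup))          # True
```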
Tags: intuition, tensors, visualization
asked Jul 29 at 13:31 by Andrea
Remark: Not all $2$-tensors are given by two vectors. Such tensors are called simple, and a general tensor is only a sum of simple tensors.
– Jan Bohr, Jul 29 at 13:45
What about the tensors used in General Relativity (metric, stress-energy, ...), are they simple tensors? Thank you
– Andrea, Jul 30 at 6:55
No. In dimension greater than $1$, simple $2$-tensors are never definite. Hence the metric tensor cannot be simple.
– Jan Bohr, Jul 30 at 7:39
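A small numerical illustration of these remarks (made-up vectors, only to show the rank argument): a simple $2$-tensor $v \otimes w$ has matrix rank $1$, so in dimension greater than $1$ it cannot be definite, whereas the Euclidean metric has full rank and therefore must be written as a sum of simple tensors.

```python
import numpy as np

# A simple 2-tensor v (x) w has matrix rank 1 (made-up vectors)
v = np.array([1.0, 2.0])
w = np.array([3.0, -1.0])
simple = np.outer(v, w)
print(np.linalg.matrix_rank(simple))   # 1  -> cannot be definite in dimension > 1

# The 2D Euclidean metric has full rank, so it is not simple,
# but it is a sum of simple tensors: e1 (x) e1 + e2 (x) e2
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
metric = np.outer(e1, e1) + np.outer(e2, e2)
print(np.linalg.matrix_rank(metric))   # 2
print(np.allclose(metric, np.eye(2)))  # True
```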