Geometrical visualization of Tensors

My question is about tensors. I have recently spent some time studying the various definitions of tensors and some tensor calculus. What I am still missing is an intuitive way to picture tensors, which I really need in order to understand subjects like General Relativity. So I would like to present a way of visualizing tensors that I found on the internet, and ask you to check whether it is general and accurate, or simply wrong or imprecise. Please keep in mind that I am a physicist, not a mathematician, so forgive me if I am not rigorous.



I will begin with what I have understood about vectors and their representation.



A vector $\vec{v}$ is an absolute object: it does not depend on anything except, possibly, time, and it can be pictured as an arrow in space (a 3D vector, for example). When you choose a basis $\vec{e}_1,\dots,\vec{e}_n$ you can represent the vector $\vec{v}$ in two different ways (which coincide if the basis is orthonormal).
The first representation is obtained by counting how many copies of each basis vector you have to add to obtain $\vec{v}$ (parallelogram rule); the coefficients of the sum are written with upper indices as $v^1,\dots,v^n$.
The second representation is obtained by taking the orthogonal projections of $\vec{v}$ onto the basis vectors; these projections are written with lower indices as $v_1,\dots,v_n$.
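The two sets of components above can be computed concretely. The following is a minimal sketch in NumPy, with an arbitrarily chosen non-orthonormal 2D basis (the vectors `e1`, `e2`, `v` are my own illustrative choices, not from the question):

```python
import numpy as np

# Hypothetical 2D example with a non-orthonormal basis.
e1 = np.array([1.0, 0.0])
e2 = np.array([1.0, 1.0])
v = np.array([3.0, 2.0])

# Contravariant components v^i: solve v = x1*e1 + x2*e2 (parallelogram rule).
B = np.column_stack([e1, e2])
v_up = np.linalg.solve(B, v)

# Covariant components v_i: dot products (projections) v . e_i.
v_down = np.array([v @ e1, v @ e2])

print(v_up)    # [1. 2.]
print(v_down)  # [3. 5.]
```

Since the basis here is not orthonormal, the two representations differ, as the question anticipates; replacing `e1`, `e2` with the standard orthonormal basis makes `v_up` and `v_down` coincide.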



Here comes my real question: is it correct to say that a tensor $T$, say a rank-2 tensor, is an absolute object that can be seen as the "union" of two vectors, $\vec{v}$ and $\vec{w}$, and that it can be represented in four different ways by a matrix whose elements are products of the components of $\vec{v}$ and $\vec{w}$ in covariant or contravariant form, namely
$$T^{\mu\nu} = v^{\mu}\, w^{\nu}$$
$$T^{\mu}{}_{\nu} = v^{\mu}\, w_{\nu}$$
$$T_{\mu}{}^{\nu} = v_{\mu}\, w^{\nu}$$
$$T_{\mu\nu} = v_{\mu}\, w_{\nu}$$
so that these four matrices are just four different representations of the same object in a chosen basis, just as the covariant and contravariant representations of $\vec{v}$ are two ways of seeing the same arrow in space?
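The four component matrices above, and the fact that the metric converts between them, can be checked numerically. A sketch in NumPy follows; the basis and the vectors `v_up`, `w_up` are arbitrary illustrative choices, not from the question:

```python
import numpy as np

# Non-orthonormal 2D basis, so the metric g differs from the identity
# and the four index placements give genuinely different matrices.
e1 = np.array([1.0, 0.0])
e2 = np.array([1.0, 1.0])
E = np.stack([e1, e2])          # rows are the basis vectors
g = E @ E.T                     # metric g_{ij} = e_i . e_j

v_up = np.array([2.0, 3.0])     # contravariant components v^mu
w_up = np.array([1.0, -1.0])    # contravariant components w^nu
v_down = g @ v_up               # v_mu = g_{mu a} v^a
w_down = g @ w_up               # w_nu = g_{nu b} w^b

T_upup     = np.outer(v_up,   w_up)    # T^{mu nu}
T_updown   = np.outer(v_up,   w_down)  # T^mu_nu
T_downup   = np.outer(v_down, w_up)    # T_mu^nu
T_downdown = np.outer(v_down, w_down)  # T_{mu nu}

# Lowering both indices of T^{mu nu} with g reproduces T_{mu nu}:
assert np.allclose(g @ T_upup @ g, T_downdown)
```

The final assertion is the component form of $T_{\mu\nu} = g_{\mu\alpha}\, g_{\nu\beta}\, T^{\alpha\beta}$, i.e. the four matrices are related by the metric, as the "four representations of one object" picture suggests for this simple (outer-product) case.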



If this "model" is not correct in general, in which cases, if any, does it hold?



I stress here that I am not looking for a formal definition of tensors but just a simple way to represent them without losing any property.







    Remark: Not all $2$-tensors are given by two vectors. Tensors of that form are called simple, and a general tensor is only a sum of simple tensors.
    – Jan Bohr
    Jul 29 at 13:45
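A quick numerical illustration of this remark (a sketch in NumPy; the vectors are arbitrary): the component matrix of a simple tensor is an outer product, which always has matrix rank at most $1$, so any component matrix of rank $\ge 2$ cannot come from a single pair of vectors.

```python
import numpy as np

# A simple 2-tensor: its component matrix is an outer product, rank 1.
v = np.array([2.0, 3.0])
w = np.array([1.0, -1.0])
simple = np.outer(v, w)
print(np.linalg.matrix_rank(simple))    # 1

# The 2x2 identity has rank 2, so it is not simple;
# it is, however, a sum of two simple tensors:
identity = np.outer([1.0, 0.0], [1.0, 0.0]) + np.outer([0.0, 1.0], [0.0, 1.0])
print(np.linalg.matrix_rank(identity))  # 2
```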










  • What about the tensors used in General Relativity (metric, energy-stress...), are they simple tensors? Thank you
    – Andrea
    Jul 30 at 6:55











  • No. In dimension greater than $1$, simple $2$-tensors are never definite. Hence the metric tensor cannot be simple.
    – Jan Bohr
    Jul 30 at 7:39















asked Jul 29 at 13:31 by Andrea
