What is the motivation behind defining tensor product?

In my undergraduate math course we have tensor calculus. I am not seeing the motivation for the definition of the tensor product, and I am losing interest in the topic. Can anyone explain why the tensor product is defined the way it is, or provide a link?



I am using the definition of the tensor product as it is defined here.
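For reference (the linked definition may differ in form), the tensor product $V \otimes W$ of two vector spaces is usually characterized by its universal property: there is a bilinear map $\otimes \colon V \times W \to V \otimes W$ through which every bilinear map factors uniquely,
$$ \operatorname{Hom}(V \otimes W,\, Z) \;\cong\; \operatorname{Bil}(V \times W;\, Z), \qquad \tilde b \mapsto \tilde b \circ \otimes, $$
so bilinear maps out of $V \times W$ become linear maps out of $V \otimes W$.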







asked Jul 25 at 15:33 by Ken Ono (edited Jul 25 at 15:56)

  • Well, there is more than one definition possible. Maybe you can give us a hint as to which one you are working with?
    – Thomas
    Jul 25 at 15:39






  • It represents an interesting functor....
    – Lord Shark the Unknown
    Jul 25 at 15:40











  • It is possible to think about tensors without using tensor products. However, tensor products allow us to encode "multilinear" tensors as linear maps.
    – Omnomnomnom
    Jul 25 at 15:40







  • See here.
    – Pedro Tamaroff♦
    Jul 25 at 16:08










  • Thank you @PedroTamaroff
    – Ken Ono
    Jul 25 at 16:24














2 Answers

Answer (score 2, accepted)

I would like to add a real example of relevant tensor-product spaces (from quantum theory, but simplified). Maybe it is a bit too complicated, but for me it shows the difference between Cartesian products and tensor products best!



A long introduction.



We want to work with continuous functions $f, g \in C^0(\mathbb R)$.
You might want to think of these functions as probability densities, which say something like "the probability of a quantum particle being at point $x$ is $f(x)$". (In this example $C^0(\mathbb R)$ is the 'vector space', and later we will see what $C^0(\mathbb R) \otimes C^0(\mathbb R)$ looks like.)



Now, if $f$ and $g$ are densities for two different systems, say A and B, we might want to ask for the probability that system A is in state $x$ and system B is in state $y$ at the same time. This probability is given by $f(x) \cdot g(y)$.



Now, how many different density distributions for (A, B) exist?



If A and B are independent, then we can simply use something like $C^0(\mathbb R) \times C^0(\mathbb R)$ to describe the densities as two separate functions $f, g$. This space corresponds to two-dimensional densities which
are the product $f(x) \cdot g(y)$ of two functions, for example a function like the one in the following picture.



[Figure: a two-dimensional density of product form $f(x) \cdot g(y)$]



But there are more interesting two-dimensional densities, like this one:
[Figure: a two-dimensional density that is not of product form]



This function is not a product $f(x) \cdot g(y)$; instead it is something more like $f_1(x) \cdot g_1(y) + f_2(x) \cdot g_2(y)$, which does not come from any single pair in $C^0(\mathbb R) \times C^0(\mathbb R)$.



Finally, tensor-product spaces!
This matches perfectly with the definition of tensor-product spaces:



You take vectors from the individual spaces (here $f_i, g_i \in C^0(\mathbb R)$)
and you combine them into a new 'abstract' vector $f_i \otimes g_i \in C^0(\mathbb R) \otimes C^0(\mathbb R)$.
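For reference, the relations that make $\otimes$ bilinear (exactly what the usual quotient construction imposes) are
$$ (f_1 + f_2) \otimes g = f_1 \otimes g + f_2 \otimes g, \qquad f \otimes (g_1 + g_2) = f \otimes g_1 + f \otimes g_2, \qquad (\lambda f) \otimes g = f \otimes (\lambda g) = \lambda\,(f \otimes g). $$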



In this new abstract tensor-product space you can also add two pure vectors and get more complicated vectors, for example as in the second plot:
$$ f_1 \otimes g_1 + f_2 \otimes g_2 \in C^0(\mathbb R) \otimes C^0(\mathbb R) \approx C^0(\mathbb R^2). $$
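Here is a minimal numerical sketch of this distinction (the Gaussian bumps and the grid are hypothetical choices, not taken from the plots above): discretized on a grid, a pure tensor $f \otimes g$ becomes an outer product, i.e. a rank-1 matrix, while a sum of two pure tensors generically has rank 2 and therefore cannot be written as $f(x) \cdot g(y)$ for any single pair of functions.

    import numpy as np

    # Grid and two pairs of bump densities (hypothetical choices, for illustration only).
    x = np.linspace(-3, 3, 200)
    f1 = np.exp(-(x - 1) ** 2)
    g1 = np.exp(-(x - 1) ** 2)
    f2 = np.exp(-(x + 1) ** 2)
    g2 = np.exp(-(x + 1) ** 2)

    # A pure (separable) density f1(x) * g1(y): discretized, an outer product.
    pure = np.outer(f1, g1)
    print(np.linalg.matrix_rank(pure))   # 1: a single product of two functions

    # A sum of two pure tensors, as in the second plot.
    mixed = np.outer(f1, g1) + np.outer(f2, g2)
    print(np.linalg.matrix_rank(mixed))  # 2: not a single product f(x) * g(y)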



Going further.



This example is somewhat trivial and only captures a specific situation, but there are many similar, non-trivial cases where interesting spaces can be seen as tensor-product spaces. Applications are plentiful and can be found (for example) in differential geometry, numerical analysis, computer graphics, measure theory and functional analysis.
Of course, abstract objects like the tensor product are more complicated, and it requires some training to use them in practical situations. As often in math, there is a trade-off between learning a general theory and how to apply it to concrete examples, versus learning only the tools you really need and risking learning the same material twice in different settings without noticing. Both approaches are understandable.






answered Jul 28 at 23:55 by Steffen Plunder

Answer (score 0)

I apologize for the omitted details, but I think this is a good starting picture for an undergrad (an explanation through geometry).



We generalize a $k$-linear map to a $k$-tensor because in a typical problem you must switch between coordinate systems, metrics, etc. Speaking from a geometer's perspective, linear maps hold geometric information, but since these maps are typically defined over vector spaces, you then have to figure out a basis which makes your problem tractable.



Once that is decided, it would be nice if your map also changed appropriately with respect to your transformation, i.e. you would like not to have to start the whole problem over again, picking the correct basis and so forth. The start-over process is long because geometric information is sometimes contained in linear maps which are pairings of vectors and their duals. These objects change differently under a transformation, so starting over is double work.
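To make "change differently" concrete, here is a standard comparison (not spelled out in the original answer): under a change of basis with invertible matrix $P$, the matrix of a linear map (a $(1,1)$-tensor) and the matrix of a bilinear form (a $(0,2)$-tensor) transform by different rules,
$$ A' = P^{-1} A P \qquad \text{vs.} \qquad B' = P^{\mathsf T} B P, $$
so a recipe that fixes up one of them after a coordinate change will not fix up the other; the tensor formalism tracks both kinds of index at once.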



Thus, tensor calculus emerges as a field which links notation with computation, allowing you to carry out much longer computations at a very fast rate. Explaining which tensors are interesting (i.e. which things you would like to "tensorize") is a different question, but I hope this helps.
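As a small illustration of notation linked with computation (a hypothetical $(1,2)$-tensor, not taken from the answer): the index expression $u^i = T^i{}_{jk}\, v^j w^k$ translates directly into a single numpy call.

    import numpy as np

    # Hypothetical components of a (1,2)-tensor T^i_{jk} and vectors v^j, w^k.
    rng = np.random.default_rng(0)
    T = rng.random((3, 3, 3))
    v = rng.random(3)
    w = rng.random(3)

    # The contraction u^i = T^i_{jk} v^j w^k, written once in index notation:
    u = np.einsum('ijk,j,k->i', T, v, w)

    # The same computation as explicit nested sums:
    u_loops = np.array([sum(T[i, j, k] * v[j] * w[k]
                            for j in range(3) for k in range(3))
                        for i in range(3)])
    assert np.allclose(u, u_loops)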



I'll also add a funny joke that was told to me: "Differential geometry started off as the study of things which are invariant under rotation, but has turned into the study of things which are invariant under notation."






answered Jul 25 at 17:54 by Faraad Armwood (edited Jul 26 at 1:38)