Proving that uncountable matrix multiplication is associative

I have been looking for uncountably large structures that generalize the idea of matrix multiplication, and I noticed the following pattern.



Given functions $M_1(x,y), M_2(x,y)$ defined on the unit square $[0,1] \times [0,1]$, one can define their "matrix product" as



$$ M_3(x,y) = \int_0^1 M_1(s, y)\, M_2(x, 1-s)\, ds = (M_1 M_2)(x,y). $$



If we approximate this integral with Riemann sums, the approximations are computationally equivalent to multiplying matrices of ever higher order (to produce a third matrix), but it is not clear that associativity is preserved in the limit as the sums become an integral.
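For concreteness, a midpoint-rule discretization of this product is an ordinary $n\times n$ matrix product, with the $1-s$ argument absorbed into how the second factor is sampled. A minimal numerical sketch in Python/NumPy, with the kernels $M_1, M_2$ chosen here purely for illustration:

```python
import numpy as np

n = 200
s = (np.arange(n) + 0.5) / n          # midpoints of a uniform partition of [0, 1]
X, Y = np.meshgrid(s, s)              # X[i, j] = s_j, Y[i, j] = s_i

# Illustrative smooth kernels on the unit square (arbitrary choices).
M1 = lambda x, y: np.exp(-x) * np.cos(np.pi * y)
M2 = lambda x, y: x + y**2

# Riemann-sum approximation of
#   M3(x, y) = ∫_0^1 M1(s, y) M2(x, 1-s) ds
# on the grid: A[i, k] = M1(s_k, y_i) and B[k, j] = M2(x_j, 1 - s_k),
# so M3 on the grid is just (1/n) * (A @ B).
A = M1(X, Y)                          # rows indexed by y, columns by s
B = M2(X, 1 - Y)                      # rows indexed by s, columns by x
M3 = A @ B / n
```

Refining the partition (increasing `n`) reproduces exactly the "ever higher order matrices" in the paragraph above.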



My work:



For three functions $A, B, C$ I expressed $A(BC)$ and $(AB)C$ in terms of nested integrals, but I am not clear on how to show that the two expressions are equal. One idea was to perturb the definition above to $\int_{-\infty}^{\infty} M_1(s, y)\, M_2(x, -s)\, ds$ and to show associativity for this simpler-looking operation first; it is then easy to show that conjugating this operation by a map sending the open unit square onto the whole plane recovers the original operation.







  • I think you could follow the same formal approach as for showing matrix multiplication is associative, but use Fubini's theorem when you need to switch the order of integrals (instead of sums).
    – Nick Alger
    Jul 19 at 19:45










  • Formal matrix multiplication I prove by showing that $2\times 2$ matrix multiplication is associative, then using induction to show that $2^n\times 2^n$ matrix multiplication is associative for all finite $n$, and finally showing that general $k\times k$ matrix multiplication embeds into $2^{\lceil \log k \rceil}\times 2^{\lceil \log k \rceil}$ multiplication.
    – frogeyedpeas
    Jul 19 at 19:49











  • The countable structure to induct on is what I lack here. And an argument showing that convergent Riemann sums of $A(BC)$ and $(AB)C$ are equivalent feels too tedious to carry out.
    – frogeyedpeas
    Jul 19 at 19:51











  • It might also work to view $M_1$ as defining a linear operator $C[0,1]\to C[0,1]$, $f \mapsto \left(y \mapsto \int_0^1 M_1(t, y)\, f(1-t) \, dt\right)$, and then show that this matrix multiplication corresponds to composition. Then it would remain to show that this mapping of "matrices" to linear operators is injective. (Of course, to fully reproduce the usual argument along these lines from the finite case, we might need to extend the domain of the linear operators to include distributions like $\delta(x - x_0)$.)
    – Daniel Schepler
    Jul 19 at 20:15
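Following up on this suggestion, one can check directly that the product corresponds to operator composition. Writing $T_M f = y \mapsto \int_0^1 M(t,y)\, f(1-t)\,dt$, a short computation (a sketch, assuming enough regularity to swap the integrals) gives

$$
\begin{align}
(T_{M_1} T_{M_2} f)(y)
&= \int_0^1 M_1(t,y)\,(T_{M_2}f)(1-t)\,dt
\\&= \int_0^1 \int_0^1 M_1(t,y)\,M_2(u,1-t)\,f(1-u)\,du\,dt
\\&= \int_0^1 \left(\int_0^1 M_1(t,y)\,M_2(u,1-t)\,dt\right) f(1-u)\,du
= (T_{M_1 M_2} f)(y),
\end{align}
$$

since the inner integral is exactly $(M_1 M_2)(u, y)$.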















asked Jul 19 at 19:38 by frogeyedpeas







1 Answer

Let $A,B,C:[0,1]^2\to \mathbb R$ be matrices.
$$
\begin{align}
((AB)C)(x,y)
&= \int_0^1 (AB)(s,y)\,C(x,1-s)\,ds
\\&=\int_0^1\left(\int_0^1 A(t,y)\,B(s,1-t)\,dt\right)C(x,1-s)\,ds
\\&=\int_0^1\int_0^1 A(t,y)\,B(s,1-t)\,C(x,1-s)\,dt\,ds
\end{align}
$$
$$
\begin{align}
(A(BC))(x,y)
&= \int_0^1 A(s,y)\,(BC)(x,1-s)\,ds
\\&=\int_0^1 A(s,y)\left(\int_0^1 B(t,1-s)\,C(x,1-t)\,dt\right)ds
\\&=\int_0^1\int_0^1 A(s,y)\,B(t,1-s)\,C(x,1-t)\,dt\,ds
\\&\stackrel{\text{Fubini}}{=}\int_0^1\int_0^1 A(s,y)\,B(t,1-s)\,C(x,1-t)\,ds\,dt
\\&\stackrel{s\leftrightarrow t}{=}\int_0^1\int_0^1 A(t,y)\,B(s,1-t)\,C(x,1-s)\,dt\,ds
\\&= ((AB)C)(x,y)
\end{align}
$$
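As a sanity check, a midpoint-rule discretization of this product (with a row flip realizing the $1-s$ argument) is itself exactly associative, since the flip commutes through the ordinary matrix product. A minimal Python/NumPy sketch, with random matrices standing in for sampled kernels:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300

def prod(M, N):
    # Discretization of (MN)(x, y) = ∫_0^1 M(s, y) N(x, 1-s) ds on an
    # n-point midpoint grid: the row flip N[::-1] realizes the 1-s argument.
    return M @ N[::-1, :] / n

A, B, C = (rng.standard_normal((n, n)) for _ in range(3))

lhs = prod(prod(A, B), C)
rhs = prod(A, prod(B, C))
print(np.max(np.abs(lhs - rhs)))      # zero up to floating-point roundoff
```

The algebraic reason mirrors the proof above: flipping the rows of a product flips the rows of the left factor, so both groupings reduce to $A \cdot B[::-1] \cdot C[::-1] / n^2$.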






answered Jul 19 at 22:05 by Mike Earnest






















             
