Proving that uncountable matrix multiplication is associative
I have been looking for uncountably large structures that generalize the idea of matrix multiplication, and I noticed the following pattern.
Given functions $M_1(x,y), M_2(x,y)$ defined on the unit square $[0,1] \times [0,1]$, one can define their "matrix product" as
$$ M_3(x,y) = \int_0^1 M_1(s, y)\, M_2(x, 1-s)\, ds = (M_1 M_2)(x,y). $$
If we approximate this integral with Riemann sums, we find that the approximations are computationally equivalent to multiplying matrices of ever higher order (to produce a third matrix), but it is not clear that associativity is preserved as we take the limit to an integral.
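As a quick sanity check on the Riemann-sum picture, here is a numerical sketch (my own illustration, not part of the original question; the name `kernel_product` and the test kernels are my choices). With a midpoint rule, the discretized product is a matrix product with the right factor's columns flipped, and the discrete operation turns out to be exactly associative, so the approximations behave well:

```python
import numpy as np

# Discretize the product
#   (M1 M2)(x, y) = \int_0^1 M1(s, y) M2(x, 1 - s) ds
# with an n-point midpoint rule.  M[i, j] stores M(x_i, y_j).
n = 128
pts = (np.arange(n) + 0.5) / n   # midpoints s_k of [0, 1]; note 1 - s_k = s_{n-1-k}
ds = 1.0 / n

def kernel_product(M1, M2):
    # ds * sum_k M1(s_k, y_j) * M2(x_i, 1 - s_k)  ==  ds * (M2[:, ::-1] @ M1)
    return ds * (M2[:, ::-1] @ M1)

X, Y = np.meshgrid(pts, pts, indexing="ij")   # X[i, j] = x_i, Y[i, j] = y_j
A = np.exp(-(X - Y) ** 2)                     # arbitrary smooth test kernels
B = 2.0 + np.sin(np.pi * X) * np.cos(np.pi * Y)
C = X + Y ** 2

lhs = kernel_product(kernel_product(A, B), C)   # (AB)C
rhs = kernel_product(A, kernel_product(B, C))   # A(BC)
print(np.max(np.abs(lhs - rhs)))                # only floating-point noise
```

The discrete product is exactly associative because reversing columns commutes with left multiplication: `(M @ N)[:, ::-1] == M @ N[:, ::-1]`, so both groupings reduce to `ds**2 * C[:, ::-1] @ B[:, ::-1] @ A`.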
My work:
For three functions $A, B, C$ I expressed $A(BC)$ and $(AB)C$ in terms of nested integrals, but I am not clear on how to show that the two expressions are equal. One idea was to perturb the definition above to $\int_{-\infty}^{\infty} M_1(s, y)\, M_2(x, -s)\, ds$ and to show associativity for this simpler-looking operation; it is then easy to show that composing a map sending the open unit square to the whole plane with this operation, followed by that map's inverse, recovers the earlier operation.
linear-algebra multivariable-calculus
I think you could follow the same formal approach as for showing matrix multiplication is associative, but use Fubini's theorem when you need to switch the order of integrals (instead of sums).
– Nick Alger
Jul 19 at 19:45
Formal matrix multiplication I prove by showing $2\times 2$ matrix multiplication is associative, then using induction to show that $2^n \times 2^n$ matrix multiplication is associative for all finite $n$, and then showing that general $k \times k$ matrix multiplication embeds into $2^{\lceil \log k \rceil} \times 2^{\lceil \log k \rceil}$ multiplication.
– frogeyedpeas
Jul 19 at 19:49
The countable structure to induct on is what I lack here, and an argument showing that convergent Riemann sums of $A(BC)$ and $(AB)C$ are equivalent feels too tedious to carry out.
– frogeyedpeas
Jul 19 at 19:51
It might also work to view $M_1$ as defining a linear operator $C[0,1] \to C[0,1]$, $f \mapsto \left(y \mapsto \int_0^1 M_1(t, y)\, f(1-t)\, dt\right)$, and then show that this matrix multiplication corresponds to composition. Then it would remain to show this mapping of "matrices" to linear operators is injective. (Of course, to fully reproduce the usual argument along these lines from the finite case, we might need to extend the domain of the linear operators to include distributions like $\delta(x - x_0)$.)
– Daniel Schepler
Jul 19 at 20:15
asked Jul 19 at 19:38


frogeyedpeas
1 Answer
Let $A, B, C : [0,1]^2 \to \mathbb{R}$ be matrices.
$$
\begin{align}
((AB)C)(x,y)
&= \int_0^1 (AB)(s,y)\, C(x,1-s)\, ds
\\&= \int_0^1 \left( \int_0^1 A(t,y)\, B(s,1-t)\, dt \right) C(x,1-s)\, ds
\\&= \int_0^1 \int_0^1 A(t,y)\, B(s,1-t)\, C(x,1-s)\, dt\, ds
\end{align}
$$
$$
\begin{align}
(A(BC))(x,y)
&= \int_0^1 A(s,y)\, (BC)(x,1-s)\, ds
\\&= \int_0^1 A(s,y) \left( \int_0^1 B(t,1-s)\, C(x,1-t)\, dt \right) ds
\\&= \int_0^1 \int_0^1 A(s,y)\, B(t,1-s)\, C(x,1-t)\, dt\, ds
\\&\stackrel{\text{Fubini}}{=} \int_0^1 \int_0^1 A(s,y)\, B(t,1-s)\, C(x,1-t)\, ds\, dt
\\&\stackrel{s \,\leftrightarrow\, t}{=} \int_0^1 \int_0^1 A(t,y)\, B(s,1-t)\, C(x,1-s)\, dt\, ds
\\&= ((AB)C)(x,y)
\end{align}
$$
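The Fubini/relabeling step can also be confirmed numerically. The following sketch (my own illustration, not the answerer's) discretizes both iterated integrals with a midpoint rule, where the substitution $u \mapsto 1-u$ becomes a column flip on the grid:

```python
import numpy as np

# Midpoint-rule discretization of the two iterated integrals in the proof.
# M[i, j] stores M(g_i, g_j); since 1 - g_k = g_{n-1-k} for midpoints,
# evaluating at 1 - u amounts to the column flip M[:, ::-1].
n = 64
g = (np.arange(n) + 0.5) / n
ds = 1.0 / n
A = np.cos(np.add.outer(g, g))              # arbitrary smooth test kernels
B = np.exp(-np.subtract.outer(g, g) ** 2)
C = 1.0 + np.outer(g, g)
Bf, Cf = B[:, ::-1], C[:, ::-1]             # grids of B(., 1-.) and C(., 1-.)

# ((AB)C)(x, y) = sum over t, s of A(t, y) B(s, 1-t) C(x, 1-s)
lhs = ds**2 * np.einsum("ty,st,xs->xy", A, Bf, Cf)
# (A(BC))(x, y) = sum over t, s of A(s, y) B(t, 1-s) C(x, 1-t)
rhs = ds**2 * np.einsum("sy,ts,xt->xy", A, Bf, Cf)

print(np.max(np.abs(lhs - rhs)))            # zero up to floating-point rounding
```

The two `einsum` calls contract the same three arrays with the summation indices $s$ and $t$ relabeled, which is exactly the finite analogue of the Fubini-plus-relabeling step above.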
answered Jul 19 at 22:05


Mike Earnest