Linear algebra question with rank-1 matrices

Suppose I have a column vector $v \in \mathbb{R}^n$ and I want to find solutions $X \in \mathbb{R}^{n\times n}$ such that
$$
X^T v v^T X = v v^T.
$$
The solutions I found are
$$
X = \frac{v v^T}{v^T v}, \qquad X = \pm I.
$$
However, suppose I have two column vectors $u,v \in \mathbb{R}^n$ which may be assumed to be orthogonal ($u^T v = 0$). Then do there exist solutions $X$ to
$$
X^T \left( u u^T + v v^T \right) X = u u^T + v v^T?
$$
Do any solutions exist besides the trivial $X = \pm I$?

asked Jul 30 at 21:21 – vibe

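A quick numerical sanity check of the rank-1 case (an added sketch, not part of the original post; NumPy and a random $v$ are assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
v = rng.standard_normal((n, 1))
M = v @ v.T                       # the rank-1 matrix v v^T

def solves(X):
    """Check X^T (v v^T) X = v v^T up to floating-point error."""
    return np.allclose(X.T @ M @ X, M)

P = (v @ v.T) / (v.T @ v).item()  # projection onto span(v)
print(solves(P), solves(np.eye(n)), solves(-np.eye(n)))  # True True True
```
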
  • I don't understand how you obtain $$X^T v v^T X = v v^T \implies X=\pm I$$
    – gimusi, Jul 30 at 21:25

  • @gimusi The OP meant that $X=\pm I$ are solutions to $X^\top v v^\top X=v v^\top$.
    – Batominovski, Jul 30 at 21:25

  • @Batominovski But it is not the only solution I guess; we can also take $X^T$ to be a projection matrix onto $v$.
    – gimusi, Jul 30 at 21:27

  • @gimusi The OP never claimed that whatever he found were the only solutions. Plus, if $X$ is a solution, then $X+N^\top$ is a solution, where $N$ is a matrix with $v\in\ker(N)$.
    – Batominovski, Jul 30 at 21:29

  • @Batominovski: Since $u u^T + v v^T$ is rank 2 and therefore not invertible for $n > 2$, does this mean I can't write down a closed-form expression for its pseudoinverse?
    – vibe, Jul 30 at 21:29

2 Answers
Accepted answer (Batominovski, answered Jul 30 at 21:43, edited Jul 31 at 7:13)

Set
$$A:=\frac{u\,u^\top}{u^\top u}+\frac{v\,v^\top}{v^\top v}\,.$$
Because $u^\top v=0$ and $v^\top u=0$, as $u$ and $v$ are orthogonal, we conclude that $X=A$ is a solution to
$$X^\top\left(u\,u^\top+v\,v^\top\right)X=u\,u^\top+v\,v^\top\,.$$



If $N$ is an $n$-by-$n$ matrix such that $u$ and $v$ are in $\ker(N)$, then $X=A+N^\top$ is also a solution. For $n\geq 3$, there are infinitely many such $N$. Furthermore, $X=I+N^\top$ is also a nontrivial solution.
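
A short numerical check of both observations above (an added sketch; NumPy is assumed, and the orthogonalization of $v$ against $u$ and the construction of $N$ are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))
v = v - u * (u.T @ v).item() / (u.T @ u).item()    # make v orthogonal to u

J = u @ u.T + v @ v.T
A = (u @ u.T) / (u.T @ u).item() + (v @ v.T) / (v.T @ v).item()
print(np.allclose(A.T @ J @ A, J))                 # True: X = A works

# Build an N with u, v in ker(N): its rows live in the complement of span(u, v).
comp = np.linalg.svd(np.hstack([u, v]))[0][:, 2:]  # orthogonal complement basis
N = rng.standard_normal((n, n - 2)) @ comp.T
print(np.allclose(N @ u, 0), np.allclose(N @ v, 0))  # True True
for X in (A + N.T, np.eye(n) + N.T):
    print(np.allclose(X.T @ J @ X, J))             # True: A + N^T and I + N^T work
```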




For a fixed positive integer $m\leq n$, let $u_1,u_2,\ldots,u_m$ be (not necessarily orthogonal) linearly independent elements of $\mathbb{R}^n$. Write $$J:=\sum_{j=1}^m u_j\,u_j^\top\,.$$
I shall find all solutions $X\in\text{Mat}_{n\times n}(\mathbb{R})$ such that
$$X^\top J\,X=J\,.$$



Let $e_1,e_2,\ldots,e_n$ be the standard basis vectors of $\mathbb{R}^n$. Choose an invertible matrix $V\in\text{GL}_n(\mathbb{R})$ such that $V\,u_j=e_j$ for $j=1,2,\ldots,m$. Define the matrix $K$ to be $$K:=\sum_{j=1}^m e_j\,e_j^\top=V\left(\sum_{j=1}^m u_j\,u_j^\top\right)V^\top=V\,J\,V^\top\,.$$
Setting $Y:=\left(V^\top\right)^{-1} X\,V^\top$, we see that $X^\top J\,X=J$ if and only if $$Y^\top K\,Y=K\,.$$
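
An added sketch of this change of variables (not from the original answer; NumPy is assumed, and $V$ is built by completing $u_1,\ldots,u_m$ to a basis $B$ and taking $V=B^{-1}$):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 6, 3
U = rng.standard_normal((n, m))     # columns u_1, ..., u_m (generic, hence independent)
J = U @ U.T                         # J = sum_j u_j u_j^T

# Complete the u_j to a basis B and take V = B^{-1}, so that V u_j = e_j.
B = np.hstack([U, rng.standard_normal((n, n - m))])
V = np.linalg.inv(B)
print(np.allclose(V @ U, np.eye(n)[:, :m]))        # V u_j = e_j

K = np.zeros((n, n))
K[:m, :m] = np.eye(m)                              # K = sum_j e_j e_j^T
print(np.allclose(V @ J @ V.T, K))                 # K = V J V^T

# With Y = (V^T)^{-1} X V^T, the defect X^T J X - J is congruent to Y^T K Y - K.
X = rng.standard_normal((n, n))
Y = np.linalg.inv(V.T) @ X @ V.T
lhs = X.T @ J @ X - J
rhs = np.linalg.inv(V) @ (Y.T @ K @ Y - K) @ np.linalg.inv(V.T)
print(np.allclose(lhs, rhs))                       # True: one vanishes iff the other does
```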



Write
$$Y=\begin{bmatrix}P&Q\\R&S\end{bmatrix}\,,$$
where $P\in\text{Mat}_{m\times m}(\mathbb{R})$, $Q\in\text{Mat}_{m\times (n-m)}(\mathbb{R})$, $R\in\text{Mat}_{(n-m)\times m}(\mathbb{R})$, and $S\in\text{Mat}_{(n-m)\times(n-m)}(\mathbb{R})$. Also, $$K=\begin{bmatrix}I_{m\times m}&0_{m\times (n-m)}\\0_{(n-m)\times m}&0_{(n-m)\times (n-m)}\end{bmatrix}\,,$$
where $I_{k\times k}$ is the $k$-by-$k$ identity matrix and $0_{r\times s}$ is the $r$-by-$s$ zero matrix.
Thus, the condition $Y^\top K\,Y=K$ means that
$$\begin{aligned}
\begin{bmatrix}I_{m\times m}&0_{m\times (n-m)}\\0_{(n-m)\times m}&0_{(n-m)\times (n-m)}\end{bmatrix}
&=K=Y^\top K\,Y
\\
&=
\begin{bmatrix} P^\top& R^\top\\Q^\top &S^\top\end{bmatrix}\begin{bmatrix}I_{m\times m}&0_{m\times (n-m)}\\0_{(n-m)\times m}&0_{(n-m)\times (n-m)}\end{bmatrix}\begin{bmatrix}P&Q\\ R&S\end{bmatrix}
\\
&=\begin{bmatrix} P^\top& R^\top\\Q^\top &S^\top\end{bmatrix}\begin{bmatrix} P&Q\\ 0_{(n-m)\times m}&0_{(n-m)\times (n-m)}\end{bmatrix}=\begin{bmatrix}P^\top P&P^\top Q\\ Q^\top P &Q^\top Q\end{bmatrix}
\end{aligned}\,.$$
That is, $Q^\top Q=0_{(n-m)\times (n-m)}$, which forces $Q=0_{m\times (n-m)}$, since every column of $Q$ has zero norm. Furthermore, we have
$$P^\top P =I_{m\times m}\,,\qquad\text{that is, } P\in O_m(\mathbb{R})\,.$$
There are no other conditions on $R$ and $S$.



Since $X=V^\top\,Y\left(V^\top\right)^{-1}$, we conclude that all solutions $X$ to $X^\top J\,X=J$ take the form
$$V^\top\begin{bmatrix}P&0_{m\times(n-m)}\\R&S\end{bmatrix}\left(V^\top\right)^{-1}\,,$$
where $P\in O_m(\mathbb{R})$ is a real orthogonal matrix, $R\in\text{Mat}_{(n-m)\times m}(\mathbb{R})$, and $S\in\text{Mat}_{(n-m)\times (n-m)}(\mathbb{R})$.

The solution set is therefore a smooth real manifold, which is diffeomorphic to
$$O_m(\mathbb{R})\times \mathbb{R}^{m(n-m)+(n-m)^2}\,.$$
Hence, the solution set is of dimension $$\frac{m(m-1)}{2}+m(n-m)+(n-m)^2=n(n-m)+\frac{m(m-1)}{2}\geq \frac{n(n-1)}{2}\,.$$
There are therefore infinitely many nontrivial solutions when $n\geq2$.
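
An added numerical check of the final parametrization (NumPy assumed; $P$ is a random orthogonal block from a QR factorization, $R$ and $S$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 6, 3
U = rng.standard_normal((n, m))
J = U @ U.T

B = np.hstack([U, rng.standard_normal((n, n - m))])   # complete the u_j to a basis
V = np.linalg.inv(B)                                  # V u_j = e_j

P, _ = np.linalg.qr(rng.standard_normal((m, m)))      # random orthogonal m x m block
R = rng.standard_normal((n - m, m))
S = rng.standard_normal((n - m, n - m))
Y = np.block([[P, np.zeros((m, n - m))], [R, S]])

X = V.T @ Y @ np.linalg.inv(V.T)                      # X = V^T [[P,0],[R,S]] (V^T)^{-1}
print(np.allclose(X.T @ J @ X, J))                    # True for any such P, R, S
```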






  • Then can we conclude that all the solutions are the projection matrix onto span(u,v)?
    – gimusi, Jul 30 at 21:50

  • This is great, and would generalize nicely to any number of rank-1 matrices. I am also curious what other solutions could exist.
    – vibe, Jul 30 at 21:59

  • Interesting solution. Since in this case the $u_j$ are orthogonal, I guess you can choose $V = I$ to simplify the solution a little.
    – vibe, Jul 31 at 17:26

  • Also in the case $m = 1$, we know that $X = u u^T / (u^T u)$ is a solution, but this would not necessarily have $0$ in the upper right block of the matrix?
    – vibe, Jul 31 at 17:29

  • You can't always choose $V=I$ even when the $u_j$'s are orthogonal. Work on the small cases to see that $V=I$ doesn't usually work.
    – Batominovski, Jul 31 at 17:32
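
Regarding the last two comments, a tiny added illustration (NumPy assumed, example vectors are mine) of why $V=I$ generally fails even for orthogonal $u_j$: the condition $V u_j=e_j$ forces $V$ to rescale.

```python
import numpy as np

# Orthogonal (but not standard-basis) u_j: V = I does not satisfy V u_j = e_j,
# so K = V J V^T fails to be sum_j e_j e_j^T.
u1 = np.array([[2.0], [0.0], [0.0]])
u2 = np.array([[0.0], [3.0], [0.0]])
J = u1 @ u1.T + u2 @ u2.T                     # diag(4, 9, 0)

K_target = np.diag([1.0, 1.0, 0.0])           # sum_j e_j e_j^T
print(np.allclose(np.eye(3) @ J @ np.eye(3).T, K_target))   # False: V = I fails

V = np.diag([0.5, 1.0 / 3.0, 1.0])            # a valid choice: V u_j = e_j
print(np.allclose(V @ J @ V.T, K_target))                    # True
```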

















Answer (gimusi, answered Jul 30 at 21:36, edited Jul 30 at 21:50)

With reference to

$$X^T v v^T X = v v^T,$$

taking $X^T=\frac{vv^T}{v^Tv}$, which is the projection matrix onto $\operatorname{span}(v)$, we have

$$X^T v v^T X = \frac{vv^T}{v^Tv}\, v v^T \,\frac{vv^T}{v^Tv}=vv^T.$$

For

$$X^T \left( u u^T + v v^T \right) X = u u^T + v v^T$$

we can take $X^T$ to be the projection matrix onto $\operatorname{span}(u,v)$.
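
A short numerical check of this answer (an added sketch; NumPy is assumed, and the rank-2 projector is formed via the pseudoinverse of $W=[u\ v]$, one standard expression for the orthogonal projection onto $\operatorname{span}(u,v)$):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))
v = v - u * (u.T @ v).item() / (u.T @ u).item()   # make u, v orthogonal as in the question

# Rank-1 case: X^T = v v^T / (v^T v), the orthogonal projection onto span(v).
P_v = (v @ v.T) / (v.T @ v).item()
print(np.allclose(P_v.T @ (v @ v.T) @ P_v, v @ v.T))   # True

# Rank-2 case: the orthogonal projection onto span(u, v), via the pseudoinverse of W.
W = np.hstack([u, v])
P_uv = W @ np.linalg.pinv(W)
J = u @ u.T + v @ v.T
print(np.allclose(P_uv.T @ J @ P_uv, J))               # True
```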





