Inequality about the linear transformation of vectors

Let $\mathbf{A}$ be a full column-rank matrix with unit $\ell_2$-norm columns, and let $\mathbf{v}_1, \cdots, \mathbf{v}_r$ be vectors such that $\{ \mathbf{A} \mathbf{v}_1, \cdots, \mathbf{A} \mathbf{v}_r \}$ is an orthonormal set. Also, let $c_1, \cdots, c_r$ be real numbers satisfying $c_1^2 + \cdots + c_r^2 = 1$.



In this case, does the following inequality hold true?
\begin{align}
\sum_{i=1}^{r} \left\| \mathbf{A}^T \mathbf{A} \mathbf{v}_i \right\|_2^2
&\ge \frac{r}{\left\| \sum_{i=1}^{r} c_i \mathbf{v}_i \right\|_2^2}
\end{align}



Clearly, the above inequality holds for $r=1$, because



$$\| \mathbf{A}^T \mathbf{A} \mathbf{v}_1 \|_2^2 \cdot \| \mathbf{v}_1 \|_2^2 \ge \langle \mathbf{A}^T \mathbf{A} \mathbf{v}_1, \mathbf{v}_1 \rangle^2 = \| \mathbf{A} \mathbf{v}_1 \|_2^4 = 1,$$
so that $\| \mathbf{A}^T \mathbf{A} \mathbf{v}_1 \|_2^2 \ge 1/\| \mathbf{v}_1 \|_2^2 = 1/\| c_1 \mathbf{v}_1 \|_2^2$ (using $c_1^2 = 1$), which is the claimed bound for $r = 1$.
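
For concreteness, here is a small NumPy sketch of this setup that can be used to test the conjectured bound numerically. The dimensions, the random choice of $\mathbf{A}$, and the QR-based construction of the $\mathbf{v}_i$ are illustrative assumptions, not part of the question.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative sizes: any m >= n >= r with A of full column rank will do.
    m, n, r = 6, 4, 3

    A = rng.standard_normal((m, n))
    A /= np.linalg.norm(A, axis=0)            # unit l2-norm columns

    # Build v_1, ..., v_r with {A v_i} orthonormal: if A = Q R (thin QR) and
    # U has orthonormal columns, then V = R^{-1} U gives A V = Q U.
    Q, R = np.linalg.qr(A)
    U, _ = np.linalg.qr(rng.standard_normal((n, r)))
    V = np.linalg.solve(R, U)
    assert np.allclose((A @ V).T @ (A @ V), np.eye(r))

    c = rng.standard_normal(r)
    c /= np.linalg.norm(c)                    # c_1^2 + ... + c_r^2 = 1

    lhs = np.sum(np.linalg.norm(A.T @ A @ V, axis=0) ** 2)
    rhs = r / np.linalg.norm(V @ c) ** 2      # conjectured lower bound
    print(lhs, rhs, lhs >= rhs)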







asked Jul 28 at 1:22 (edited Jul 28 at 8:58)
user580055



  • Is it a version of Cauchy-Schwarz?
    – Michael Burr
    Jul 28 at 2:43

  • I don’t know exactly; I just encountered this inequality while solving some problems.
    – user580055
    Jul 28 at 8:59

1 Answer

No. Take $A = \begin{bmatrix}1 & 0\\0 & \epsilon\end{bmatrix}$ and $V = A^{-1}$.



Then
$$\sum_{i=1}^2 \|A^TA\mathbf{v}_i\|^2 = 1 + \epsilon^2,$$
but we can set $c_1=1, c_2=0$ to get
$$\frac{r}{\left\| \sum_{i=1}^r c_i \mathbf{v}_i \right\|^2}=2,$$
violating the inequality for $\epsilon$ sufficiently small.
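
A quick NumPy check of this counterexample (the numerical value of $\epsilon$ below is just an illustrative choice):

    import numpy as np

    eps = 1e-3
    A = np.diag([1.0, eps])
    V = np.linalg.inv(A)                      # v_1 = e_1, v_2 = e_2 / eps, so A v_i = e_i

    lhs = np.sum(np.linalg.norm(A.T @ A @ V, axis=0) ** 2)   # equals 1 + eps^2
    c = np.array([1.0, 0.0])                  # c_1 = 1, c_2 = 0
    rhs = 2 / np.linalg.norm(V @ c) ** 2      # r / ||c_1 v_1 + c_2 v_2||^2 with r = 2
    print(lhs, rhs)                           # about 1.000001 and 2.0, so lhs < rhs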



What I believe is true is
$$\sum_{i=1}^r \left\|A^TA\mathbf{v}_i\right\|^2 \geq \frac{1}{\left\| \sum_{i=1}^r c_i \mathbf{v}_i \right\|^2}.$$



This is easy to see when $r$ is equal to the rank of $A$: in that case the left-hand side equals $\operatorname{tr}(AA^T)$, while the right-hand side is bounded above by $\lambda_{\mathrm{max}}(AA^T) \le \operatorname{tr}(AA^T)$. When $r$ is less than the rank of $A$, I believe the same argument will work, this time restricting $AA^T$ to the subspace spanned by the $A\mathbf{v}_i$.
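
The trace identity and the bound in the $r = \operatorname{rank}(A)$ case can be checked numerically; the sketch below reuses the illustrative QR-based construction of the $\mathbf{v}_i$ from above.

    import numpy as np

    rng = np.random.default_rng(1)

    m, n = 5, 3                                # take r = n = rank(A)
    A = rng.standard_normal((m, n))
    A /= np.linalg.norm(A, axis=0)             # unit l2-norm columns

    Q, R = np.linalg.qr(A)
    U, _ = np.linalg.qr(rng.standard_normal((n, n)))
    V = np.linalg.solve(R, U)                  # {A v_i} orthonormal, r = n

    lhs = np.sum(np.linalg.norm(A.T @ A @ V, axis=0) ** 2)
    print(np.isclose(lhs, np.trace(A @ A.T)))  # left-hand side equals tr(A A^T): True

    c = rng.standard_normal(n)
    c /= np.linalg.norm(c)
    rhs = 1 / np.linalg.norm(V @ c) ** 2       # the corrected lower bound above
    print(lhs >= rhs)                          # True: tr(A A^T) >= lambda_max(A A^T) >= rhs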






answered Jul 28 at 8:00
user7530

  • Thanks for your counterexample! What do you think about the scenario where the columns of $A$ are $\ell_2$-normalized?
    – user580055
    Jul 28 at 8:54

  • @user580055 It doesn’t actually change anything; try any matrix with unequal eigenvalues.
    – user7530
    Jul 28 at 16:10