Inequality about the linear transformation of vectors
Let $\mathbf{A}$ be a full column-rank matrix with unit $\ell_2$-norm columns, and let $\mathbf{v}_1, \dots, \mathbf{v}_r$ be vectors such that $\{\mathbf{A}\mathbf{v}_1, \dots, \mathbf{A}\mathbf{v}_r\}$ is orthonormal. Also, let $c_1, \dots, c_r$ be real numbers satisfying $c_1^2 + \dots + c_r^2 = 1$.
In this case, does the following inequality hold?
\begin{align}
\sum_{i=1}^{r} \left\| \mathbf{A}^T \mathbf{A} \mathbf{v}_i \right\|_2^2
&\ge \frac{r}{\left\| \sum_{i=1}^{r} c_i \mathbf{v}_i \right\|_2^2}
\end{align}
Clearly, the above inequality holds for $r = 1$, because $\sum_i c_i \mathbf{v}_i = \pm\mathbf{v}_1$ and, by the Cauchy-Schwarz inequality,
$$\| \mathbf{A}^T \mathbf{A} \mathbf{v}_1 \|_2^2 \cdot \| \mathbf{v}_1 \|_2^2 \ge \langle \mathbf{A}^T \mathbf{A} \mathbf{v}_1, \mathbf{v}_1 \rangle^2 = \| \mathbf{A} \mathbf{v}_1 \|_2^4 = 1.$$
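The $r = 1$ case can be sanity-checked numerically. The sketch below uses NumPy with a randomly chosen $3 \times 2$ matrix; the dimensions and seed are illustrative assumptions, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a random 3x2 full column-rank A with unit l2-norm columns.
A = rng.standard_normal((3, 2))
A /= np.linalg.norm(A, axis=0)

# Choose v_1 so that ||A v_1||_2 = 1, matching the orthonormality hypothesis.
v1 = rng.standard_normal(2)
v1 /= np.linalg.norm(A @ v1)

lhs = np.linalg.norm(A.T @ A @ v1) ** 2
rhs = 1.0 / np.linalg.norm(v1) ** 2  # the r = 1 right-hand side

print(lhs >= rhs - 1e-12)  # True, as the Cauchy-Schwarz argument predicts
```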
linear-algebra eigenvalues-eigenvectors
Is it a version of Cauchy-Schwarz?
– Michael Burr
Jul 28 at 2:43
I don't know exactly; I just encountered this inequality while solving some problems.
– user580055
Jul 28 at 8:59
edited Jul 28 at 8:58
asked Jul 28 at 1:22


user580055
1 Answer
No. Take $A = \begin{bmatrix}1 & 0\\ 0 & \epsilon\end{bmatrix}$ and $V = [\mathbf{v}_1 \; \mathbf{v}_2] = A^{-1}$.
Then
$$\sum_{i=1}^{2} \|A^T A \mathbf{v}_i\|^2 = 1 + \epsilon^2,$$
but we can set $c_1 = 1, c_2 = 0$ to get
$$\frac{r}{\left\| \sum_{i=1}^{r} c_i \mathbf{v}_i \right\|_2^2} = \frac{2}{\|\mathbf{v}_1\|_2^2} = 2,$$
violating the inequality for $\epsilon$ sufficiently small.
What I believe is true is
$$\sum_{i=1}^{r} \left\| A^T A \mathbf{v}_i \right\|^2 \ge \frac{1}{\left\| \sum_{i=1}^{r} c_i \mathbf{v}_i \right\|_2^2}.$$
This is easy to see when $r$ is equal to the rank of $A$, since in that case the left-hand side is equal to $\operatorname{tr}(AA^T)$ and the right-hand side is bounded above by $\lambda_{\max}(AA^T) \le \operatorname{tr}(AA^T)$. When $r$ is less than the rank of $A$, I believe the same argument will work, this time restricting $AA^T$ to the subspace spanned by the $A\mathbf{v}_i$.
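Both the counterexample and the weaker bound proposed above can be checked numerically; the NumPy sketch below uses $\epsilon = 0.1$, an arbitrary assumed value.

```python
import numpy as np

# Counterexample: A = diag(1, eps), V = A^{-1}, so A v_i = e_i is orthonormal.
eps = 0.1  # assumed value; any eps < 1 works
A = np.diag([1.0, eps])
V = np.linalg.inv(A)  # columns are v_1 = (1, 0), v_2 = (0, 1/eps)
r = 2

lhs = sum(np.linalg.norm(A.T @ A @ V[:, i]) ** 2 for i in range(r))  # 1 + eps^2

c = np.array([1.0, 0.0])  # c_1 = 1, c_2 = 0
w = V @ c                 # w = sum_i c_i v_i = v_1
rhs_conjectured = r / np.linalg.norm(w) ** 2    # = 2
rhs_believed = 1.0 / np.linalg.norm(w) ** 2     # = 1

print(lhs < rhs_conjectured)  # True: the conjectured inequality fails
print(lhs >= rhs_believed)    # True: the weaker bound holds here
```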
Thanks for your counterexample! What do you think about the scenario where the columns of $A$ are $\ell_2$-normalized?
– user580055
Jul 28 at 8:54
@user580055 It doesn't actually change anything. Try any matrix with unequal eigenvalues.
– user7530
Jul 28 at 16:10
answered Jul 28 at 8:00
user7530