Expected Proportion of Random Variables
In the case that non-negative random variables $X_i$ are i.i.d. we have $$\mathbb{E}\,\frac{X_i}{X_1+\dots+X_n} = \frac{1}{n}.$$ What can be said in the non-identical case? Specifically, if $X_i \geq 0$ are independent (but not identically distributed), can we say that
$$\mathbb{E}\,\frac{X_i}{X_1+\dots+X_n}$$ is close to
$$\frac{\mathbb{E}X_i}{\mathbb{E}X_1 + \dots + \mathbb{E}X_n}$$ in, say, absolute value (where this might depend on the variance of the $X_i$)? Note that if $X_i \sim \text{Gamma}(\alpha_i, 1)$ this holds with equality.
Tags: probability, convex-analysis, expectation, upper-lower-bounds
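Both the i.i.d. identity and the Gamma equality are easy to check numerically. Below is a minimal Monte Carlo sketch (my addition, not part of the original question; it assumes NumPy, and the shape parameters $\alpha_i$ and seed are arbitrary) estimating $\mathbb{E}\,\frac{X_i}{X_1+\dots+X_n}$ for independent $X_i \sim \text{Gamma}(\alpha_i, 1)$ and comparing against $\frac{\alpha_i}{\alpha_1+\dots+\alpha_n}$:

```python
# Hypothetical sketch: verify E[X_i / (X_1+...+X_n)] = alpha_i / sum(alpha_j)
# for independent X_i ~ Gamma(alpha_i, 1). Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
alphas = np.array([0.5, 1.0, 2.0, 4.0])
# One row per sample, one column per variable; `shape` broadcasts over columns.
X = rng.gamma(shape=alphas, scale=1.0, size=(1_000_000, len(alphas)))

proportions = X / X.sum(axis=1, keepdims=True)
print(proportions.mean(axis=0))   # Monte Carlo estimate of E[X_i / sum]
print(alphas / alphas.sum())      # exact value in the Gamma case
```

The Gamma equality holds because $(X_1/S, \dots, X_n/S)$ with $S = X_1 + \dots + X_n$ is Dirichlet$(\alpha_1, \dots, \alpha_n)$ distributed, so each coordinate has mean $\alpha_i / \sum_j \alpha_j$.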
asked Jul 31 at 17:37 by cdipaolo (edited Jul 31 at 18:14)
Are the random variables non-negative? I think even in the first case the equation doesn't hold without the non-negativity.
– callculus, Jul 31 at 18:07
@callculus Yes, sorry, I should've specified that.
– cdipaolo, Jul 31 at 18:09
2 Answers
Answer (score 3), answered Jul 31 at 18:14 by Mike, edited Jul 31 at 18:58 by cdipaolo
For $\epsilon, K > 0$ let
$$
X_1 = \begin{cases}
2 & \text{with prob. } \tfrac{1}{2},\\
2\epsilon & \text{with prob. } \tfrac{1}{2},
\end{cases}
\qquad
X_2 = \begin{cases}
K & \text{with prob. } \tfrac{1+\epsilon}{K},\\
0 & \text{otherwise.}
\end{cases}
$$
Then $\mathbb{E}X_1 = \mathbb{E}X_2 = 1+\epsilon$ but
$$
\biggl|\mathbb{E}\,\frac{X_1}{X_1 + X_2} - \frac{\mathbb{E}X_1}{\mathbb{E}X_1 + \mathbb{E}X_2}\biggr| \geq \Bigl|\tfrac{1}{2} - \tfrac{1+\epsilon}{K}\Bigr| \to \frac{1}{2}
$$
as $K \to \infty$.
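For concreteness, here is a short sketch (my addition, using only Python's standard library) that evaluates $\mathbb{E}\,\frac{X_1}{X_1+X_2}$ exactly by enumerating the four atoms of $(X_1, X_2)$, showing the gap from $\tfrac{1}{2}$ tending to $\tfrac{1}{2}$ as $K$ grows:

```python
# Hypothetical sketch: exact gap for the two-point counterexample above.
from itertools import product

def gap(eps, K):
    X1 = [(2.0, 0.5), (2.0 * eps, 0.5)]   # (value, probability) pairs
    p = (1.0 + eps) / K
    X2 = [(K, p), (0.0, 1.0 - p)]
    # E[X_1/(X_1+X_2)] by enumerating the four atoms (independence).
    lhs = sum(px * py * x / (x + y) for (x, px), (y, py) in product(X1, X2))
    return abs(lhs - 0.5)                  # E X_1 / (E X_1 + E X_2) = 1/2

for K in (10, 100, 1000, 10**6):
    print(K, gap(0.01, K))                 # tends to 1/2 as K -> infinity
```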
This is a good negative result; however, the variance of $X_2$ is at least $K-2$, which is very large. As such, this wouldn't rule out a bound based on the variance of the $X_i$.
– cdipaolo, Jul 31 at 18:39
Yes @cdipaolo, the question still left open is how close the two quantities are when the variances of the RVs are similar or the same. Unfortunately I don't know at the moment.
– Mike, Jul 31 at 18:52
@ClementC. Compute $\mathbb{E}\,\tfrac{X_1}{X_1+X_2} \geq \mathbb{P}(X_2 = 0) = 1 - \tfrac{1+\epsilon}{K}$, which implies the bound if $K \geq 2+2\epsilon$.
– cdipaolo, Jul 31 at 20:04
@cdipaolo Sorry, I figured it out before your answer (and deleted my comment to minimize noise).
– Clement C., Jul 31 at 20:18
Accepted answer (score 0), answered 2 days ago by cdipaolo
$\newcommand{\E}{\mathbb{E}}$Denote $\sigma_i = \E|X_i - \E X_i|$ and $X_{-k} = X_1 + \dots + X_{k-1} + X_{k+1} + \dots + X_n$, and assume $X_i > 0$ almost surely. We can prove the following Efron–Stein-looking inequality, with nicer bounds if we are guaranteed $X_i \geq m > 0$ almost surely:
$$
\biggl|\E\biggl[\frac{X_i}{X_1+\dots+X_n}\biggr] - \frac{\E X_i}{\E X_1 + \dots + \E X_n}\biggr| \leq \sum_{j=1}^n \E\frac{\sigma_j}{X_{-j}} \leq \frac{1}{(n-1)m}\sum_{j=1}^n \sigma_j,
$$
and a somewhat tighter $\ell^1$ bound for the corresponding simplex vector:
$$
\biggl\|\E\biggl[\frac{X_i}{X_1+\dots+X_n}\biggr] - \frac{\E X_i}{\E X_1 + \dots + \E X_n}\biggr\|_1 \leq 2\sum_{j=1}^n \E\frac{\sigma_j}{X_{-j}} \leq \frac{2}{(n-1)m}\sum_{j=1}^n \sigma_j.
$$
Hopefully someone can express this bound in terms of $\E X_j$ instead of the expected reciprocal, so I won't accept this for a while.
Proof. Compute
$$
\begin{align*}
\E\biggl[\frac{X_i}{X_1+\dots+X_n}\biggr] - \frac{\E X_i}{\E X_1 + \dots + \E X_n} &= \E\biggl[\frac{X_i(\E X_1+\dots+\E X_n) - (X_1+\dots+X_n)\E X_i}{(X_1+\dots+X_n)(\E X_1 + \dots + \E X_n)}\biggr]\\
&= \E\biggl[\frac{\sum_{j\neq i}(X_i\E X_j - X_j\E X_i)}{(X_1+\dots+X_n)(\E X_1 + \dots + \E X_n)}\biggr]\\
&= \frac{1}{\E X_1 + \dots + \E X_n}\sum_{j\neq i} \E\biggl[\frac{X_i\E X_j - X_j\E X_i}{X_1 + \dots + X_n}\biggr].
\end{align*}
$$
Using Jensen and the identity $X_i\E X_j - X_j\E X_i = (X_i - \E X_i)\E X_j - (X_j - \E X_j)\E X_i$, we can bound
$$
\begin{align*}
\biggl|\E\frac{X_i\E X_j - X_j\E X_i}{X_1 + \dots + X_n}\biggr| &\leq \E\frac{|X_i\E X_j - X_j\E X_i|}{X_1 + \dots + X_n}\\
&\leq \E\frac{|X_i - \E X_i|\,\E X_j + |X_j - \E X_j|\,\E X_i}{X_1 + \dots + X_n},
\end{align*}
$$
and, by non-negativity and independence,
$$
\begin{align*}
\E\frac{|X_i - \E X_i|\,\E X_j}{X_1 + \dots + X_n} &\leq \E\frac{|X_i - \E X_i|\,\E X_j}{X_1 + \dots + X_{i-1} + X_{i+1} + \dots + X_n}\\
&= \E|X_i - \E X_i|\cdot\E\frac{\E X_j}{X_1 + \dots + X_{i-1} + X_{i+1} + \dots + X_n}\\
&= \E\frac{\sigma_i\,\E X_j}{X_1 + \dots + X_{i-1} + X_{i+1} + \dots + X_n}.
\end{align*}
$$
Hence
$$
\biggl|\E\biggl[\frac{X_i}{X_1+\dots+X_n}\biggr] - \frac{\E X_i}{\E X_1 + \dots + \E X_n}\biggr| \leq \E\frac{\sigma_i}{X_{-i}} + \frac{\E X_i}{\E X_1 + \dots + \E X_n}\sum_{j\neq i} \E\frac{\sigma_j}{X_{-j}} \leq \sum_{j=1}^n \E\frac{\sigma_j}{X_{-j}}.
$$
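As a numerical sanity check of the displayed bound, here is a Monte Carlo sketch (my addition, assuming NumPy; the lower bound $m$ and shape parameters are illustrative) for variables $X_j = m + \text{Gamma}(\alpha_j, 1)$, which satisfy $X_j \geq m > 0$ almost surely:

```python
# Hypothetical sketch: check |E[X_i/S] - E X_i / E S| <= sum_j E[sigma_j / X_{-j}]
#                                                     <= sum_j sigma_j / ((n-1) m).
import numpy as np

rng = np.random.default_rng(1)
m, alphas = 0.5, np.array([0.5, 1.0, 2.0, 4.0])
n = len(alphas)
X = m + rng.gamma(shape=alphas, scale=1.0, size=(500_000, n))  # X_j >= m a.s.

S = X.sum(axis=1, keepdims=True)
lhs = np.abs((X / S).mean(axis=0) - X.mean(axis=0) / S.mean())
sigma = np.abs(X - X.mean(axis=0)).mean(axis=0)  # sigma_j = E|X_j - E X_j|
middle = (sigma / (S - X)).mean(axis=0).sum()    # sum_j E[sigma_j / X_{-j}]
right = sigma.sum() / ((n - 1) * m)
print(lhs.max() <= middle <= right)              # expected: True (for every i)
```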