Chernoff bound for $\mathbb{P}\left\{ \big| X - \mathbb{E}(X) \big| > t \right\}$
How can I apply the Chernoff bound to upper-bound the following probability:
$$
\mathbb{P}\left\{ \big| X - \mathbb{E}(X) \big| > t \right\}
$$
where $X$ follows the distribution given by
\begin{equation*}
P(X=0)=0,
\end{equation*}
\begin{equation*}
P(X=m)=\binom{n-1}{m-1}\, p^{m-1}(1-p)^{n-m} \quad \text{for } m=1,\dots,n?
\end{equation*}
This is not exactly a binomial distribution, but it is very similar. Here $\mathbb{E}(X)$ is the expected value of $X$ and $t > 0$.
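For reference, the pmf above seems to be that of a binomial random variable shifted by one (the comments below point in the same direction): if $Y\sim\operatorname{Binom}(n-1,p)$, then
$$
P(Y+1=m)=P(Y=m-1)=\binom{n-1}{m-1}\,p^{m-1}(1-p)^{n-m}, \qquad m=1,\dots,n,
$$
which matches the expression above, so $X$ has the same distribution as $Y+1$ and $\mathbb{E}(X)=(n-1)p+1$.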
Thank you for your help.
probability sequences-and-series inequality binomial-distribution upper-lower-bounds
asked Jul 24 at 19:42 (edited Jul 24 at 19:46)
Mounia Hamidouche
Oh yes, sorry. I will fix it.
– Mounia Hamidouche
Jul 24 at 19:45
Is this $1$ more than a binomial random variable with parameters $n-1$ and $p$?
– Henry
Jul 24 at 19:49
This is a shift by one to avoid the possibility that the random variable $X$ takes the value $0$.
– Mounia Hamidouche
Jul 24 at 19:52
1 Answer
$X=Y+1$ where $Y\sim \operatorname{Binom}(n-1,p)$. So if $Y_1, Y_2, \ldots, Y_{n-1}$ are i.i.d. random variables with $Y_i\sim \operatorname{Bernoulli}(p)$, then $Y=\sum\limits_{i=1}^{n-1} Y_i$.
Apply the Chernoff bound to $Y$.
The same estimate holds for $X$, since $X-E(X)\sim Y-E(Y)$.
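To spell out the last step (a sketch; the exact form of the Chernoff bound one applies to $Y$ may vary): since $X=Y+1$, we also have $\mathbb{E}(X)=\mathbb{E}(Y)+1$, so
$$
X-\mathbb{E}(X) = (Y+1)-(\mathbb{E}(Y)+1) = Y-\mathbb{E}(Y),
$$
and therefore $\mathbb{P}\{|X-\mathbb{E}(X)|>t\} = \mathbb{P}\{|Y-\mathbb{E}(Y)|>t\}$ for every $t>0$. Applying, for instance, the Chernoff–Hoeffding bound to the sum $Y=\sum_{i=1}^{n-1}Y_i$ of independent variables bounded in $[0,1]$ gives
$$
\mathbb{P}\big\{|Y-\mathbb{E}(Y)|>t\big\} \le 2\exp\!\left(-\frac{2t^2}{n-1}\right),
$$
and the same bound then holds for $X$.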
answered Jul 24 at 19:58
A. Pongrácz
Thank you. Could you please explain why the estimate will hold for $X$ and $Y$? I cannot see it.
– Mounia Hamidouche
Jul 24 at 20:58
What do you mean by two cases?
– A. Pongrácz
Jul 24 at 20:59
I edited the question.
– Mounia Hamidouche
Jul 24 at 21:07
The Chernoff bound applies specifically to sums of bounded, independent variables, so you can apply it directly to $Y$. Once you do, you obtain an estimate for the probability $P(|Y-E(Y)|>t)$. In my post, I explained that the same estimate holds for $P(|X-E(X)|>t)$. So you don't want to apply the Chernoff bound to $X$, as that is not possible directly; apply it to $Y$.
– A. Pongrácz
Jul 24 at 21:12
Thanks; my concern is whether saying that $X - \mathbb{E}(X) \sim Y - \mathbb{E}(Y)$ is rigorous enough.
– Mounia Hamidouche
Jul 24 at 21:18