Chernoff bound $\mathbb{P} \left\lbrace \Big| X - \mathbb{E}(X) \Big| > t \right\rbrace$

How do I apply the Chernoff bound to upper-bound the following probability:



$$
\mathbb{P} \left\lbrace \Big| X - \mathbb{E}(X) \Big| > t \right\rbrace
$$



where $X$ follows the distribution given by
\begin{equation*}
P(X=0)=0;
\end{equation*}
\begin{equation*}
P(X=m)=\binom{n-1}{m-1} p^{m-1}(1-p)^{n-m} \quad \text{for } m=1,\dots,n;
\end{equation*}



This is not exactly a binomial distribution, but it is very similar. Here $\mathbb{E}(X)$ denotes the expected value of $X$, and $t > 0$.
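
As a quick sanity check, the pmf above is indeed normalized: substituting $k = m-1$ and using the binomial theorem,
$$
\sum_{m=1}^{n}\binom{n-1}{m-1} p^{m-1}(1-p)^{n-m}
= \sum_{k=0}^{n-1}\binom{n-1}{k} p^{k}(1-p)^{(n-1)-k}
= \bigl(p+(1-p)\bigr)^{n-1} = 1.
$$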



Thank you for your help.







asked Jul 24 at 19:42, edited Jul 24 at 19:46
Mounia Hamidouche

  • Oh yes sorry. I will fix it.
    – Mounia Hamidouche
    Jul 24 at 19:45






  • Is this $1$ more than a binomial random variable with parameters $n-1$ and $p$?
    – Henry
    Jul 24 at 19:49











  • This is a shift by one to avoid the possibility that the random variable X takes the value 0.
    – Mounia Hamidouche
    Jul 24 at 19:52

1 Answer

$X=Y+1$, where $Y\sim \mathrm{Binom}(n-1,p)$. So if $Y_1, Y_2, \ldots$ are i.i.d. random variables with $Y_i\sim \mathrm{Bernoulli}(p)$, then $Y=\sum\limits_{i=1}^{n-1} Y_i$.
Apply the Chernoff bound to $Y$.
The same estimate holds for $X$, as $X-E(X)\sim Y-E(Y)$.
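
For concreteness, here is a minimal sketch of that transfer step, using the two-sided Hoeffding form of the Chernoff bound for a sum of $n-1$ independent $\{0,1\}$-valued variables (the exact exponent depends on which version of the bound you prefer):
$$
\mathbb{P} \left\lbrace \Big| X - \mathbb{E}(X) \Big| > t \right\rbrace
= \mathbb{P} \left\lbrace \Big| Y - \mathbb{E}(Y) \Big| > t \right\rbrace
\le 2\exp\!\left(-\frac{2t^{2}}{n-1}\right),
$$
where the first equality uses $X-\mathbb{E}(X) = Y+1-(\mathbb{E}(Y)+1) = Y-\mathbb{E}(Y)$.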






answered Jul 24 at 19:58
A. Pongrácz

  • Thank you. Could you please explain why the estimate will hold for $X$ and $Y$? I cannot see it.
    – Mounia Hamidouche
    Jul 24 at 20:58











  • What do you mean by two cases?
    – A. Pongrácz
    Jul 24 at 20:59










  • I edited the question.
    – Mounia Hamidouche
    Jul 24 at 21:07










  • The Chernoff bound specifically applies to sums of bounded, independent variables, so you can apply it directly to $Y$. Once you do, you obtain some estimate for the probability $P(|Y-E(Y)|>t)$. In my post, I explained that the same estimate holds for $P(|X-E(X)|>t)$. So you don't want to apply the Chernoff bound to $X$ directly, as that is not possible; apply it to $Y$ (a numerical sketch of this follows the comments).
    – A. Pongrácz
    Jul 24 at 21:12










  • Thanks, my concern is whether saying that $X - \mathbb{E}(X) \sim Y - \mathbb{E}(Y)$ is rigorous enough.
    – Mounia Hamidouche
    Jul 24 at 21:18
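
A minimal numerical sketch of this transfer, with illustrative values of $n$, $p$, $t$ (these particular values are assumptions, not taken from the post): it computes the exact tail probability directly from the pmf in the question and compares it with the two-sided Hoeffding/Chernoff bound $2\exp(-2t^2/(n-1))$ applied to $Y$.

    import math

    # Illustrative parameters (assumed for this sketch, not from the question).
    n, p, t = 20, 0.3, 3.0

    # pmf from the question: P(X = m) = C(n-1, m-1) p^(m-1) (1-p)^(n-m), m = 1..n.
    support = range(1, n + 1)
    pmf = [math.comb(n - 1, m - 1) * p ** (m - 1) * (1 - p) ** (n - m) for m in support]

    EX = sum(m * q for m, q in zip(support, pmf))            # E(X), computed from the pmf
    exact_tail = sum(q for m, q in zip(support, pmf) if abs(m - EX) > t)
    hoeffding = 2 * math.exp(-2 * t ** 2 / (n - 1))          # bound applied to Y = X - 1

    print(f"exact P(|X - E(X)| > t)  = {exact_tail:.5f}")
    print(f"Hoeffding/Chernoff bound = {hoeffding:.5f}")
    # The exact tail never exceeds the bound, since X - E(X) = Y - E(Y).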















