What does it mean that “the central limit theorem does not hold far away from the peak”?

So I know nothing about large deviations theory, and I'm reading some notes. They claim that:




The CLT does not hold far away from the peak




I am not sure how to parse this statement. There are many statements of the CLT, but here is the one I know:




Let $X_n$ be a sequence of i.i.d. random variables with mean $0$ and variance $\sigma^2<\infty$. Then the following sum:



$$\frac{1}{\sqrt{n}}\sum_{k=1}^n X_k$$



converges in distribution as $n\to\infty$ to $N(0,\sigma^2)$.




Why do the notes say that the central limit theorem doesn't hold away from $0$? There's nothing in the central limit theorem that says "only for some interval around $0$". Does it just mean that the convergence rate is very slow and impractical?
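For concreteness, here is a minimal simulation sketch of the statement above (the uniform choice of $X_k$ and the sample sizes are illustrative assumptions, not from the notes):

```python
# Sketch: check the CLT empirically near the peak.
# Assumed setup: X_k ~ Uniform(-1/2, 1/2), so mean 0 and sigma^2 = 1/12.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, trials = 100, 100_000
sigma = np.sqrt(1 / 12)

# Y = (1/sqrt(n)) * sum_{k=1}^n X_k, simulated `trials` times
samples = rng.uniform(-0.5, 0.5, size=(trials, n)).sum(axis=1) / np.sqrt(n)

# Near 0 the empirical CDF matches N(0, sigma^2) closely
for x in (0.1, 0.3, 0.5):
    empirical = (samples <= x).mean()
    print(f"P(Y <= {x}): empirical {empirical:.4f}, "
          f"normal {stats.norm.cdf(x, scale=sigma):.4f}")
```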







asked Jul 25 at 15:37 by user223391
  • It seems the point that they're trying to make is that while the unconditional distribution (or perhaps even conditional on events with probability close to $1$) of the weighted sum is "close" to normal, the distribution conditional on low-probability events is not.
    – Theoretical Economist
    Jul 25 at 15:58














1 Answer
Well, essentially it means that if you are far away from the mean (which in this case is $0$), the approximation of the sum random variable (call it $Y$) by the normal distribution becomes really bad.



If you want to understand this, just take a simple example. Let $n=10$ and say you have uniform random variables on $[0,1]$, with $Y$ their sum. If you calculate something like $P(10000<Y<1000000)$ twice, once with the real distribution of $Y$ and once with the normal approximation, you will see that the difference is huge: the true probability is exactly $0$, since $Y$ can never exceed $10$, while the normal approximation gives a strictly positive answer. Does that make sense?
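Here is a quick numerical check of this example (a sketch: I take $Y$ to be the plain sum of the ten uniforms, and since the interval $(10000,1000000)$ underflows in floating point I use the threshold $11$, which is just as impossible for $Y$):

```python
# Sketch of the example above.  Y = X_1 + ... + X_10 with X_k ~ Uniform(0,1),
# so Y always lies in [0, 10] and any tail probability beyond 10 is 0.
import numpy as np
from scipy import stats

n = 10
mean, sd = n * 0.5, np.sqrt(n / 12)  # sum of n uniforms: mean n/2, variance n/12

true_tail = 0.0                       # P(Y > 11) = 0 exactly, since Y <= 10
approx_tail = stats.norm.sf(11, loc=mean, scale=sd)  # normal approximation

print(true_tail, approx_tail)  # 0.0 vs roughly 2e-11
```

In relative terms the error is infinite: the normal curve puts positive mass where $Y$ has no mass at all.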






answered Jul 25 at 16:06 by Vinayak Suresh (accepted)
  • Sure, in the first case you get $0$ and in the second you get "not $0$". But if you let $n$ get big enough, the approximation gets a lot better, right? It's just a bad approximation for lower $n$. Is that right?
    – user223391
    Jul 25 at 16:11










  • Yes, that is exactly right. As $n$ gets larger, your approximation gets better and better.
    – Vinayak Suresh
    Jul 25 at 16:20
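A sketch supporting this claim: for a fixed point $x$, the gap between the exact CDF of the normalized sum and the normal CDF shrinks as $n$ grows. Here I assume centered exponential summands, $X_k = E_k - 1$ with $E_k \sim \mathrm{Exp}(1)$, because the partial sum then has an exact Gamma distribution:

```python
# Sketch: the CLT error at a fixed x shrinks as n grows.
# Assumed setup: X_k = E_k - 1 with E_k ~ Exponential(1); then
# sum_{k<=n} E_k ~ Gamma(n, 1) exactly, so the exact CDF is available.
import numpy as np
from scipy import stats

x = 2.0
for n in (10, 100, 1_000, 10_000):
    # P(sum X_k / sqrt(n) <= x) = P(Gamma(n, 1) <= n + x*sqrt(n))
    exact = stats.gamma.cdf(n + x * np.sqrt(n), a=n)
    error = abs(exact - stats.norm.cdf(x))
    print(f"n = {n:>6}: |exact - normal| = {error:.5f}")
```

The improvement at a fixed $x$ is of order $1/\sqrt{n}$ (Berry–Esseen); what large deviations theory adds is that for thresholds growing with $n$, the relative error need not shrink at all.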









