Sums of realizations of dependent variables become independent

I have empirically noticed an interesting phenomenon. Suppose we have two continuous random variables $X$ and $Y$ which are dependent but not correlated. For instance:



$X \sim \mathcal{N}(0,1)$



$Y = \cos(X) + Z$, where $Z \sim \mathcal{N}(0, 0.5)$



Here you can see some samples generated for these variables. I use this implementation of distance correlation as a measure of dependence:





Now consider two additional random variables $X_{sum}$ and $Y_{sum}$, generated as the sum of $n$ realizations of the original variables, respectively. The thing I have noticed is that $X_{sum}$ and $Y_{sum}$ become more and more independent as $n$ grows (see the figure below):



[figure: scatter plots of $X_{sum}$ vs. $Y_{sum}$ and their distance correlation for increasing $n$]
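
A minimal sketch of the kind of simulation described above might look like this (it assumes the Python `dcor` package for the distance correlation and reads the $0.5$ in $\mathcal{N}(0, 0.5)$ as a variance; the helper names are only illustrative):

    import numpy as np
    import dcor  # distance correlation; any other implementation would do

    rng = np.random.default_rng(0)

    def sample_pairs(shape):
        """Draw dependent but uncorrelated pairs: X ~ N(0,1), Y = cos(X) + Z."""
        x = rng.standard_normal(shape)
        z = rng.normal(0.0, np.sqrt(0.5), shape)  # assuming 0.5 is the variance of Z
        return x, np.cos(x) + z

    n_trials = 2000  # realizations of (X_sum, Y_sum) for each value of n
    for n in (1, 2, 5, 10, 50, 100):
        x, y = sample_pairs((n_trials, n))
        x_sum, y_sum = x.sum(axis=1), y.sum(axis=1)  # sums of n realizations
        print(f"n = {n:3d}   dCor(X_sum, Y_sum) = {dcor.distance_correlation(x_sum, y_sum):.3f}")

With this setup the printed distance correlation should drift toward zero as $n$ grows, matching what the figure shows.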



I know that by the Central Limit Theorem $X_{sum}$ and $Y_{sum}$ will be approximately normal, but why are they becoming independent? Is there a general explanation for this? Also, I have noticed that this phenomenon only occurs if $X$ and $Y$ are not correlated.



Thanks in advance.







asked Jul 25 at 22:10 by Daniel López

1 Answer






The multidimensional CLT implies that
$$\sqrt{n} \begin{bmatrix} \frac{1}{n} X_{sum} - \mathbb{E} X \\ \frac{1}{n} Y_{sum} - \mathbb{E} Y \end{bmatrix}$$
is approximately bivariate normal with mean zero and covariance matrix
$$\begin{bmatrix}\text{Var}(X) & \text{Cov}(X,Y) \\ \text{Cov}(X,Y) & \text{Var}(Y)\end{bmatrix}.$$
Since your $X$ and $Y$ are uncorrelated, we have $\text{Cov}(X_{sum}/\sqrt{n}, Y_{sum}/\sqrt{n}) \approx 0$.



I am not sure how to quantify what happens when you scale up by $\sqrt{n}$ to consider $\text{Cov}(X_{sum}, Y_{sum})$; you may need a Berry–Esseen type of non-asymptotic result.






answered Jul 25 at 22:48 by angryavian
• Thanks for your answer. That might explain the near-zero correlation, but I am more interested in the increasingly lower dependence. – Daniel López, Jul 25 at 23:08






• But now that I think about it, if two variables follow a bivariate normal with zero covariance, they are by definition independent, right? – Daniel López, Jul 25 at 23:20
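
For reference, the two comments above can be made precise without any asymptotics: by bilinearity of covariance, for i.i.d. pairs $(X_i, Y_i)$,
$$\text{Cov}(X_{sum}, Y_{sum}) = \text{Cov}\Big(\sum_{i=1}^{n} X_i,\ \sum_{j=1}^{n} Y_j\Big) = \sum_{i=1}^{n} \text{Cov}(X_i, Y_i) = n\,\text{Cov}(X, Y) = 0,$$
so the sums are exactly uncorrelated for every $n$; and since a bivariate normal vector with zero covariance has independent components, the CLT heuristic is that any residual dependence between the standardized sums must vanish as $n \to \infty$.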









