Applying Ascoli's theorem to the space of continuous cumulative distribution functions

I want to show that the space of continuous, strictly increasing cumulative distribution functions is relatively compact. However, I am not sure whether the following argument works.



First, I consider the space of continuous, strictly increasing cumulative distribution functions on a compact interval $E_1$. By Ascoli's theorem, this space is relatively compact. Now I apply Ascoli's theorem again on a compact interval $E_2 \supset E_1$, and so on, with $E_n \supset E_{n-1}$.



The space of strictly increasing, continuous cumulative distribution functions on $E_n$ is relatively compact for every $n$. Hence this space is relatively compact on $\mathbb{R}$.



Can I make such an argument?
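Spelled out, the argument would presumably run as follows. Given a sequence $(F_k)$ in this space, Ascoli's theorem on $E_1$ extracts a subsequence converging uniformly on $E_1$; applying it again on $E_2$ to that subsequence, and so on, and finally passing to a diagonal subsequence, would give
$$ F_{k_j} \longrightarrow G \quad \text{uniformly on each compact set } E_n, $$
i.e. convergence in the topology of locally uniform convergence. Note that each application of Ascoli's theorem needs the family to be equicontinuous on $E_n$ (uniform boundedness is automatic, since $0 \le F \le 1$).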







asked Jul 28 at 4:21 by user1292919, edited Jul 28 at 9:00 by Bernard




















2 Answers






























No. First, Arzelà's theorem requires compactness of the domain; localizing as you suggest only yields uniform convergence on compact subsets, which is not really what you want. More to the point, the local result fails too, because equicontinuity fails under these assumptions. Consider $f_n(x)=x^n$ on $[0,1]$, extended to be $0$ to the left and $1$ to the right.
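To make the failure of equicontinuity concrete (a standard computation for this example): the family $(f_n)$ is not equicontinuous at $x=1$, since for any fixed $\delta \in (0,1)$,
$$ |f_n(1) - f_n(1-\delta)| = 1 - (1-\delta)^n \longrightarrow 1 \quad (n \to \infty), $$
so no single $\delta$ works for all $n$. Equivalently, the pointwise limit is $0$ for $x<1$ and $1$ for $x \ge 1$, a discontinuous function, so no subsequence can converge uniformly on any neighbourhood of $1$.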






answered Jul 28 at 5:35 by Ian

Let $F_n$ be the distribution function of a normal random variable with mean $0$ and variance $n$. Then $(F_n)$ has no convergent subsequence.
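To see why (a short sketch): writing $\Phi$ for the standard normal distribution function, $F_n(x) = \Phi(x/\sqrt{n})$, so for every fixed $x$,
$$ F_n(x) = \Phi\!\left(\frac{x}{\sqrt{n}}\right) \longrightarrow \Phi(0) = \tfrac{1}{2} \quad (n \to \infty). $$
Any subsequence that converged uniformly on compact sets would therefore have the constant function $\tfrac12$ as its limit, which is not a distribution function; so the family of continuous, strictly increasing distribution functions is not relatively compact.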






answered Jul 28 at 12:20 by Kavi Rama Murthy