Can we approximate a convex function by twice differentiable convex functions?

Suppose $f(x)$ is convex over some compact set $S \subset \mathbb{R}^n$. Can we construct a sequence $f_n$ of twice differentiable convex functions such that:



  1. $f_n(x) \rightarrow f(x)$ for all $x \in S$.

  2. $f_n'(x) \rightarrow f'(x)$ for all $x \in S$ where $f'(x)$ exists.

  3. $f_n''(x) \rightarrow f''(x)$ for all $x \in S$ where $f''(x)$ exists.

Additionally, it would be helpful if the convergence were uniform.



Motivation: I am reading a textbook on convex optimization, and there is a variety of inequalities on convex functions proved under various assumptions (merely convex, convex and differentiable, twice differentiable, etc.). I'm wondering whether one can simply assume everything is twice differentiable, derive the needed results, and then argue that they hold for non-twice-differentiable functions by a limiting argument.
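For intuition, one standard construction is mollification: convolving $f$ with a smooth, nonnegative kernel of unit mass and shrinking width yields smooth convex approximants that converge to $f$ pointwise on compact sets. The snippet below is a minimal numerical sketch of that idea; the Gaussian kernel of width $1/n$, the test function $f(x) = |x|$, and the helper name "mollify" are illustrative choices, not anything taken from the question or the answers.

    import numpy as np

    def mollify(f, xs, n, half_width=5.0, m=2001):
        """Smooth f by convolving it with a Gaussian of width 1/n.

        Convolution with a nonnegative kernel of unit mass preserves convexity,
        and as n grows the smoothed function approaches f wherever f is continuous.
        """
        h = 1.0 / n
        t = np.linspace(-half_width * h, half_width * h, m)  # quadrature grid for the convolution
        dt = t[1] - t[0]
        kernel = np.exp(-(t / h) ** 2 / 2.0)
        kernel /= kernel.sum() * dt                          # normalize the kernel to unit mass
        return np.array([np.sum(f(x - t) * kernel) * dt for x in np.atleast_1d(xs)])

    f = np.abs                                   # the standard kink example f(x) = |x|
    xs = np.array([-0.5, -0.1, 0.1, 0.5])
    for n in (5, 50, 500):
        fn = mollify(f, xs, n)
        print(n, float(np.max(np.abs(fn - f(xs)))))  # max error at these points shrinks as n grows

Away from the kink at $x = 0$ the first derivatives converge as well; whether the second derivatives can be made to converge uniformly is what the answers below discuss.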
























  • In optimization you often need more than pointwise convergence of functions to show that maximisers converge. Usually uniform convergence of the functions is needed.
    – Marc
    Jul 31 at 13:43










  • This might be possible in some cases, but certainly not universally. I don't see you proving the global convergence of subgradient methods, for instance, using this technique, because your proof will completely fail to address situations where the algorithm starts at a point of nondifferentiability.
    – Michael Grant
    Jul 31 at 13:50










  • @Marc -- good point, it would be helpful for these to converge uniformly.
    – Michael S.
    Jul 31 at 14:04














edited Jul 31 at 14:05
asked Jul 31 at 13:40 by Michael S.
2 Answers

















Uniform convergence will not be possible, I think. Roughly speaking, suppose that $f'(x)$ has a discontinuity, but $f''(x)$ is bounded at all points except this discontinuity. The functions $f_n'(x)$ will have to be "nearly discontinuous", converging to one value above the discontinuity and another below it. In other words, the functions $f'_n(x)$ will have to change rapidly in a small neighborhood of the discontinuity. But this will lead to $f''_n(x)$ being quite large in that same neighborhood, and so $f''_n(x)$ will "have trouble" converging uniformly to a finite function $f''(x)$.

Here's my attempt at a proof. It's probably not as rigorous as it should be, and I welcome any critiques.

Consider the function $f(x) = |x|$ from $[-1,1] \to [0,1]$. Suppose that a family of functions $f_n(x)$ exists such that $f'_n(x)$ and $f''_n(x)$ converge uniformly to $f'(x)$ and $f''(x)$ where they are defined (i.e., on $[-1,0) \cup (0,1]$).

Fix two positive numbers $0 < a < 1$ and $0 < \epsilon < 1/(a+1)$. We have that $f''(x) = 0$ where it exists. Uniform convergence then implies that there exists an $N$ such that for $n > N$, $|f_n''(x)| < \epsilon$ for $x \neq 0$. This then implies that
$$
|f'_n(a) - f'_n(-a)| = \left| \int_{-a}^{a} f''_n(x)\, dx \right| \leq \int_{-a}^{a} |f''_n(x)|\, dx < 2 a \epsilon
$$
for all $n > N$.

But if $f'_n(x)$ is to converge uniformly to $f'(x)$ on $[-1,0) \cup (0,1]$, then for any $\epsilon$ there must also exist an $M$ such that for all $n > M$, we have $|f'_n(x) - f'(x)| < \epsilon$. For any $a > 0$ we have $f'(a) = 1$ and $f'(-a) = -1$. This means, in particular, that for $n > M$ we must have $|f'_n(a) - 1| < \epsilon$ and $|f'_n(-a) + 1| < \epsilon$; together, these imply that
$$
|f'_n(a) - f'_n(-a)| > 2 - 2\epsilon.
$$

Thus, for any $n > \max(N,M)$, we must have
$$
2 a \epsilon > |f'_n(a) - f'_n(-a)| > 2 - 2\epsilon,
$$
which implies that $\epsilon > 1/(a+1)$. But $\epsilon$ was chosen such that $\epsilon < 1/(a+1)$, and so we have a contradiction.
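For a concrete instance of this behaviour, consider smoothing $f(x) = |x|$ by convolution with a Gaussian of standard deviation $1/n$: the smoothed functions have the closed form $f_n(x) = \tfrac{1}{n}\sqrt{2/\pi}\, e^{-n^2 x^2/2} + x \operatorname{erf}(nx/\sqrt{2})$, so $f_n'(x) = \operatorname{erf}(nx/\sqrt{2})$ and $f_n''(0) = \sqrt{2/\pi}\, n \to \infty$. The short script below is a rough numerical check of this; the choice of Gaussian smoothing and the finite-difference step size are assumptions of the sketch, not part of the argument above.

    import math

    def f_n(x, n):
        """Closed form of |x| convolved with a Gaussian density of standard deviation 1/n."""
        s = 1.0 / n
        return s * math.sqrt(2.0 / math.pi) * math.exp(-x * x / (2.0 * s * s)) \
            + x * math.erf(x / (s * math.sqrt(2.0)))

    def second_derivative(g, x, n, h=1e-4):
        # central finite difference for g''(x)
        return (g(x + h, n) - 2.0 * g(x, n) + g(x - h, n)) / (h * h)

    for n in (10, 100, 1000):
        # the printed value grows like sqrt(2/pi) * n, so sup |f_n''| cannot stay bounded
        print(n, second_derivative(f_n, 0.0, n))

So for this particular approximating family the second derivatives blow up near the kink, which is exactly the obstruction the argument above formalizes.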







edited Jul 31 at 19:38
answered Jul 31 at 18:36 by Michael Seifert




















Surely we can! Let $$f_n(x) = f(x) - \dfrac{x+1}{n}$$ for example.







answered Jul 31 at 19:43 by Mostafa Ayaz






















                       
