Can we approximate a convex function by twice differentiable convex function?
Suppose $f(x)$ is convex over some compact set $S \subset \mathbb{R}^n$. Can we construct a sequence $f_n$ of twice differentiable convex functions such that:
- $f_n(x) \rightarrow f(x)$ for all $x \in S$;
- $f_n'(x) \rightarrow f'(x)$ for all $x \in S$ where $f'(x)$ exists;
- $f_n''(x) \rightarrow f''(x)$ for all $x \in S$ where $f''(x)$ exists?
Additionally, it would be helpful if the convergence were uniform.
Motivation: I am reading a textbook on convex optimization, and it proves a variety of inequalities on convex functions under various assumptions (simply convex, convex and differentiable, twice differentiable, etc.). I'm wondering if one can just assume everything is twice differentiable, derive the needed results, and argue that they hold for non-twice-differentiable functions by a limiting argument.
real-analysis convex-analysis
asked Jul 31 at 13:40 by Michael S. (edited Jul 31 at 14:05)
In optimization you often need more than pointwise convergence of functions to show that maximisers converge; usually uniform convergence of the functions is needed.
– Marc, Jul 31 at 13:43
This might be possible in some cases, perhaps, but certainly not universally. I don't see you proving the global convergence of subgradient methods, for instance, using this technique, because your proof will completely fail to address situations where the algorithm starts at a point of nondifferentiability.
– Michael Grant, Jul 31 at 13:50
@Marc -- good point, it would be helpful for these to converge uniformly.
– Michael S., Jul 31 at 14:04
2 Answers
Uniform convergence will not be possible, I think. Roughly speaking, suppose that $f'(x)$ has a discontinuity, but $f''(x)$ is bounded at all points except this discontinuity. The functions $f_n'(x)$ will have to be "nearly discontinuous", converging to one value above the discontinuity and another below it. In other words, the functions $f_n'(x)$ will have to change rapidly in a small neighborhood of the discontinuity. But this will force $f_n''(x)$ to be quite large in that same neighborhood, and so $f_n''(x)$ will "have trouble" converging uniformly to a finite function $f''(x)$.
Here's my attempt at a proof. It's probably not as rigorous as it should be, and I welcome any critiques.
Consider the function $f(x) = |x|$ from $[-1,1]$ to $[0,1]$. Suppose that a family of functions $f_n(x)$ exists such that $f_n'(x)$ and $f_n''(x)$ converge uniformly to $f'(x)$ and $f''(x)$ where they are defined (i.e., on $[-1,0) \cup (0,1]$).
Fix two positive numbers $0 < a < 1$ and $0 < \epsilon < 1/(a+1)$. We have that $f''(x) = 0$ where it exists. Uniform convergence then implies that there exists an $N$ such that for $n > N$, $|f_n''(x)| < \epsilon$ for $x \neq 0$. This then implies that
$$
|f_n'(a) - f_n'(-a)| = \left| \int_{-a}^{a} f_n''(x)\, dx \right| \leq \int_{-a}^{a} |f_n''(x)|\, dx < 2 a \epsilon
$$
for all $n > N$.
But if $f_n'(x)$ is to converge uniformly to $f'(x)$ on $[-1,0) \cup (0,1]$, then for this same $\epsilon$ there must also exist an $M$ such that for all $n > M$, we have $|f_n'(x) - f'(x)| < \epsilon$. For any $a > 0$ we have $f'(a) = 1$ and $f'(-a) = -1$. This means, in particular, that for $n > M$ we must have $|f_n'(a) - 1| < \epsilon$ and $|f_n'(-a) + 1| < \epsilon$; together, these imply that
$$
|f_n'(a) - f_n'(-a)| > 2 - 2 \epsilon.
$$
Thus, for any $n > \max(N,M)$, we must have
$$
2 a \epsilon > |f_n'(a) - f_n'(-a)| > 2 - 2 \epsilon,
$$
which implies that $\epsilon > 1/(a+1)$. But $\epsilon$ was chosen such that $\epsilon < 1/(a+1)$, and so we have a contradiction.
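As a concrete illustration of this tension (an addendum, not part of the argument above): here is a minimal Python sketch using Gaussian mollification, $f_\sigma = |\cdot| * \mathcal{N}(0,\sigma^2)$, which has the closed form $f_\sigma(x) = x\,\mathrm{erf}(x/(\sigma\sqrt{2})) + \sigma\sqrt{2/\pi}\,e^{-x^2/(2\sigma^2)}$. Each $f_\sigma$ is smooth and convex, and $f_\sigma \to |x|$ uniformly as $\sigma \to 0$, but $\sup f_\sigma''$ blows up like $1/\sigma$, exactly as the heuristic predicts.

    import math

    def f_sigma(x, sigma):
        # Gaussian mollification of |x|: closed form of (|.| * N(0, sigma^2))(x).
        return x * math.erf(x / (sigma * math.sqrt(2))) \
            + sigma * math.sqrt(2 / math.pi) * math.exp(-x**2 / (2 * sigma**2))

    def f_sigma_dd(x, sigma):
        # Second derivative: twice the Gaussian density, peaking at 2/(sigma*sqrt(2*pi)) at x = 0.
        return 2.0 / (sigma * math.sqrt(2 * math.pi)) * math.exp(-x**2 / (2 * sigma**2))

    for sigma in (0.1, 0.01, 0.001):
        gap = f_sigma(0.0, sigma)      # sup |f_sigma - |x|| is attained at x = 0
        peak = f_sigma_dd(0.0, sigma)  # sup of f_sigma'' is also at x = 0
        print(f"sigma={sigma:6.3f}  sup|f_sigma - f| = {gap:.5f}  sup f_sigma'' = {peak:10.2f}")

The printout shows the uniform error $\sigma\sqrt{2/\pi}$ shrinking while the peak second derivative grows without bound, so no single sequence of this kind can make all three convergences uniform at once.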
answered Jul 31 at 18:36 by Michael Seifert (edited Jul 31 at 19:38)
Surely we can! Let $$f_n(x) = f(x) - \dfrac{x+1}{n}$$ for example.
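A quick worked check, appended here for completeness (taking $S \subseteq [-1,1]$ purely for concreteness): on any compact $S$ this convergence is indeed uniform, since
$$
\sup_{x \in S} |f_n(x) - f(x)| = \sup_{x \in S} \frac{|x+1|}{n} \leq \frac{2}{n} \rightarrow 0.
$$
Note, though, that this $f_n$ is twice differentiable exactly when $f$ itself is, so it addresses the convergence requirements rather than the smoothing.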
answered Jul 31 at 19:43 by Mostafa Ayaz