Optimization and splitting the problem by dependent/independent variables
I have the following nonlinear function:
$$f(a,b,c,d)$$
and measurements
$$f_{\text{measured}}^{\,i}$$
for $i = 1, 2, 3, \dots, N$.
The problem is defined as the minimization
$$\min_{a,b,c,d}\left(\sum_{i=1}^{N} \left(f_{\text{measured}}^{\,i} - f(a,b,c,d)\right)^2\right) \tag{1}$$
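For concreteness, here is a minimal sketch of problem $(1)$ in Python. It assumes a made-up model $f(a,b,c,d)$ evaluated at known sample points $x_i$, plus synthetic noisy measurements; the model form, the sample points, and the noise level are illustrative assumptions, not part of the question.

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative stand-in for f(a, b, c, d); the question does not specify f,
# so this exponential-plus-linear form (with known inputs x_i) is made up.
def f(x, a, b, c, d):
    return a * np.exp(-b * x) + c * x + d

# Synthetic "measurements" f_measured^i at sample points x_i.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 50)
true_params = (2.0, 1.3, 0.5, -1.0)
f_measured = f(x, *true_params) + 0.01 * rng.standard_normal(x.size)

# Residuals of problem (1): minimize over all four variables a, b, c, d.
def residuals(p):
    a, b, c, d = p
    return f_measured - f(x, a, b, c, d)

sol = least_squares(residuals, x0=[1.0, 1.0, 0.0, 0.0])
print(sol.x)  # estimates of (a, b, c, d)
```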
Further, I know that I can obtain specific measurements for which the following holds:
$$a = f(b,c,d)$$
and
$$b = f(c,d),$$
therefore splitting the variables into the independent $c,d$ and the dependent $a,b$. The optimization problem can then be reformulated as
$$\min_{c,d}\left(\sum_{i=1}^{N} \left(f_{\text{measured}}^{\,i} - f(a,b,c,d)\right)^2\right) \tag{2}$$
with
$a = f(b,c,d)$ and $b = f(c,d)$.
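To make the substitution explicit (writing $a(c,d)$ and $b(c,d)$ for the composed dependencies, which is my own notation), $(2)$ is $(1)$ restricted to the set where the two relations hold, and its gradient picks up chain-rule terms:
$$F(c,d) = \sum_{i=1}^{N}\Big(f_{\text{measured}}^{\,i} - f\big(a(c,d),\,b(c,d),\,c,\,d\big)\Big)^2,$$
$$\frac{\partial F}{\partial c} = -2\sum_{i=1}^{N} r_i\left(\frac{\partial f}{\partial a}\,\frac{\partial a}{\partial c} + \frac{\partial f}{\partial b}\,\frac{\partial b}{\partial c} + \frac{\partial f}{\partial c}\right), \qquad r_i = f_{\text{measured}}^{\,i} - f\big(a(c,d),\,b(c,d),\,c,\,d\big),$$
and analogously for $\partial F/\partial d$.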
My understanding:
By providing extra information (measurements) and distinguishing the independent variables $c,d$ from the dependent variables $a,b$, the error surface of $(1)$ is simplified: there is no need to search it along the $a,b$ dimensions.
Since the problem is nonlinear, I will use some gradient-descent algorithm.
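As a concrete sketch of that plan, here is plain gradient descent on the reduced two-variable objective with forward-difference gradients. The model `f`, the relation functions `a_of` and `b_of`, the synthetic data, the step size, and the iteration count are all hypothetical placeholders introduced for illustration; none of them come from the question.

```python
import numpy as np

# Illustrative model standing in for f(a, b, c, d); not from the question.
def f(x, a, b, c, d):
    return a * np.exp(-b * x) + c * x + d

# Hypothetical placeholders for the measured relations b = f(c, d) and
# a = f(b, c, d); in practice these would come from the extra measurements.
def b_of(c, d):
    return 1.0 + 0.1 * c - 0.2 * d

def a_of(b, c, d):
    return 2.0 * b - c + 0.5 * d

# Synthetic measurements at known sample points x_i (illustrative).
rng = np.random.default_rng(1)
x = np.linspace(0.0, 5.0, 50)
c_true, d_true = 0.5, -1.0
b_true = b_of(c_true, d_true)
a_true = a_of(b_true, c_true, d_true)
f_measured = f(x, a_true, b_true, c_true, d_true) + 0.01 * rng.standard_normal(x.size)

# Reduced objective (2): only c and d are free; a and b are substituted.
def objective(p):
    c, d = p
    b = b_of(c, d)
    a = a_of(b, c, d)
    return np.sum((f_measured - f(x, a, b, c, d)) ** 2)

# Forward-difference gradient of the reduced objective.
def grad(p, eps=1e-6):
    g = np.zeros_like(p)
    f0 = objective(p)
    for k in range(p.size):
        q = p.copy()
        q[k] += eps
        g[k] = (objective(q) - f0) / eps
    return g

p = np.array([0.0, 0.0])   # initial guess for (c, d)
step = 1e-3                # fixed step size; would need tuning per problem
for _ in range(5000):
    p = p - step * grad(p)

print(p)  # lands near (c_true, d_true) for this toy setup
```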
QUESTION
By providing more information about the problem (more measurements), so that the variables are split into independent $c,d$ and dependent $a,b$, is $(2)$ still a valid optimization problem?
In other words, is it possible to limit the search dimensions of the error surface? Does limiting the search dimensions (through the explicit relationship between the independent and dependent variables) introduce local optima into the error surface (e.g. a saddle point becoming a valley)?
P.S.
The error surface is smooth and locally convex around the global minimum, and second-order derivatives are available (e.g. for a Hessian).
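For reference, since second derivatives are available, a damped Newton step on the reduced objective $F(c,d)$ (the notation introduced above) would be
$$\begin{pmatrix} c_{k+1}\\ d_{k+1}\end{pmatrix} = \begin{pmatrix} c_{k}\\ d_{k}\end{pmatrix} - \alpha_k\, \nabla^2 F(c_k,d_k)^{-1}\, \nabla F(c_k,d_k), \qquad 0 < \alpha_k \le 1,$$
which typically converges quadratically once inside the locally convex region around the minimum (with $\alpha_k = 1$ there).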
nonlinear-optimization gradient-descent error-function
asked Jul 30 at 3:25
Martin G