Given function $f$, find the directional derivative of $\lVert \nabla f \rVert$ in the direction given by $\nabla f$
Suppose we have a function $f(x,y)$, differentiable as many times as you like in $\mathbb{R}^2$. Its gradient is given by
$$
\nabla f (x,y) = \left(f_x, f_y \right)^T.
$$
The cosine and sine of the angle $\alpha$ of this vector are given by
$$
\left\{
\begin{array}{l}
\cos \alpha = \frac{f_x}{\lVert \nabla f \rVert} \\
\sin \alpha = \frac{f_y}{\lVert \nabla f \rVert}
\end{array}
\right.
$$
I also define $u_\alpha = (\cos \alpha, \sin \alpha)^T$. I want to compute
$$
\nabla_\alpha \left( \lVert \nabla f \rVert \right),
$$
which should be given by
$$
\nabla_\alpha \left( \lVert \nabla f \rVert \right) = \langle \nabla \left( \lVert \nabla f \rVert \right), u_\alpha \rangle = \frac{f_{xx} f_x}{\lVert \nabla f \rVert} \cdot \frac{f_x}{\lVert \nabla f \rVert} + \frac{f_{yy} f_y}{\lVert \nabla f \rVert} \cdot \frac{f_y}{\lVert \nabla f \rVert} = \\
\left(f_{xx} + f_{yy} \right) \cdot \left( \frac{f_x^2}{\lVert \nabla f \rVert^2} + \frac{f_y^2}{\lVert \nabla f \rVert^2} \right) = f_{xx} + f_{yy} = \nabla^2 f.
$$
The question is: is this derivation of the Laplacian operator rigorous?
The reason for my question is the following quote, taken from a computer vision book (the topic is edge detection):
For many applications, however, we wish to thin such a continuous gradient image to only return isolated edges, i.e., as single pixels at discrete locations along the edge contours. This can be achieved by looking for maxima in the edge strength (gradient magnitude) in a direction perpendicular to the edge orientation, i.e., along the gradient direction. Finding this maximum corresponds to taking a directional derivative of the strength field in the direction of the gradient and then looking for zero crossings. The desired directional derivative is equivalent to the dot product between a second gradient operator and the result of the first... The gradient dot product with the gradient is called the Laplacian.
Thank you.
calculus proof-verification vector-analysis laplacian image-processing
Can you show your computation of $\langle \nabla (\lVert \nabla f \rVert), u_\alpha \rangle$?
– Gibbs
Jul 24 at 8:51
I'm afraid it's wrong: The gradient of $\lVert \nabla f \rVert$ also contains mixed partials.
– Christian Blatter
Jul 24 at 8:52
@Gibbs it's there already, I'll add an edit explaining why I'm doing this.
– user8469759
Jul 24 at 8:59
@user8469759 I did the same computation without getting your result. It would be good to see your steps.
– Gibbs
Jul 24 at 9:10
@Gibbs The computation is wrong, as pointed out by Christian Blatter. For example $\partial_x \lVert \nabla f \rVert = \frac{f_{xx} + f_{xy}}{\lVert \nabla f \rVert}$; I forgot the mixed term. I should probably ask a different question about why I can't get the result of the quote.
– user8469759
Jul 24 at 9:16
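As a quick sanity check supporting the comments (this experiment is not part of the original thread; the test function and evaluation point are arbitrary choices), one can numerically compare the directional derivative of $\lVert \nabla f \rVert$ along the unit gradient direction with the Laplacian $f_{xx} + f_{yy}$ — for a generic smooth $f$ the two values disagree, so the derivation in the question cannot hold in general:

```python
# Numerical check: is the directional derivative of ||grad f|| along
# the gradient direction equal to the Laplacian?  (Test function and
# point are arbitrary, not from the question.)
import math

def f(x, y):
    return x**3 * y + math.sin(x * y)

def grad(x, y, h=1e-5):
    # central-difference gradient
    fx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    fy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return fx, fy

def grad_norm(x, y):
    fx, fy = grad(x, y)
    return math.hypot(fx, fy)

def dir_deriv_of_norm(x, y, h=1e-5):
    # derivative of ||grad f|| along the unit vector u_alpha = grad f / ||grad f||
    fx, fy = grad(x, y)
    n = math.hypot(fx, fy)
    ux, uy = fx / n, fy / n
    return (grad_norm(x + h * ux, y + h * uy)
            - grad_norm(x - h * ux, y - h * uy)) / (2 * h)

def laplacian(x, y, h=1e-4):
    # second central differences for f_xx + f_yy
    return ((f(x + h, y) - 2 * f(x, y) + f(x - h, y))
            + (f(x, y + h) - 2 * f(x, y) + f(x, y - h))) / h**2

# The two values differ at a generic point, confirming the comments.
print(dir_deriv_of_norm(1.0, 2.0), laplacian(1.0, 2.0))
```

The discrepancy comes exactly from the mixed partials $f_{xy}$ that appear in $\nabla \lVert \nabla f \rVert$ but not in the Laplacian.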
edited Jul 24 at 9:07
asked Jul 24 at 8:38
user8469759