Mathematical proof that white noise has zero autocorrelation
I would like to prove, using the definition of autocorrelation, that Gaussian and uniform white noise have zero autocorrelation. I am working on the continuous case (but I think discrete shouldn't be too different).
Using the definition of autocorrelation given in equation 9 of this link (https://courses.edx.org/asset-v1:KyotoUx+009x+2T2017+type@asset+block@009x_31.pdf), we have:
\begin{equation}
\phi_Y(t) = \lim_{T\to \infty} \frac{1}{T} \int_{-\infty}^{\infty} d\tau\, Y_T(\tau)\, Y_T(\tau + t)
\end{equation}
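For intuition, here is a quick numerical sketch (my own check, assuming NumPy and a discretized version of this time average, where the number of samples plays the role of $T$) showing that the estimate is large at lag $0$ and close to zero at every other lag:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000                       # number of samples, playing the role of T
A = 2.0
y = A * rng.standard_normal(N)    # sampled Gaussian white noise with amplitude A

# (1/N) * sum_tau y(tau) y(tau + t) for a few lags t
lags = np.arange(0, 6)
phi = np.array([np.dot(y[:N - t], y[t:]) / N for t in lags])

print(phi)   # phi[0] is close to A**2 = 4, the other lags are near 0
```

In discrete time the spike at lag $0$ is a Kronecker delta rather than a Dirac $\delta(t)$, but the picture is the same.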
We want to prove that $\phi_Y(t) = A^2\delta(t)$.
Let $Y_T(t) = A \xi(t)$ be white noise. If it is uniform white noise, I think $\xi(t) = \frac{1}{T}$ for $t \in (-T/2, T/2)$ and zero otherwise. If we substitute this into the definition, we get:
$$\phi_Y(t) = A^2 \lim_{T\to \infty} \frac{1}{T} \int_{-\infty}^{\infty} d\tau\, \frac{1}{T^2}$$
but this is not the $\delta(t)$ function. Where am I wrong? See the bottom for a better attempt, using the time representation.
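For reference, here is a small numerical sketch (my own check, assuming NumPy) of the time average above with the rectangular $\xi$, keeping the support $(-T/2, T/2)$ explicit; numerically it comes out as the triangle $A^2(T - |t|)/T^3$ for $|t| < T$, which shrinks as $T$ grows instead of becoming a $\delta(t)$:

```python
import numpy as np

A, T = 1.0, 10.0
dtau = 0.001
tau = np.arange(-T, T, dtau)      # grid wide enough to contain the pulse

def Y(x):
    # Y_T(x) = A * xi(x), with xi = 1/T on (-T/2, T/2) and zero otherwise
    return A * np.where(np.abs(x) < T / 2, 1.0 / T, 0.0)

for t in (0.0, 1.0, 5.0, 20.0):
    phi = np.sum(Y(tau) * Y(tau + t)) * dtau / T
    print(t, phi)   # matches A**2 * (T - abs(t)) / T**3 for abs(t) < T, and 0 beyond
```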
I also tried with Gaussian white noise. Gaussian noise is given by $Y_T(t) = \frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{(t - \mu)^2}{2\sigma^2}}$; substituting this into the definition of the autocorrelation gives:
$$\phi_Y(t) = \lim_{T\to \infty} \frac{1}{2\pi\sigma^2 T} \int_{-\infty}^{\infty} d\tau\, e^{-\frac{(\tau - \mu)^2}{2\sigma^2}}\, e^{-\frac{(\tau + t - \mu)^2}{2\sigma^2}}$$
which can be rewritten as
$$\phi_Y(t) = \lim_{T\to \infty} \frac{1}{2\pi\sigma^2 T} \int_{-\infty}^{\infty} d\tau\, e^{-\frac{(\tau - \mu)^2}{\sigma^2}}\, e^{-\frac{t(t + 2\tau - 2\mu)}{2\sigma^2}}$$
Under some conditions I would expect the $\delta(t)$ function to come up, but I don't see where.
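As a quick sanity check on that last expression, here is a numerical sketch (my own, assuming NumPy) of the integral; it evaluates to $\sqrt{\pi}\,\sigma\, e^{-t^2/(4\sigma^2)}$, a Gaussian in $t$ of fixed width, so with the $\frac{1}{T}$ prefactor the whole limit is just zero and no $\delta(t)$ shows up:

```python
import numpy as np

mu, sigma = 0.3, 1.2
dtau = 0.001
tau = np.arange(-20.0, 20.0, dtau)     # wide enough for the integrand to decay

def product_integral(t):
    integrand = np.exp(-(tau - mu)**2 / (2 * sigma**2)) \
              * np.exp(-(tau + t - mu)**2 / (2 * sigma**2))
    return np.sum(integrand) * dtau

for t in (0.0, 0.5, 2.0):
    numeric = product_integral(t)
    closed  = np.sqrt(np.pi) * sigma * np.exp(-t**2 / (4 * sigma**2))
    print(t, numeric, closed)   # the integral is a Gaussian in t, not a delta
```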
Edit
The frequency representation of uniform white noise is a constant probability $\tilde{\xi}(\omega) = p \in (0, 1)$. However, to obtain the time representation we must take the inverse Fourier transform:
\begin{equation}
Y_T(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} d\omega\, e^{-i\omega \tau}\, A p = A p\, \delta(\tau).
\end{equation}
Now we substitute this into the definition of $\phi_Y(t)$ and get
\begin{equation}
\phi_Y(t) = \lim_{T\to \infty} \frac{1}{T} \int_{-\infty}^{\infty} d\tau\, A p\, \delta(\tau)\, A p\, \delta(\tau + t).
\end{equation}
Now, I know that $\int_{-\infty}^{\infty} d\tau\, A p\, \delta(\tau)\, A p\, \delta(\tau + t) = (Ap)^2 \delta(t)$. However, I am still confused by the term $\lim_{T\to\infty}\frac{1}{T}$ and by the $p$ being introduced (although this might just mean amplitude, I guess).
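To connect this with the sinc mentioned in the comments, here is a small numerical sketch (my own, assuming NumPy) in which the flat spectrum is cut off at $|\omega| < \Omega$; the inverse transform is then $\frac{A p \sin(\Omega\tau)}{\pi\tau}$, a sinc that narrows and grows toward $A p\, \delta(\tau)$ as $\Omega \to \infty$:

```python
import numpy as np

A, p = 1.0, 0.5
domega = 0.01
taus = np.array([0.0, 0.1, 0.5, 2.0])

for Omega in (10.0, 100.0):
    omega = np.arange(-Omega, Omega, domega)
    # (1/2pi) * integral over |omega| < Omega of A p e^{-i omega tau}, done numerically
    numeric = [np.sum(A * p * np.exp(-1j * omega * t)).real * domega / (2 * np.pi)
               for t in taus]
    closed = A * p * Omega / np.pi * np.sinc(Omega * taus / np.pi)  # = A p sin(Omega tau)/(pi tau)
    print(Omega, np.round(numeric, 3), np.round(closed, 3))
```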
probability-distributions signal-processing correlation
One issue in the first approach is that you didn't change your limits of integration. Note that your definition has inherent bounds on the domain. Moreover, you're looking at the frequency representation of white noise but your integral is over time. You would need to take the Fourier transform to get the time representation. The time representation should be a sinc. As for the second, you should try actually doing the integral, then take a limit. Here's a hint: take a Fourier transform. Something might be slightly off since you're missing $T$ dependence.
– Cameron Williams
Jul 24 at 10:36
Thanks! I am a bit confused about why I should change the limits of integration, since I have not done a change of variable. Good point on the frequency/time representation, I'll work on that. For the second, I think there is an extra restriction Gaussian noise has to satisfy to be white, which probably introduces the $T$ dependence?
– RM-
Jul 24 at 10:59
Made an edit using your hint: a Fourier transform to pass to the time domain. I still think there is something off with the $T$.
– RM-
Jul 24 at 14:45