"Autocorrelation" is used to compare a signal with a time-delayed version of itself. If a signal is periodic, then the signal will be perfectly correlated with a version of itself if the time-delay is an integer number of periods. That fact, along with related experiments, has implicated autocorrelation as a potentially important part of signal processing in human hearing.
Mathematically, for a continuous signal, s(t), the autocorrelation, R(τ), corresponding to a delay time τ is calculated using:

$$R(\tau) \;=\; \int s(t)\,s(t+\tau)\,dt\,.$$
Sometimes it is convenient to scale the overall amplitude of the result, for example so that the autocorrelation for τ = 0 has the value 1 -- i.e. R(0) = 1. With that choice, at τ = 0 the signal is "perfectly correlated": you are comparing the signal with an exact copy of itself. If you also get a value of (exactly) 1 for some larger value of τ, then you know that the signal delayed by the time τ is identical to the signal with no delay, and hence the signal must be periodic. When the autocorrelation amplitude is adjusted to a fixed value, such as 1, it is said to be "normalized."
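As a concrete illustration, here is a minimal sketch (in Python with NumPy, not part of the original notes) of how the normalized autocorrelation might be estimated for a sampled signal. The circular shift and the example sine wave are assumptions made just for this sketch.

```python
import numpy as np

def normalized_autocorrelation(s, max_lag):
    """Estimate R(tau)/R(0) for a sampled signal s at integer-sample lags.

    A circular (wrap-around) shift is used so every lag compares the same
    number of samples -- a reasonable approximation for a sustained,
    roughly periodic signal.
    """
    s = np.asarray(s, dtype=float)
    r0 = np.dot(s, s)   # R(0): the signal times an exact copy of itself
    return np.array([np.dot(s, np.roll(s, -k)) for k in range(max_lag + 1)]) / r0

# Example: a 100 Hz sine wave sampled at 8000 samples per second.
fs = 8000
t = np.arange(0, 0.1, 1 / fs)
s = np.sin(2 * np.pi * 100 * t)

R = normalized_autocorrelation(s, max_lag=fs // 100)
print(R[0])          # exactly 1 by construction
print(R[fs // 100])  # also 1: a delay of one period (80 samples) reproduces the signal
```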
Any sound can be written as a sum of sinusoidal functions. Sinusoidal functions include the sine function and the cosine function. More generally, you can write a sinusoidal function using a phase shift. A cosine wave is the same as a sine wave except with a phase shift. For an individual sinusoidal function, a phase shift is the same as a time delay.
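Written out for a single sinusoid of frequency f, the equivalence between a time delay Δt and a phase shift φ is simply

$$A\cos\!\bigl(2\pi f\,(t-\Delta t)\bigr) \;=\; A\cos\!\bigl(2\pi f\,t + \varphi\bigr)\,, \qquad \varphi = -2\pi f\,\Delta t\,.$$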
If you have a complex sustained sound, such as from a musical instrument,
then you can describe the sound by looking at the different frequency
components -- that is, think of the wave as a sum of sinusoidal
functions, and then look at each of those sinusoids separately. When you
write out the series of sinusoidal functions, you are writing out
the "Fourier Series" describing the sound. For such a series, each term
has an amplitude, A_n, and a phase, φ_n. One can
use sine functions, cosine functions, or an appropriate combination. When using
only cosines, it might be referred to as the "Fourier cosine series."
$$s(t) \;=\; \sum_{n} A_n \cos\!\bigl(2\pi\, n f_1\, t + \varphi_n\bigr)\,,$$

where f_1 is the fundamental frequency of the sound and n labels the harmonics.
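As a sketch of what such a sum looks like numerically (in Python; the harmonic numbering, the sign convention for the phase, and the example amplitudes are choices made for illustration, not taken from the text above):

```python
import numpy as np

def fourier_cosine_series(t, f1, amplitudes, phases):
    """Sum a finite Fourier cosine series with fundamental frequency f1.

    amplitudes[k] and phases[k] are taken to be A_n and phi_n for the
    harmonic n = k + 1.
    """
    s = np.zeros_like(t, dtype=float)
    for n, (A, phi) in enumerate(zip(amplitudes, phases), start=1):
        s += A * np.cos(2 * np.pi * n * f1 * t + phi)
    return s

# Example: a 100 Hz tone built from three harmonics.
fs = 8000
t = np.arange(0, 0.02, 1 / fs)
signal = fourier_cosine_series(t, 100.0, amplitudes=[1.0, 0.5, 0.25],
                               phases=[0.0, 0.3, -0.7])
```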
Strictly speaking, mathematical equality may be achieved only in the limit
of an infinite number of terms. For real sound signals, however, one can
always use a finite number of terms to get an approximation that is
"close enough" that no significant difference remains.
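One way to see the "close enough" idea numerically is to build a familiar waveform from a truncated series and watch the error shrink as terms are added. The square wave, its sine-series coefficients, and the RMS error measure below are illustrative choices only:

```python
import numpy as np

# Approximate a 100 Hz square wave by its first N odd harmonics
# (amplitudes 4/(pi*n)) and measure the RMS error of the truncated sum.
fs, f1 = 8000, 100.0
t = np.arange(0, 0.05, 1 / fs)
square = np.sign(np.sin(2 * np.pi * f1 * t))

for N in (3, 9, 29):
    n = np.arange(1, N + 1, 2)                       # odd harmonics only
    approx = (4 / np.pi) * np.sum(
        np.sin(2 * np.pi * f1 * np.outer(n, t)) / n[:, None], axis=0)
    rms = np.sqrt(np.mean((square - approx) ** 2))
    print(f"{N:3d} harmonics: RMS error = {rms:.3f}")
```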
The (normalized) autocorrelation can be easily computed if the amplitudes of the
Fourier Series components are known. Take the Fourier cosine series, set all
the phases to zero, and square all the amplitudes. To normalize as
described above, divide by
the value for τ = 0 (which is the sum of the squared amplitudes). That is,
$$R(\tau) \;=\; \frac{\sum_{n} A_n^{2}\,\cos\!\bigl(2\pi\, n f_1\, \tau\bigr)}{\sum_{n} A_n^{2}}\,.$$
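A short sketch of that recipe (Python; the fundamental frequency and the amplitudes are the same illustrative values used earlier):

```python
import numpy as np

def autocorrelation_from_amplitudes(tau, f1, amplitudes):
    """Normalized autocorrelation built from the Fourier amplitudes alone:
    R(tau) = sum_n A_n^2 cos(2*pi*n*f1*tau) / sum_n A_n^2.
    The phases phi_n never enter, which is the point of the recipe above.
    """
    A2 = np.asarray(amplitudes, dtype=float) ** 2
    n = np.arange(1, len(A2) + 1)
    return np.sum(A2[:, None] * np.cos(2 * np.pi * f1 * np.outer(n, tau)),
                  axis=0) / np.sum(A2)

# R(0) = 1, and R(0.01 s) = 1 because 0.01 s is one full period of 100 Hz.
tau = np.array([0.0, 0.005, 0.01])
print(autocorrelation_from_amplitudes(tau, 100.0, [1.0, 0.5, 0.25]))
```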
The autocorrelation for a signal is closely related to the power spectrum for that signal: the squared amplitudes, A_n^2, appearing above are proportional to the power in each harmonic -- that is, they form the power spectrum of the sound -- so the normalized autocorrelation is, in effect, a Fourier transform of the power spectrum.
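For a sampled signal, that relationship can be demonstrated directly with an FFT: the (circular) autocorrelation is the inverse transform of the power spectrum. The signal and sample rate below are arbitrary choices for the demonstration:

```python
import numpy as np

fs, f1 = 8000, 100.0
t = np.arange(0, 0.1, 1 / fs)
s = np.cos(2 * np.pi * f1 * t) + 0.5 * np.cos(2 * np.pi * 2 * f1 * t + 0.3)

power = np.abs(np.fft.rfft(s)) ** 2    # power spectrum of the sampled signal
R = np.fft.irfft(power, n=len(s))      # circular autocorrelation
R /= R[0]                              # normalize so that R(0) = 1

print(R[int(fs / f1)])                 # ~1 at a delay of one period of 100 Hz
```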
The autocorrelation has been implicated as part of the human hearing process because signals which are not periodic, but whose autocorrelations approach 1 for specific values of τ > 0, are often interpreted (i.e. "heard") as periodic, with a period corresponding to the value of τ where the autocorrelation comes closest to 1.
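A rough numerical illustration (with made-up partial frequencies): the three equal-amplitude partials below are not harmonics of any low-order fundamental, so the waveform does not repeat at any short period, yet the normalized autocorrelation comes close to -- without reaching -- 1 near τ ≈ 4.8 ms, suggesting a perceived pitch in the neighborhood of 210 Hz:

```python
import numpy as np

freqs = np.array([220.0, 420.0, 620.0])   # equal-amplitude partials (illustrative)
tau = np.linspace(0.001, 0.01, 20000)     # delays from 1 ms to 10 ms

# For equal amplitudes, R(tau) is the mean of cos(2*pi*f*tau) over the partials.
R = np.mean(np.cos(2 * np.pi * np.outer(freqs, tau)), axis=0)

k = np.argmax(R)
print(f"peak R = {R[k]:.3f} at tau = {tau[k] * 1000:.2f} ms "
      f"(a 'virtual' pitch near {1 / tau[k]:.0f} Hz)")
```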
Questions/comments to: suits @ mtu.edu.