Introduction
When processing and analyzing signals, the noise component makes it difficult to draw meaningful conclusions from individual signal samples. To cope adequately with this uncertainty, such signals are better regarded as random signals: their exact outcome in time is unknown, but conclusions can still be drawn from the statistical properties of the signal.
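As a minimal sketch of this idea, the following example generates many realizations of a deterministic sinusoid buried in additive Gaussian noise. All parameters (frequency, noise level, ensemble size) are illustrative assumptions, not from the module. Any single sample is unpredictable, yet an ensemble statistic such as the mean recovers the underlying structure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters for illustration only
n_realizations, n_samples = 1000, 64
n = np.arange(n_samples)
clean = np.sin(2 * np.pi * 0.05 * n)                    # deterministic part
noise = rng.normal(0.0, 0.5, (n_realizations, n_samples))
x = clean + noise                                       # ensemble of random signals

# Ensemble mean: average over realizations at each time index
ensemble_mean = x.mean(axis=0)
print(np.max(np.abs(ensemble_mean - clean)))            # small for a large ensemble
```

The deviation of the ensemble mean from the clean signal shrinks roughly as the inverse square root of the number of realizations.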
The following video gives an overview of all the topics treated in the module. Starting from a couple of examples of random processes, it describes the statistical tools that can be used to characterize them. Finally, it introduces two important properties of random processes: stationarity and ergodicity. It is recommended to watch the video before going through the module, and once more after completing it.
Screencast video [⯈]
Module overview
This module will cover the following topics:
- Random signals - This section introduces the concept of random signals, discusses several signal statistics, and elaborates on some of their properties.
- Stationarity and ergodicity - In all practical applications, the exact signal statistics are unknown. They therefore need to be approximated from a limited set of data. The assumptions of stationarity and ergodicity permit calculation of signal statistics from a single observation of a random signal.
- Power spectral density - Analysis of second-order statistics, such as the auto-correlation function and the power spectral density, can provide important insights into a random signal. From the Wiener-Khinchin theorem it follows that the power spectral density of a random signal is given by the Fourier transform of its auto-correlation function.
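The ergodicity assumption mentioned above can be illustrated numerically. In this sketch (an assumed i.i.d. Gaussian process with a chosen true mean, purely for illustration), the time average over one long realization approximates the ensemble average taken across many realizations at a single time instant:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed process: i.i.d. Gaussian samples with true mean 2.0 (illustrative)
true_mean = 2.0
n = 100_000

# Time average from a single long realization
one_realization = rng.normal(true_mean, 1.0, n)
time_average = one_realization.mean()

# Ensemble average: many realizations, one time instant each
ensemble = rng.normal(true_mean, 1.0, (n, 1))
ensemble_average = ensemble[:, 0].mean()

print(time_average, ensemble_average)   # both close to 2.0
```

For a stationary and ergodic process these two estimates converge to the same value, which is what justifies estimating statistics from a single observed signal.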
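The Wiener-Khinchin relation can also be checked numerically for a single realization: the periodogram equals the Fourier transform of the biased sample auto-correlation, evaluated on the same frequency grid. The white-noise signal and its length below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed test signal: white Gaussian noise of length N (illustrative)
N = 256
x = rng.normal(0.0, 1.0, N)

# Biased sample auto-correlation at lags -(N-1) .. (N-1)
r = np.correlate(x, x, mode="full") / N

# Fourier transform of the auto-correlation (lag 0 shifted to index 0)
psd_from_acf = np.fft.fft(np.fft.ifftshift(r)).real

# Periodogram |X(f)|^2 / N, zero-padded to the same 2N-1 frequency bins
periodogram = np.abs(np.fft.fft(x, 2 * N - 1)) ** 2 / N

print(np.allclose(psd_from_acf, periodogram))
```

The two spectra agree to machine precision, since the periodogram is, by construction, the discrete Fourier transform of the biased auto-correlation estimate.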