Smoothing (mathematics)

In a mathematical context, smoothing means converting a curve into one with less curvature that at the same time deviates as little as possible from the original. In this sense, low-order approximating polynomials satisfy the smoothing requirement very well. Smoothing is often used synonymously with filtering. In contrast to smoothing, however, filtering in mathematics means removing certain components or features of a curve, usually frequency components or noise. Many, but not all, filters also have smoothing properties.

The method that fulfills the smoothing criterion most strictly is the Whittaker-Henderson method. It computes the optimum trade-off between smoothness (minimal mean squared nth derivative) and fidelity (minimal squared deviation from the original); the ratio between these two quantities is specified as a freely selectable parameter.
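As a minimal sketch of this trade-off (the function name, the dense-matrix formulation, and the default parameter values are illustrative assumptions, not the method's canonical form), the Whittaker-Henderson optimum can be computed by solving a penalized least-squares system:

```python
import numpy as np

def whittaker_smooth(y, lam=1600.0, order=2):
    """Whittaker-Henderson smoothing, minimal dense sketch.

    Minimizes  sum((y - z)**2) + lam * sum(diff(z, order)**2),
    i.e. a trade-off between fidelity to y and smoothness of z,
    with the ratio controlled by the free parameter lam.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    # nth-order finite-difference matrix, shape (n - order, n).
    D = np.diff(np.eye(n), n=order, axis=0)
    # The optimum solves the linear system (I + lam * D.T @ D) z = y.
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
```

Larger values of lam favor smoothness; as lam approaches 0 the original data are reproduced.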

Smoothing methods from statistics

Adjustment calculation (least-squares fitting)
Finds the best-fitting parameters for a given set of data and a given model
Regression analysis
Estimates relationships within the given data
Local regression
Regression analysis with local (mostly bell-shaped) weighting of the surrounding values
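A minimal sketch of local regression (the function name, the Gaussian weight function, and the bandwidth parameter are illustrative assumptions): a straight line is fitted by weighted least squares around each evaluation point, with bell-shaped weights emphasizing nearby values:

```python
import numpy as np

def local_linear_fit(x, y, x0, bandwidth=1.0):
    # x, y: 1-D NumPy arrays of data; x0: evaluation point.
    # Bell-shaped (Gaussian) weights centered on x0.
    w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)
    # Weighted linear least squares: scale both sides by sqrt(w).
    X = np.column_stack([np.ones_like(x, dtype=float), x - x0])
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta[0]  # value of the local fit at x0

# Smoothed curve: evaluate the local fit at every data point, e.g.
# smoothed = [local_linear_fit(x, y, xi) for xi in x]
```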

Smoothing methods from image and signal processing

LTI filter

Fourier analysis forms the theoretical basis for LTI filters. It decomposes a function into a series of sine functions of different frequencies; high frequencies can then be selectively removed from this frequency spectrum.
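A minimal sketch of this spectral approach (the function name and parameters are illustrative assumptions): compute the spectrum with the fast Fourier transform, zero out all bins above a cutoff frequency, and transform back:

```python
import numpy as np

def fft_lowpass(signal, cutoff_hz, sample_rate_hz):
    # Transform to the frequency domain (FFT for real-valued input).
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    # Selectively delete all components above the cutoff frequency.
    spectrum[freqs > cutoff_hz] = 0.0
    # Transform back to the time domain.
    return np.fft.irfft(spectrum, n=len(signal))
```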

It is not strictly necessary to compute the spectrum, however, because there is an equivalent way to filter frequencies out of a signal: the so-called convolution of the signal with a filter kernel (often simply called a filter). Example: convolution with the rectangular filter, which simply replaces the value at each point of the signal with the mean of its neighbors. More complex filters are characterized by the fact that they compute weighted means.
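A minimal sketch of this example (the function name and the window width are illustrative assumptions): convolution with a rectangular kernel of equal weights is exactly a moving average:

```python
import numpy as np

def boxcar_smooth(signal, width=5):
    # Rectangular kernel: `width` equal weights summing to 1.
    kernel = np.ones(width) / width
    # Convolving with this kernel replaces each sample by the
    # unweighted mean of its neighborhood.
    return np.convolve(signal, kernel, mode="same")
```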

In the context of one-dimensional signals such as audio or voltage curves, filters that suppress high frequencies are called low-pass filters. In the context of two-dimensional signals such as images, one speaks of blurring. Various such filters are available; they differ in the weights with which neighboring values are included in the mean. Some well-known filters are:

Rectangular filter
Its use can lead to artifacts, since it regularly shifts frequencies by half a period length.
Sinc filter
represents the ideal low pass; that is, it completely eliminates frequencies above the desired cutoff, while all others remain untouched.
Gaussian filter
attenuates frequencies more strongly the higher they are.
Exponential smoothing and moving averages
are often used for time series. In exponential smoothing, the weight of a value decays exponentially with its age; the most recent data carry the greatest weight.
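A minimal sketch of exponential smoothing (the function name and the default smoothing factor are illustrative assumptions): each output value is a weighted mean in which the influence of older observations decays geometrically:

```python
def exponential_smoothing(values, alpha=0.3):
    # alpha in (0, 1]: larger alpha gives more weight to recent data.
    smoothed = [values[0]]
    for v in values[1:]:
        # New estimate = alpha * observation + (1 - alpha) * old estimate,
        # so each observation's weight falls exponentially with its age.
        smoothed.append(alpha * v + (1 - alpha) * smoothed[-1])
    return smoothed
```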

Nonlinear filters

Since the general suppression of high frequencies also "blurs" edges, there are other methods that try to preserve them:

Rank-order filter
In contrast to the rectangular filter, it uses not the mean value but, for example, the median or the maximum of the neighboring values (see the sketch after this list).
Sigma filter
Reduces noise in images without distorting the edges.
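A minimal sketch of a rank-order filter using the median (the function name and the window width are illustrative assumptions): because the median ignores outliers, sharp edges survive where a mean-based filter would blur them:

```python
import numpy as np

def median_filter(signal, width=5):
    # Pad the borders by repeating the edge values.
    half = width // 2
    padded = np.pad(np.asarray(signal, dtype=float), half, mode="edge")
    # Replace each sample with the median of its neighborhood.
    return np.array([np.median(padded[i:i + width])
                     for i in range(len(signal))])
```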

References

  1. Whittaker, E. T.: On a new method of graduation. In: Proceedings of the Edinburgh Mathematical Society 41 (1923), pp. 63–75, doi:10.1017/S0013091500077853.
  2. The Whittaker-Henderson method is also known in economics as the Hodrick-Prescott filter and, according to this reference, goes back to the astronomer Schiaparelli (1867). R. J. Hodrick, E. C. Prescott: Postwar U.S. Business Cycles: An Empirical Investigation. In: Journal of Money, Credit & Banking 29 (1997), No. 1, pp. 1–16, JSTOR 2953682.