What is the Gaussian or normal distribution?

Normal distribution

One of the most important continuous probability distributions is the normal distribution. It was first analyzed by Abraham de Moivre and later by Carl Friedrich Gauß. Gauss's contribution was so fundamental that the normal distribution is often also called the Gaussian distribution. Because of its characteristic shape it is sometimes simply referred to as the bell curve, even though many other distribution functions also have bell-shaped graphs.

The possible uses of the normal distribution are so numerous that it can be described as the "Swiss Army Knife" of statistics.

Definition

The appearance and properties of the normal distribution are determined by two parameters:

  • The expected value µ. It determines where the normal distribution has its maximum.
  • The variance σ². The square root of the variance, σ, is the standard deviation.

The total area enclosed by the curve of the normal distribution (that is, the integral from -∞ to ∞) is always 1.
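
Written out, the density function determined by these two parameters is:

$$ f(x) \,=\, \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}} $$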

Application

Intelligence, height (within a single gender), even social skills: all of these values are approximately normally distributed. This means, for example, that most people are of average height and very few are very tall or very short. Even income is approximately normally distributed once the data have been log-transformed.

The normal distribution is the most important distribution in statistics and is used in the natural sciences as well as in the humanities and in economics. It is mostly applied when the actual distribution function underlying the data is unknown. One reason for the importance of the normal distribution is the central limit theorem.

The Central Limit Theorem

Main article: Central limit theorem.

The central limit theorem states that the average of a large number of observed random variables drawn from the same distribution will be approximately normally distributed, regardless of the distribution from which they were drawn. This is why physical quantities that arise as the sum of many different sub-processes (measurement errors, for example) often have a distribution function that approximately corresponds to the normal distribution.

If you take a sufficiently large sample from a population, the mean of the sample will approximately correspond to the mean of the population. In addition, the means of such samples will be approximately normally distributed, with a variance equal to the variance of the population divided by the sample size.
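
The following Python sketch illustrates this behaviour; it assumes NumPy is available, and the choice of an exponential population, the sample size of 100 and the number of samples are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Population: exponential distribution (clearly not normal),
# with population mean 1 and population variance 1.
n = 100            # size of each sample
n_samples = 10_000 # number of samples drawn

# Draw many samples and compute the mean of each one.
samples = rng.exponential(scale=1.0, size=(n_samples, n))
sample_means = samples.mean(axis=1)

print("mean of sample means:", sample_means.mean())      # close to 1 (population mean)
print("variance of sample means:", sample_means.var())   # close to 1/n = 0.01
```

The histogram of `sample_means` looks approximately bell-shaped even though the underlying exponential population is strongly skewed.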

Another reason for the popularity of the normal distribution is that other quantities can be derived analytically once it is assumed as the underlying distribution function. One example is error propagation.

Standard normal distribution

The simplest case occurs when µ = 0 and σ² = 1. For these values, the normal distribution is also called the standard normal distribution.
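
For reference, its density function, to which the following remarks about the pre-factor and the exponent refer, is:

$$ \varphi(x) \,=\, \frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}x^2} $$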

The pre-factor ensures that the entire area under the curve (and thus also the integral from -∞ to ∞) is exactly 1. The ½ in the exponent of the exponential function gives the normal distribution a unit variance (and thus also a unit standard deviation). The axis of symmetry of the function is at x = 0, where it also reaches its maximum value. The two inflection points are at x = 1 and x = -1.

General normal distribution

Every normal distribution is a variant of the standard normal distribution. To obtain its function values, the standard normal distribution has to be changed in two ways:

  • The standard normal distribution must be scaled by the factor 1/σ. This ensures that the area enclosed by the graph remains 1.
  • The argument of the standard normal distribution is z-transformed (also called z-standardized): the value x is replaced by z = (x - µ)/σ. This transformation maps the value onto the corresponding value of a normal distribution with expected value zero and variance one. Combining both steps gives the general density function shown below.
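
Putting both steps together, the density of a general normal distribution can be expressed in terms of the standard normal density ϕ:

$$ f(x) \,=\, \frac{1}{\sigma}\,\varphi\!\left(\frac{x-\mu}{\sigma}\right) \,=\, \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}} $$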

Notation

The standard normal distribution is often denoted in the literature by the lowercase Greek letter ϕ (phi). The variant form of this letter, φ, is also used.

However, if you want to specify the normal distribution together with its parameters for the expected value and the variance, you write N(µ, σ²). Therefore, if a random variable X with an expectation of µ and a variance of σ² is normally distributed, one writes:

$$ X \sim \mathcal{N}(\mu,\, \sigma^2) $$

Distribution function of the normal distribution

The distribution function of the normal distribution gives the enclosed area under the density function (hence the integral) from -∞ up to the value x. It has an S-shaped (sigmoid) graph.
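
For a normal distribution with parameters µ and σ², the distribution function can be written with the help of the Gaussian error function erf (introduced below):

$$ F(x) \,=\, \int_{-\infty}^{x} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(t-\mu)^2}{2\sigma^2}}\, dt \,=\, \frac{1}{2}\left[1 + \operatorname{erf}\!\left(\frac{x-\mu}{\sigma\sqrt{2}}\right)\right] $$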

Φ(x) is the symbol for the distribution function of the standard normal distribution. Below are the graphs of the distribution functions of four normal distributions for different values of µ and σ.

erf(x) is the Gaussian error function. It belongs to the special functions and can only be represented as an infinite series or as an (equally infinite) continued fraction (see the definition below). It is usually evaluated with a computer or a pocket calculator with predefined functions, so in-depth knowledge of how to compute it is normally not necessary.
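
One common representation of the error function, as an integral and as the infinite series mentioned above, is:

$$ \operatorname{erf}(x) \,=\, \frac{2}{\sqrt{\pi}} \int_{0}^{x} e^{-t^2}\, dt \,=\, \frac{2}{\sqrt{\pi}} \sum_{n=0}^{\infty} \frac{(-1)^n\, x^{2n+1}}{n!\,(2n+1)} $$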

Properties

The normal distribution is symmetric, with x = µ forming the axis of symmetry. Although the values of the normal distribution approach zero asymptotically (on both sides), the normal distribution is never exactly 0 for any value of x.

The normal distribution already reaches values close to zero for values of x that are only a few standard deviations away from the expected value. It is therefore not necessarily the distribution of choice if a larger number of outliers (values several standard deviations from the expected value) is to be expected. The method of least squares and other methods of statistical inference, which work optimally for normally distributed variables, give very unreliable results in such cases. If this is the case, heavy-tailed distributions should be used instead.

The shape of the density function is completely determined by the standard deviation σ. The smaller σ is, the steeper the peak of the function around the expected value; the larger σ is, the flatter the graph.

The parameter µ, on the other hand, shifts the normal distribution along the x-axis. This is also self-explanatory, since the normal distribution always has its maximum at µ.

The changes of the standard deviation σ and of the expected value µ, and their effects on the graph of the normal distribution, are summarized again in the graphics below:
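
A minimal Python sketch of this (assuming NumPy and Matplotlib are available; the parameter combinations are chosen only for illustration) plots the density for several values of µ and σ:

```python
import numpy as np
import matplotlib.pyplot as plt

def normal_pdf(x, mu, sigma):
    """Density of the normal distribution with expected value mu and standard deviation sigma."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-6, 6, 500)

# mu shifts the curve along the x-axis, sigma controls how steep or flat it is.
for mu, sigma in [(0, 1), (0, 2), (2, 1), (-2, 0.5)]:
    plt.plot(x, normal_pdf(x, mu, sigma), label=f"µ={mu}, σ={sigma}")

plt.legend()
plt.show()
```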

Other properties

A normal distribution with an arbitrary expected value µ and an arbitrary standard deviation σ has the following properties:

  • It is symmetric, with the vertical axis of symmetry at x = µ, which is also the mode, median and expected value of the distribution.
  • It is unimodal (it has only one peak).
  • It reaches its maximum at the point x = µ.
  • Its first derivative is positive for values of x < µ and negative for values of x > µ; at the point x = µ the first derivative is zero.
  • It has exactly two inflection points: both are exactly one standard deviation away from the expected value, namely at x1 = µ - σ and x2 = µ + σ.
  • It is differentiable at every point x.
  • It is continuous and defined from -∞ to ∞.

Special properties of the standard normal distribution

The standard normal distribution, as a special variant of the normal distribution, also has the following properties:

  • Its first derivative ϕ′(x) is equal to -x·ϕ(x).
  • Its second derivative ϕ″(x) is equal to (x² - 1)·ϕ(x). A short derivation of both is given below.
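
Both follow directly from differentiating the density of the standard normal distribution with the chain and product rules:

$$ \varphi'(x) \,=\, \frac{d}{dx}\!\left[\frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}x^2}\right] \,=\, -x\,\varphi(x), \qquad \varphi''(x) \,=\, \frac{d}{dx}\!\left[-x\,\varphi(x)\right] \,=\, (x^2 - 1)\,\varphi(x) $$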

The normal distribution as an approximation to the binomial distribution

If n is sufficiently large (say n > 20), the skewness of the distribution is small enough for the normal distribution to be used as an approximation to the binomial distribution B(n, p). In this case the parameters of the normal distribution are chosen as µ = np and σ² = np(1 - p).

In general, the larger n is, the better the normal distribution approximates the binomial distribution. At the same time, p should not be close to 0 or 1, but rather close to 0.5. There are a number of rules of thumb that help to judge whether n and p were chosen adequately for the normal distribution to be used as an approximation (a numerical comparison follows after this list):

  • One rule of thumb is that n·p and n·(1 - p) must each be greater than 5. However, there are also sources that give 4 or other numbers as the minimum value. In general, it also depends on how good the approximation should be. Some sources therefore give a minimum value of 10, a value high enough that the convergence effects become clearly visible: for n → ∞ the values of the binomial distribution converge to the values of the normal distribution.
  • Another rule of thumb states that the normal distribution can be used to approximate the binomial distribution if n > 5 and the skewness of the binomial distribution is small, i.e. if |√((1 - p)/p) - √(p/(1 - p))| / √n < 0.3.
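
A small numerical sketch of the approximation, assuming SciPy is available; the parameters n = 50, p = 0.4, the evaluation point k = 25 and the continuity correction of 0.5 are illustrative choices, not prescribed by the rules above:

```python
from scipy.stats import binom, norm

n, p = 50, 0.4                      # binomial parameters
mu = n * p                          # mean of the approximating normal distribution
sigma = (n * p * (1 - p)) ** 0.5    # its standard deviation

k = 25
exact = binom.cdf(k, n, p)                       # P(X <= 25) under B(n, p)
approx = norm.cdf(k + 0.5, loc=mu, scale=sigma)  # normal approximation with continuity correction

print(f"exact:  {exact:.4f}")
print(f"approx: {approx:.4f}")
```

The two printed values agree to roughly two decimal places, which is typical for n·p and n·(1 - p) well above 5.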

68-95-99.7 rule

P(µ - σ ≤ X ≤ µ + σ) ≈ 0.6827
P(µ - 2σ ≤ X ≤ µ + 2σ) ≈ 0.9545
P(µ - 3σ ≤ X ≤ µ + 3σ) ≈ 0.9973

The 68-95-99.7 rule states that for a normal distribution almost all values fall within three standard deviations of the mean. Approximately 68.27% of the values lie within one standard deviation of the mean, approximately 95.45% within two standard deviations, and about 99.73% within three standard deviations of the mean.

This rule applies to all normal distributions - regardless of the expected value and the standard deviation.

Example

Human height is approximately normally distributed within a single gender. According to statistics from the Socio-Economic Panel (SOEP) from 2006, the expected average height µ for women in Germany is 165.4 cm, and the standard deviation σ is 4.5 cm.

From the 68-95-99.7 rule it follows that

  • 68% of all German women have a height between 160.9 cm (µ - σ) and 169.9 cm (µ + σ)
  • 95% of all German women have a height between 156.4 cm (µ - 2σ) and 174.4 cm (µ + 2σ)
  • 99.7% of all German women have a height between 151.9 cm (µ - 3σ) and 178.9 cm (µ + 3σ)

The figure on the right shows the 68-95-99.7 rule again graphically.
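
These intervals can be checked numerically, for example with SciPy (a sketch assuming SciPy is available; µ and σ are the SOEP values given above):

```python
from scipy.stats import norm

mu, sigma = 165.4, 4.5  # height of German women (SOEP 2006), in cm

for k in (1, 2, 3):
    lower = mu - k * sigma
    upper = mu + k * sigma
    # Probability mass between mu - k*sigma and mu + k*sigma
    prob = norm.cdf(upper, loc=mu, scale=sigma) - norm.cdf(lower, loc=mu, scale=sigma)
    print(f"within {k}σ: {lower:.1f} cm to {upper:.1f} cm -> {prob:.4f}")
```

The printed probabilities are approximately 0.6827, 0.9545 and 0.9973, matching the 68-95-99.7 rule.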

Table of the proportion of values within a given distance (in standard deviations) from the expected value

Range     | Proportion within range | Expected frequency outside range | Approximate frequency for a daily event
μ ± 1σ    | 0.682689492137086       | 1 in 3                           | twice a week
μ ± 1.5σ  | 0.866385597462284       | 1 in 7                           | weekly
μ ± 2σ    | 0.954499736103642       | 1 in 22                          | every three weeks
μ ± 2.5σ  | 0.987580669348448       | 1 in 81                          | quarterly
μ ± 3σ    | 0.997300203936740       | 1 in 370                         | yearly
μ ± 3.5σ  | 0.999534741841929       | 1 in 2,149                       | every six years
μ ± 4σ    | 0.999936657516334       | 1 in 15,787                      | every 43 years (twice in a lifetime)
μ ± 4.5σ  | 0.999993204653751       | 1 in 147,160                     | every 403 years
μ ± 5σ    | 0.999999426696856       | 1 in 1,744,278                   | every 4,776 years (once in recorded history)
μ ± 5.5σ  | 0.999999962020875       | 1 in 26,330,254                  | every 72,090 years
μ ± 6σ    | 0.999999998026825       | 1 in 506,797,346                 | every 1.38 million years (age of mankind)
μ ± 6.5σ  | 0.999999999919680       | 1 in 12,450,197,393              | every 34 million years
μ ± 7σ    | 0.999999999997440       | 1 in 390,682,215,445             | every billion years
μ ± xσ    | erf(x/√2)               | 1 in 1/(1 - erf(x/√2))           | every 1/(1 - erf(x/√2)) days
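
The "proportion within range" and "expected frequency outside range" columns can be recomputed directly from the error function; a short sketch using Python's standard math module (the selection of k values is illustrative):

```python
import math

for k in [1, 1.5, 2, 2.5, 3, 3.5, 4]:
    inside = math.erf(k / math.sqrt(2))   # proportion of values within mu ± k*sigma
    outside = 1 / (1 - inside)            # expected frequency outside the range: "1 in ..."
    print(f"µ ± {k}σ: {inside:.15f}  (1 in {outside:,.0f})")
```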

