What does the geometric probability distribution mean?

Contents

» Preliminary remark
» Modeling
» Examples
» Remarks

Preliminary remark

The geometric probability distribution can be derived from the binomial distribution or simply by considering the tree diagram. It is also based on a Bernoulli experiment, which means that there are only two possible outcomes and a constant "hit" probability \(p\).

The geometric probability distribution is also called a waiting time distribution: it gives the probability that the first success occurs on the \(n\)-th trial.

Modeling

If we look at rolling a die, the geometric distribution gives the probability of getting a 6 for the first time on the \(n\)-th roll. We can easily read this off a tree diagram.

[A picture will be inserted here soon]

We conclude from this:

\(n = 1\): \(P(X = 1) = \frac{1}{6}\)
\(n = 2\): \(P(X = 2) = \frac{5}{6} \cdot \frac{1}{6}\)
\(n = 3\): \(P(X = 3) = (\frac{5}{6})^2 \cdot \frac{1}{6}\)
\(n = 4\): \(P(X = 4) = (\frac{5}{6})^3 \cdot \frac{1}{6}\)
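As a quick check of these values, here is a minimal Python sketch (not part of the original text) that simply multiplies the branch probabilities along the tree diagram: \(n-1\) branches with probability \(\frac{5}{6}\), followed by one branch with probability \(\frac{1}{6}\).

from fractions import Fraction

p = Fraction(1, 6)   # probability of rolling a 6
q = 1 - p            # probability of "no 6"

# Multiply along the tree: n-1 "no 6" branches, then one "6" branch.
for n in range(1, 5):
    prob = q ** (n - 1) * p
    print(f"P(X = {n}) = {prob} = {float(prob):.4f}")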

 

The formula

The number of attempts \(X\) needed to get the first hit in a Bernoulli experiment with hit probability \(p\) is called geometrically distributed; the probability of needing exactly \(n\) attempts is given by
\begin{align*}
P(X = n) = (1-p)^{n-1} \cdot p.
\end{align*}

The geometric distribution also has a relatively simple explicit cumulative distribution function \(F\):
\begin{align*}
P(X \leq n) = F(n) = 1 - (1-p)^n.
\end{align*}
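Both formulas are easy to try out in Python. The following is a minimal sketch (the helper names geom_pmf and geom_cdf are my own, not from the text); it also checks that the closed-form CDF agrees with the partial sums of the individual probabilities.

def geom_pmf(n: int, p: float) -> float:
    """P(X = n): first hit exactly on the n-th trial."""
    return (1 - p) ** (n - 1) * p

def geom_cdf(n: int, p: float) -> float:
    """P(X <= n): first hit within the first n trials (closed form)."""
    return 1 - (1 - p) ** n

p = 1 / 6
for n in (1, 2, 3, 10):
    partial_sum = sum(geom_pmf(k, p) for k in range(1, n + 1))
    assert abs(partial_sum - geom_cdf(n, p)) < 1e-12
    print(f"P(X <= {n}) = {geom_cdf(n, p):.4f}")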

Expectation and variance

The expected value \(E(X)\) of a geometrically distributed random variable is given by
\begin{align*}
E(X) = \frac{1}{p}.
\end{align*}
The variance is given by
\begin{align*}
Var(X) = \frac{1-p}{p^2}.
\end{align*}
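A short simulation sketch makes both values concrete (assuming the die example with \(p = \frac{1}{6}\)); the sample mean and sample variance should land close to \(E(X) = \frac{1}{p} = 6\) and \(Var(X) = \frac{1-p}{p^2} = 30\).

import random
import statistics

def first_hit(p: float) -> int:
    """Number of trials until the first success (the geometric waiting time)."""
    n = 1
    while random.random() >= p:   # failure with probability 1 - p
        n += 1
    return n

p = 1 / 6
samples = [first_hit(p) for _ in range(100_000)]
print(f"sample mean    : {statistics.mean(samples):.3f}  (theory: 1/p = 6)")
print(f"sample variance: {statistics.variance(samples):.3f}  (theory: (1-p)/p^2 = 30)")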

Examples

Light bulbs: A light bulb becomes defective at each switch-on with probability \(p = \frac{1}{1000}\). According to the manufacturer, it has a lifespan of 3000 switch-ons. Is this statement true for 90 percent of the light bulbs?

Solution:

We are interested in the probability that the defect does not occur before the 3001st switch-on, i.e. \(P(X \geq 3001)\), because we have no reason to complain if the first defect only occurs on the 3001st switch-on or later. Using the complementary event together with the cumulative distribution function gives
\begin{align*}
P(X \geq 3001) &= 1 - P(X \leq 3000) \\
&= 1 - \big(1 - (1 - \frac{1}{1000})^{3000}\big) \\
&= (1 - \frac{1}{1000})^{3000} \approx 0.0497.
\end{align*}
So only about 4.97 percent of the bulbs survive the first 3000 switch-ons without a defect; the manufacturer's statement is therefore not true for 90 percent of the light bulbs.
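The numbers can be checked with a few lines of Python (a sketch, values rounded):

p = 1 / 1000
survive = (1 - p) ** 3000        # P(X >= 3001): no defect during the first 3000 switch-ons
print(f"P(X >= 3001) = {survive:.4f}")        # about 0.0497
print(f"P(X <= 3000) = {1 - survive:.4f}")    # about 0.9503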

The "three times at least" problem: this type of exercise has earned its nickname because the wording is usually very similar, with "at least" appearing three times. Ultimately, however, it is an algebra problem in the "guise of a probability distribution"; we already solved a similar example with the binomial distribution here: How many times do you have to roll a die, at the least, so that at least one 6 is rolled with a probability of at least 95 percent?

Solution:

So we want the probability of rolling a 6 within the first \(n\) attempts to be at least \(0.95\). That means
\begin{align*}
P(X \leq n) &\geq 0.95 \\
1 - (1 - \frac{1}{6})^n &\geq 0.95 \\
(\frac{5}{6})^n &\leq 0.05 \\
n \cdot \log(\frac{5}{6}) &\leq \log(0.05) \\
n &\geq \frac{\log(0.05)}{\log(\frac{5}{6})} \approx 16.43,
\end{align*}
where the inequality turns around in the last step because we divide by \(\log(\frac{5}{6}) < 0\).

So we have to roll the die at least 17 times in order to have at least a 95 percent chance of rolling at least one 6.
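A quick cross-check in Python (a sketch; math.ceil rounds the bound 16.43 up to the next whole number of rolls):

import math

p = 1 / 6
bound = math.log(0.05) / math.log(1 - p)   # the 16.43 from the calculation above
print(f"n >= {bound:.2f} -> {math.ceil(bound)} rolls")

# Direct search as a second check.
n = 1
while 1 - (1 - p) ** n < 0.95:
    n += 1
print("direct search:", n)   # 17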

Remarks

Due to the relationship to the binomial distribution, many textbooks (and this text) write \(P(X = n)\) instead of \(P(X = x)\) or \(P(X = k)\), since the geometric distribution is "an offshoot of the binomial distribution with \(n\) and \(p\)"; unlike with the binomial distribution, however, it is the time of the first hit that interests us.

Incidentally, the cumulative distribution function can be derived from the finite geometric series; it holds that
\begin{align*}
F(n) = P(X \leq n) = 1 - (1-p)^n.
\end{align*}
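Written out, this uses the finite geometric series \(\sum_{k=0}^{n-1} q^k = \frac{1-q^n}{1-q}\) with \(q = 1-p\):
\begin{align*}
P(X \leq n) = \sum_{k=1}^{n} (1-p)^{k-1} \cdot p = p \cdot \frac{1 - (1-p)^n}{1 - (1-p)} = 1 - (1-p)^n.
\end{align*}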