Binomial distribution of a discrete random variable.

  • Date: 22.09.2019

In this and the next few notes we will consider mathematical models of random events. A mathematical model is a mathematical expression representing a random variable. For discrete random variables, this expression is known as the distribution function.

If the problem allows you to write an explicit mathematical expression for a random variable, you can calculate the exact probability of any of its values, and hence calculate and list all values of the distribution function. Business, sociological, and medical applications involve a variety of distributions of random variables. One of the most useful is the binomial distribution.

The binomial distribution is used to model situations characterized by the following features.

  • The sample consists of a fixed number n of elements, each representing the outcome of a trial.
  • Each sample element belongs to one of two mutually exclusive categories that cover the entire sample space. Typically, these two categories are called success and failure.
  • The probability of success p is constant. Therefore, the probability of failure is 1 − p.
  • The outcome (success or failure) of any trial is independent of the outcome of any other trial. To ensure independence of outcomes, sample elements are usually obtained in one of two ways: each element is drawn at random either from an infinite population without replacement or from a finite population with replacement.


The binomial distribution is used to estimate the number of successes in a sample of n observations. Take ordering as an example. Saxon Company customers can use an interactive electronic form to place an order and send it to the company. The information system then checks the orders for errors and for incomplete or inaccurate information. Any doubtful order is flagged and included in the daily exception report. The data collected by the company indicate that the probability of an erroneous order is 0.1. The company would like to know the probability of finding a certain number of erroneous orders in a given sample. For example, suppose customers have completed four electronic forms. What is the probability that all orders will be error-free? How is this probability calculated? By success we mean an error in filling out the form, and we treat all other outcomes as failure. Recall that we are interested in the number of erroneous orders in a given sample.

What outcomes can we observe? If the sample consists of four orders, then one, two, three, or all four may be wrong; alternatively, all of them may be filled out correctly. Can the random variable describing the number of incorrectly completed forms take any other value? No, because the number of incorrectly completed forms can neither exceed the sample size n nor be negative. Thus, a random variable obeying the binomial distribution law takes values from 0 to n.

Suppose that in a sample of four orders the following outcomes are observed: error, error, no error, error.

What is the probability of finding three erroneous orders in a sample of four orders, arranged in exactly this order? Since preliminary studies have shown that the probability of an error in completing the form is 0.10, the probabilities of the above outcomes are calculated as follows:

Since the outcomes are independent of each other, the probability of the indicated sequence of outcomes equals p · p · (1 − p) · p = 0.1 · 0.1 · 0.9 · 0.1 = 0.0009. To count the number of ways to choose X elements from n, use the combination formula (1):

(1) C(n, X) = n! / (X!(n − X)!)

where n! = n · (n − 1) · (n − 2) · … · 2 · 1 is the factorial of n, with 0! = 1 and 1! = 1 by definition.

This expression is often referred to as the binomial coefficient, read "n choose X". Thus, if n = 4 and X = 3, the number of sequences consisting of three erroneous orders drawn from a sample of size 4 is C(4, 3) = 4!/(3! · 1!) = 4.

Therefore, the probability of finding three erroneous orders is calculated as follows:

(number of possible sequences) · (probability of a particular sequence) = 4 · 0.0009 = 0.0036

Similarly, we can calculate the probability that one or two of the four orders are wrong, as well as the probability that all orders are wrong or all are correct. However, as the sample size n grows, it becomes harder to enumerate particular sequences of outcomes. In that case we apply the mathematical model describing the binomial distribution of the number X of successes in a sample of n elements.
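Calculations like these are easy to sketch in Python (the note itself uses Excel; this stdlib-only version is an illustration, not part of the original):

```python
from math import comb

def binom_pmf(x: int, n: int, p: float) -> float:
    """Probability of exactly x successes in n independent trials,
    each with success probability p (the binomial distribution)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# The order-entry example: n = 4 forms, error probability p = 0.1.
# Probability of exactly 3 erroneous orders:
print(binom_pmf(3, 4, 0.1))  # ~0.0036
```

Here `math.comb(n, x)` computes the binomial coefficient C(n, X), i.e. the number of possible sequences.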

Binomial distribution

(2) P(X) = [n! / (X!(n − X)!)] · p^X · (1 − p)^(n−X)

where P(X) is the probability of X successes for a given sample size n and probability of success p, X = 0, 1, …, n.

Note that formula (2) is a formalization of the intuitive reasoning above. A random variable X obeying the binomial distribution can take any integer value from 0 to n. The product p^X · (1 − p)^(n−X) is the probability of one particular sequence of X successes in a sample of size n, and the binomial coefficient n!/(X!(n − X)!) counts the number of possible combinations of X successes in n trials. Therefore, for a given number of trials n and probability of success p, the probability of a sequence of X successes equals

P(X) = (number of possible sequences) · (probability of a particular sequence) = C(n, X) · p^X · (1 − p)^(n−X)

Consider examples illustrating the application of formula (2).

1. Assume that the probability of filling out the form incorrectly is 0.1. What is the probability that three of the four completed forms will be wrong? Using formula (2), the probability of finding three erroneous orders in a sample of four orders is P(X = 3) = [4!/(3! · 1!)] · 0.1³ · 0.9¹ = 4 · 0.0009 = 0.0036.

2. Assume that the probability of incorrectly completing the form is 0.1. What is the probability that at least three of the four completed forms will be wrong? As shown in the previous example, the probability that exactly three of the four forms are wrong is 0.0036. To this we must add the probability that all four forms are wrong: P(X = 4) = 0.1⁴ = 0.0001.

Thus, the probability that at least three of the four completed forms are erroneous equals

P(X ≥ 3) = P(X = 3) + P(X = 4) = 0.0036 + 0.0001 = 0.0037

3. Assume that the probability of incorrectly completing the form is 0.1. What is the probability that less than three out of four completed forms will be wrong? The probability of this event

P(X < 3) = P(X = 0) + P(X = 1) + P(X = 2)

Using formula (2), we calculate each of these probabilities:

P(X = 0) = 0.9⁴ = 0.6561, P(X = 1) = 4 · 0.1 · 0.9³ = 0.2916, P(X = 2) = 6 · 0.1² · 0.9² = 0.0486.

Therefore, P(X < 3) = 0.6561 + 0.2916 + 0.0486 = 0.9963.

The probability P(X < 3) can also be computed differently, using the fact that the event X < 3 is the complement of the event X ≥ 3. Then P(X < 3) = 1 − P(X ≥ 3) = 1 − 0.0037 = 0.9963.
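Both routes to P(X < 3), the direct sum and the complement, can be checked with a short Python sketch (an illustration, not part of the original note):

```python
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) for a binomial random variable with parameters n and p."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 4, 0.1
p_lt3 = sum(binom_pmf(x, n, p) for x in range(3))   # direct sum: P(X < 3)
p_ge3 = binom_pmf(3, n, p) + binom_pmf(4, n, p)     # P(X >= 3)
print(round(p_lt3, 4))        # 0.9963
print(round(1 - p_ge3, 4))    # 0.9963, via the complement
```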

As the sample size n grows, calculations like those in example 3 become tedious. To avoid them, many binomial probabilities are tabulated in advance. Some of these probabilities are shown in Fig. 1. For example, to get the probability that X = 2 for n = 4 and p = 0.1, read the table entry at the intersection of the row X = 2 and the column p = 0.1.

Fig. 1. Binomial probability for n = 4, X = 2, and p = 0.1

The binomial distribution can be calculated with the Excel function =BINOM.DIST() (Fig. 2), which takes 4 parameters: the number of successes X, the number of trials (sample size) n, the probability of success p, and the cumulative flag, which takes the value TRUE (the function returns the cumulative probability of at most X successes) or FALSE (the probability of exactly X successes).

Fig. 2. Parameters of the =BINOM.DIST() function

For the three examples above, the calculations are shown in Fig. 3 (see also the Excel file). Each column contains one formula; the numbered cells give the answers to the correspondingly numbered examples.

Fig. 3. Calculating the binomial distribution in Excel for n = 4 and p = 0.1

Properties of the binomial distribution

The binomial distribution depends on the parameters n and p and can be either symmetric or skewed. If p = 0.5, the binomial distribution is symmetric regardless of the value of n. If p ≠ 0.5, the distribution is skewed; the closer p is to 0.5 and the larger the sample size n, the weaker the skew. Thus, the distribution of the number of incorrectly completed forms is skewed to the right, since p = 0.1 (Fig. 4).

Fig. 4. Histogram of the binomial distribution for n = 4 and p = 0.1

The mathematical expectation of the binomial distribution equals the product of the sample size n and the probability of success p:

(3) μ = E(X) = np

On average, over a sufficiently long series of trials, a sample of four orders contains μ = E(X) = 4 · 0.1 = 0.4 incorrectly completed forms.

Standard deviation of the binomial distribution:

σ = √(np(1 − p))

For example, the standard deviation of the number of incorrectly completed forms in the accounting information system is σ = √(4 · 0.1 · 0.9) = √0.36 = 0.6.
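The mean E(X) = np and standard deviation σ = √(np(1 − p)) of the running example can be verified in a couple of lines of Python (an illustration, not part of the original note):

```python
from math import sqrt

n, p = 4, 0.1                 # the note's running example
mean = n * p                  # E(X) = np
sd = sqrt(n * p * (1 - p))    # sigma = sqrt(np(1 - p))
print(mean, sd)               # ~0.4 and ~0.6
```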

Based on material from Levin et al., Statistics for Managers. Moscow: Williams, 2004, pp. 307–313.

Chapter 7

Specific laws of distribution of random variables

Types of laws of distribution of discrete random variables

Let a discrete random variable take the values x₁, x₂, …, xₙ, … . The probabilities of these values can be calculated by various formulas, for example, using the basic theorems of probability theory, the Bernoulli formula, or other formulas. For some of these formulas the distribution law has its own name.

The most common distribution laws of a discrete random variable are the binomial, geometric, hypergeometric, and Poisson distributions.

Binomial distribution law

Let n independent trials be performed, in each of which an event A may or may not occur. The probability of occurrence of A in each single trial is constant, does not depend on the trial number, and equals p = P(A). Hence the probability that A does not occur in each trial is also constant and equals q = 1 − p. Consider the random variable X equal to the number of occurrences of A in the n trials. Its possible values are obviously

x₁ = 0: the event A did not occur in the n trials;

x₂ = 1: the event A occurred once in the n trials;

x₃ = 2: the event A occurred twice in the n trials;

…………………………………………………………..

xₙ₊₁ = n: the event A occurred in all n trials.

The probabilities of these values can be calculated using the Bernoulli formula:

(7.1) Pₙ(k) = C(n, k) · p^k · q^(n−k),

where k = 0, 1, 2, …, n.

The binomial distribution law is the distribution of a random variable X equal to the number of successes in n Bernoulli trials with probability of success p.

So, a discrete random variable has a binomial distribution (or is distributed according to the binomial law) if its possible values are 0, 1, 2, …, n and the corresponding probabilities are calculated by formula (7.1).

The binomial distribution depends on two parameters, p and n.

The distribution series of a random variable distributed according to the binomial law has the form:

X: 0, 1, …, k, …, n
P: q^n, npq^(n−1), …, C(n, k)·p^k·q^(n−k), …, p^n

Example 7.1. Three independent shots are fired at a target. The probability of a hit on each shot is 0.4. The random variable X is the number of hits on the target. Construct its distribution series.

Solution. The possible values of the random variable X are x₁ = 0, x₂ = 1, x₃ = 2, x₄ = 3. We find the corresponding probabilities using the Bernoulli formula; it is easy to show that its application here is fully justified. Note that the probability of missing the target with one shot is 1 − 0.4 = 0.6. We get

P(X = 0) = 0.6³ = 0.216; P(X = 1) = 3 · 0.4 · 0.6² = 0.432; P(X = 2) = 3 · 0.4² · 0.6 = 0.288; P(X = 3) = 0.4³ = 0.064.

The distribution series has the following form:

X: 0, 1, 2, 3
P: 0.216, 0.432, 0.288, 0.064

It is easy to check that the sum of all the probabilities equals 1, and the random variable X is indeed distributed according to the binomial law. ■
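The distribution series of Example 7.1 is easy to reproduce in Python (a quick check, not part of the original text):

```python
from math import comb

n, p = 3, 0.4   # three shots, hit probability 0.4
series = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
for k, prob in series.items():
    print(k, round(prob, 3))   # 0.216, 0.432, 0.288, 0.064
print(sum(series.values()))    # ~1.0, as the check requires
```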

Let's find the mathematical expectation and variance of a random variable distributed according to the binomial law.

When solving Example 6.5, it was shown that the mathematical expectation of the number of occurrences of an event A in n independent trials, if the probability of A in each trial is constant and equal to p, equals n·p.

That example used a random variable distributed according to the binomial law. Therefore, the solution of Example 6.5 is, in fact, a proof of the following theorem.

Theorem 7.1. The mathematical expectation of a discrete random variable distributed according to the binomial law is equal to the product of the number of trials and the probability of "success", i.e. M(X) = n·p.

Theorem 7.2. The variance of a discrete random variable distributed according to the binomial law is equal to the product of the number of trials, the probability of "success", and the probability of "failure", i.e. D(X) = npq.

The skewness and kurtosis of a random variable distributed according to the binomial law are given by

A = (q − p) / √(npq),  E = (1 − 6pq) / (npq).

These formulas can be obtained using the concept of initial and central moments.

The binomial distribution law underlies many real situations. For large values of n, the binomial distribution can be approximated by other distributions, in particular the Poisson distribution.
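This approximation can be illustrated with a short Python sketch; the values n = 1000, p = 0.002 are assumed for illustration (large n, small p, so λ = np is moderate):

```python
from math import comb, exp, factorial

n, p = 1000, 0.002   # assumed illustrative values
lam = n * p          # Poisson parameter lambda = np

for k in range(5):
    b_prob = comb(n, k) * p**k * (1 - p)**(n - k)      # exact binomial
    pois_prob = lam**k * exp(-lam) / factorial(k)      # Poisson approximation
    print(k, round(b_prob, 5), round(pois_prob, 5))
```

The two columns of probabilities agree to about three decimal places.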

Poisson distribution

Let n Bernoulli trials be performed, with the number of trials n large enough. It was shown earlier that in this case (if, in addition, the probability p of the event A is very small), to find the probability that the event A occurs k times in the trials, one can use the Poisson formula. If the random variable X is the number of occurrences of A in n Bernoulli trials, then the probability that X takes the value k can be calculated by the formula

(7.2) P(X = k) = λ^k · e^(−λ) / k!,

where λ = np.

The Poisson distribution law is the distribution of a discrete random variable X whose possible values are the non-negative integers and whose probabilities p_k are found by formula (7.2).

The value λ = np is called the parameter of the Poisson distribution.

A random variable distributed according to Poisson's law can take infinitely many values. Since for this distribution the probability p of the event in each trial is small, the distribution is sometimes called the law of rare events.

The distribution series of a random variable distributed according to the Poisson law has the form

X: 0, 1, …, k, …
P: e^(−λ), λe^(−λ), …, λ^k·e^(−λ)/k!, …

It is easy to verify that the sum of the probabilities in the second row equals 1. To do this, recall that the function e^x can be expanded in a Maclaurin series that converges for any x. In this case we have

(7.3) Σₖ λ^k / k! = e^λ,

so Σₖ p_k = e^(−λ) · Σₖ λ^k / k! = e^(−λ) · e^λ = 1.
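The normalization of the Poisson probabilities can be checked numerically with a short Python sketch (truncating the infinite series at 50 terms, which is more than enough for a small λ; the value λ = 1.5 is an arbitrary illustration):

```python
from math import exp, factorial

lam = 1.5   # arbitrary illustrative parameter
total = sum(lam**k * exp(-lam) / factorial(k) for k in range(50))
print(total)   # ~1.0
```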

As noted, Poisson's law replaces the binomial law in certain limiting cases. An example is a random variable X whose values equal the number of failures of a technical device over a certain period of repeated use. The device is assumed to be highly reliable, i.e. the probability of failure in a single use is very small.

In addition to such limiting cases, in practice there are random variables distributed according to the Poisson law that are not related to the binomial distribution. For example, the Poisson distribution is often used for the number of events occurring in a period of time (the number of calls to a telephone exchange in an hour, the number of cars arriving at a car wash in a day, the number of machine stoppages per week, and so on). All these events must form a so-called stream of events, one of the basic concepts of queueing theory. The parameter λ characterizes the average intensity of the stream of events.

Example 7.2. A faculty has 500 students. What is the probability that September 1st is the birthday of exactly three students of this faculty?

Solution. Since the number of students n = 500 is large enough and p, the probability that any given student was born on September 1st, equals 1/365, i.e. is small enough, we may assume that the random variable X, the number of students born on September 1st, is distributed according to the Poisson law with parameter λ = np = 500/365 ≈ 1.36986. Then, by formula (7.2), we obtain P(X = 3) = λ³e^(−λ)/3! ≈ 0.1089.
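The computation of Example 7.2 can be reproduced in Python (a quick check, not part of the original text):

```python
from math import exp, factorial

n, p = 500, 1 / 365            # 500 students, birthday probability 1/365
lam = n * p                    # ~1.36986
p3 = lam**3 * exp(-lam) / factorial(3)
print(round(p3, 4))            # ~0.1089
```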

Theorem 7.3. Let the random variable X be distributed according to Poisson's law. Then its mathematical expectation and variance are equal to each other and to the value of the parameter λ, i.e. M(X) = D(X) = λ = np.

Proof. By the definition of mathematical expectation, using formula (7.3) and the distribution series of a random variable distributed according to the Poisson law, we obtain

M(X) = Σₖ k · λ^k e^(−λ)/k! = λe^(−λ) · Σₖ λ^(k−1)/(k − 1)! = λe^(−λ) · e^λ = λ.

Before finding the variance, we first find the mathematical expectation of the square of the considered random variable. We get

M(X²) = Σₖ k² · λ^k e^(−λ)/k! = λ² + λ.

Hence, by the definition of variance, we obtain

D(X) = M(X²) − (M(X))² = λ² + λ − λ² = λ.

The theorem has been proven.

Applying the concepts of initial and central moments, it can be shown that for a random variable distributed according to the Poisson law the skewness and kurtosis are given by A = 1/√λ and E = 1/λ.

It is easy to see that, since the parameter λ = np is positive, a random variable distributed according to Poisson's law always has positive skewness and positive kurtosis.

Probability theory is invisibly present in our lives. We pay no attention to it, yet every event in our life has one probability or another. Given the huge number of possible scenarios, we need to determine the most and least likely of them. It is most convenient to analyze such probabilistic data graphically, and distributions can help us with this. The binomial distribution is one of the simplest and most accurate.

Before moving directly to the mathematics and probability theory, let's figure out who first came up with this type of distribution and trace the history of the mathematical apparatus behind the concept.

Story

The concept of probability has been known since ancient times. However, ancient mathematicians did not attach much importance to it and were only able to lay the foundations for a theory that later became the theory of probability. They created some combinatorial methods that greatly helped those who later created and developed the theory itself.

In the second half of the seventeenth century, the formation of the basic concepts and methods of probability theory began. Definitions of random variables and methods for calculating the probabilities of simple and of some complex independent and dependent events were introduced. This interest in random variables and probabilities was driven by gambling: every player wanted to know his chances of winning.

The next step was the application of the methods of mathematical analysis to probability theory. Eminent mathematicians such as Laplace, Gauss, Poisson, and Bernoulli took up this task, and it was they who advanced this area of mathematics to a new level. It was James Bernoulli who discovered the binomial distribution law. And, as we will see later, several more discoveries were made on its basis, including the law of normal distribution.

Now, before we begin to describe the binomial distribution, let us refresh a few concepts of probability theory, probably long forgotten since school.

Fundamentals of Probability Theory

We will consider systems in which only two outcomes are possible: "success" and "failure". This is easy to understand with an example: we toss a coin, betting that tails will come up. The probabilities of the two possible events (tails is "success", heads is "failure") are each 50 percent, provided the coin is perfectly balanced and no other factors affect the experiment.

That was the simplest kind of event. But there are also complex systems in which sequential actions are performed and the probabilities of the outcomes of those actions differ. For example, consider the following system: a box whose contents we cannot see holds six absolutely identical balls, a pair each of blue, red, and white. We must draw several balls at random. Accordingly, having first pulled out a white ball, we noticeably reduce the probability that the next ball we draw will also be white. This happens because the number of objects in the system changes.

In the next section, we will look at more complex mathematical concepts that bring us close to what the words "normal distribution", "binomial distribution" and the like mean.

Elements of mathematical statistics

In statistics, one of the areas where probability theory is applied, there are many cases where the data for analysis is not given explicitly, i.e. not in numbers but as a division by characteristics, for example, by gender. To apply mathematical tools to such data and draw conclusions from the results, the initial data must be converted to a numeric format. As a rule, a positive outcome is assigned the value 1 and a negative one the value 0. We thus obtain statistical data that can be analyzed by mathematical methods.

The next step in understanding what the binomial distribution of a random variable is, is to define the mathematical expectation and the variance of a random variable. We'll talk about this in the next section.

Expected value

In fact, understanding mathematical expectation is not difficult. Consider a system containing many different events, each with its own probability. The mathematical expectation is the value equal to the sum of the products of the values of these events (in the numeric form we discussed in the last section) and the probabilities of their occurrence.

The mathematical expectation of the binomial distribution is computed by the same scheme: we take each value of the random variable, multiply it by its probability, and then sum the products over all values. It is very convenient to present these data graphically; the difference between the mathematical expectations of different variables is then easier to perceive.

In the next section, we will tell you a little about a different concept - the variance of a random variable. It is also closely related to such a concept as the binomial probability distribution, and is its characteristic.

Binomial distribution variance

This value is closely related to the previous one and also characterizes the distribution of the statistical data. It represents the mean squared deviation of values from their mathematical expectation. That is, the variance of a random variable is the sum of the squared differences between the values of the random variable and its mathematical expectation, each multiplied by the probability of that value.

In general, this is all we need to know about variance in order to understand what the binomial probability distribution is. Now let's move on to our main topic, namely what lies behind the seemingly complex phrase "binomial distribution law".

Binomial distribution

Let's first understand why this distribution is called binomial. The name comes from the word "binomial". You may have heard of Newton's binomial theorem, a formula that expands the sum of any two numbers a and b to any non-negative integer power n.

As you probably already guessed, Newton's binomial formula and the binomial distribution formula are almost the same formulas. With the only exception that the second has an applied value for specific quantities, and the first is only a general mathematical tool, the applications of which in practice can be different.

Distribution formulas

The binomial distribution function can be written as the sum of terms of the form:

(n! / ((n − k)! · k!)) · p^k · q^(n−k)

Here n is the number of independent random experiments, p is the probability of a successful outcome, q = 1 − p is the probability of an unsuccessful outcome, k is the number of successes (it can take values from 0 to n), and ! denotes the factorial, the function of a number whose value equals the product of all the natural numbers up to it (for example, for the number 4: 4! = 1·2·3·4 = 24).

In addition, the binomial distribution function can be written as an incomplete beta function. However, this is already a more complex definition, which is used only when solving complex statistical problems.

The binomial distribution, examples of which we examined above, is one of the simplest distributions in probability theory. There is also the normal distribution, which approximates the binomial for large samples; it is the most commonly used and the easiest to calculate with. There are also the Bernoulli distribution, the Poisson distribution, and conditional distributions. All of them characterize graphically the probability of a particular process under different conditions.
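The relationship between the binomial and the normal distribution can be illustrated with a short Python sketch (the values n = 100, p = 0.5 are assumed for illustration, and the continuity correction 55.5 is a standard device, not something from the article):

```python
from math import comb, erf, sqrt

n, p = 100, 0.5   # assumed illustrative values
mu, sigma = n * p, sqrt(n * p * (1 - p))

def normal_cdf(x: float) -> float:
    """CDF of the normal distribution with mean mu and std dev sigma."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

# Exact binomial P(X <= 55) vs. its normal approximation
# (55.5 applies the standard continuity correction).
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(56))
approx = normal_cdf(55.5)
print(round(exact, 4), round(approx, 4))
```

The two numbers agree to about two decimal places, which is what makes the normal approximation useful in practice.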

In the next section, we will consider aspects related to the application of this mathematical apparatus in real life. At first glance, of course, it seems that this is another mathematical thing, which, as usual, does not find application in real life, and is generally not needed by anyone except mathematicians themselves. However, this is not the case. After all, all types of distributions and their graphical representations were created solely for practical purposes, and not as a whim of scientists.

Application

Certainly the most important application of distributions is in statistics, which requires the complex analysis of large amounts of data. As practice shows, many data arrays have roughly similar distributions of values: the extreme regions of very low and very high values, as a rule, contain fewer elements than the middle.

Analysis of large data arrays is required not only in statistics. It is indispensable, for example, in physical chemistry. In this science, it is used to determine many quantities that are associated with random vibrations and movements of atoms and molecules.

In the next section, we will see how important it is to apply statistical concepts such as the binomial distribution of a random variable in the everyday life of you and me.

Why do I need it?

Many people ask themselves this question when it comes to mathematics. And, by the way, mathematics is not called the queen of sciences in vain. It is the basis of physics, chemistry, biology, and economics, and each of these sciences uses some kind of distribution: whether it is a discrete binomial distribution or a normal one does not matter. And if we take a closer look at the world around us, we will see that mathematics is applied everywhere: in everyday life, at work; even human relations can be presented in the form of statistical data and analyzed (this, by the way, is done by those who work in special organizations that collect information).

Now let's talk a little about what to do if you need to know much more on this topic than what we have outlined in this article.

The information that we have given in this article is far from complete. There are many nuances as to what form the distribution might take. The binomial distribution, as we have already found out, is one of the main types on which all mathematical statistics and probability theory are based.

If you become interested, or your work requires you to know much more about this topic, you will need to study the specialized literature. Start with a university course in mathematical analysis and work your way to the section on probability theory. Knowledge in the field of series will also be useful, because the binomial probability distribution is nothing more than a series of successive terms.

Conclusion

Before finishing the article, we would like to tell you one more interesting thing. It concerns directly the topic of our article and all mathematics in general.

Many people say that mathematics is a useless science and that nothing they learned in school has been of use to them. But knowledge is never superfluous; if something has not been useful to you in life, it may simply mean that you do not remember it. Knowledge you have can help you; knowledge you lack cannot.

So, we examined the concept of the binomial distribution and all the definitions associated with it and talked about how it is applied in our lives.

Hello! We already know what a probability distribution is. It can be discrete or continuous, and for the continuous case we learned that it is described by a probability density. Now let's explore a couple of common distributions. Suppose I have a coin, a fair coin, and I'm going to flip it 5 times. I will also define a random variable X, denoted by a capital letter X, equal to the number of heads in the 5 flips. Maybe I have 5 coins and I toss them all at once and count how many heads I get. Or I could have one coin and flip it 5 times, counting the heads. It doesn't really matter. So here is the definition of my random variable. As we know, a random variable is slightly different from an ordinary variable; it is more like a function: it assigns a value to an experiment. And this random variable is quite simple: we just count how many heads come up after 5 flips. Let's think about the probabilities of its different values. So, what is the probability that X (capital X) equals 0, i.e. that in 5 flips heads never comes up? Well, that is the same as the probability of getting tails every time (a small review of probability theory). The probability of each of those tails is 1/2, so it's 1/2 times 1/2 times 1/2 times 1/2 times 1/2 again, i.e. (1/2)⁵. 1⁵ = 1, divided by 2⁵, i.e. by 32. Quite logical. So... I'm repeating a bit of what we covered in probability theory. This matters for understanding where we are heading and how a discrete probability distribution is actually formed. So, what is the probability that we get heads exactly once? Well, heads might come up on the first flip, i.e. it could go: heads, tails, tails, tails, tails.
Or heads could come up on the second flip, i.e. the combination could be: tails, heads, tails, tails, tails, and so on. The single heads could fall on any of the 5 flips. What is the probability of each of these situations? The probability of heads is 1/2, and then the probability of each tails, 1/2, multiplies in: 1/2 times 1/2 times 1/2 times 1/2. So the probability of each of these situations is 1/32, the same as the probability of the situation where X = 0. In fact, the probability of any particular order of heads and tails is 1/32. So the probability of this one is 1/32, and of that one 1/32, and there are 5 such situations because the heads could fall on any of the 5 flips. Therefore the probability of exactly one heads equals 5 · 1/32, i.e. 5/32. Quite logical. Now the interesting part begins. What is the probability (I will write each of the examples in a different color) that my random variable equals 2? That is, I flip a coin 5 times; what is the probability that it lands heads exactly 2 times? This is more interesting, right? What combinations are possible? It could be heads, heads, tails, tails, tails. It could also be heads, tails, heads, tails, tails. And once you realize that these two heads can sit in different places in the combination, you can get a little confused: you can no longer count the ways as we did above. Although... you can, you just risk getting confused. You must understand one thing: each of these combinations has probability 1/32, since ½·½·½·½·½ = 1/32. So we should think about how many such combinations exist that satisfy our condition (2 heads). In effect, you need to imagine that there are 5 coin flips, and you need to choose the 2 of them on which heads falls.
Imagine our 5 flips standing in a circle, and imagine we have only two chairs. We say: "Okay, which of you will sit on these chairs reserved for heads? That is, which of you will be the heads?" And we don't care about the order in which they sit down. I give this picture hoping it makes things clearer, and you may want to watch the probability-theory lessons on this topic, where I talk about the binomial theorem; there I go into all of this in more detail. But if you reason this way, you will understand what a binomial coefficient is. Think of it like this: OK, I have 5 flips; on which flip does the first head land? There are 5 possibilities for which flip gives the first head. And how many possibilities are there for the second head? Well, the flip we already used has taken away one chance: one heads position in the sequence is already occupied. That leaves 4 flips, so the second head can land on one of the 4 remaining flips. And you saw that right here: I chose heads on the 1st flip and assumed that on 1 of the 4 remaining flips heads also comes up. So there are only 4 possibilities here. All I'm saying is that the first head has 5 different positions it can land on, and for the second only 4 positions remain. Think about it: counting this way, the order is taken into account. But right now the order in which the heads and tails appear doesn't matter to us. We don't say "this is head 1" or "this is head 2"; in both cases it's just a head. We could label this one head 1 and that one head 2, or it could be the other way around: that one could be the second head and this one the first. I say this because it is important to understand where to use arrangements (ordered selections) and where to use combinations. We are not interested in the sequence. So each outcome of our event has in fact been counted 2 ways. So let's divide by 2.
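The "choose which two of the five flips land heads" idea can be made concrete with `itertools.combinations`. This is an illustrative sketch added to the lesson:

```python
from itertools import combinations

# Unordered choices of 2 heads positions out of flips 1..5. Order is ignored,
# which is exactly why the ordered count 5*4 gets divided by 2.
pairs = list(combinations(range(1, 6), 2))
print(len(pairs))  # prints 10

# Each specific flip sequence has probability (1/2)**5 = 1/32,
# so P(X = 2) = 10/32 = 5/16.
print(len(pairs) / 32)  # prints 0.3125
```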
And as you'll see later, that 2 is really 2!, the number of orderings of our event. If there were 3 heads, it would be 3!, and I'll show you why. So that would be… 5·4 = 20, divided by 2 is 10. So there are 10 different sequences out of 32 in which you get exactly 2 heads. Thus 10·(1/32) equals 10/32, and what is that? 5/16. Let me write it via the binomial coefficient. This is the value right here at the top. If you think about it, it is the same as 5! divided by… What does this 5·4 mean? 5! is 5·4·3·2·1, so if I only need 5·4 here, I can divide 5! by 3!. That equals 5·4·3·2·1 divided by 3·2·1, and only 5·4 remains. So it is the same as this numerator. And then, because we are not interested in the sequence, we need a 2 here. Actually, 2!. Multiply by 1/32, and that is the probability of getting exactly 2 heads. Now, what is the probability that we get heads exactly 3 times? That is, the probability that X = 3. By the same logic, the first head may occur on 1 of the 5 flips, the second head on 1 of the 4 remaining flips, and the third head on 1 of the 3 remaining flips. How many different ways are there to arrange 3 flips? In general, how many ways are there to arrange 3 objects in their places? It's 3!. You can work that out, or you might want to revisit the lessons where I explained it in more detail. But if you take the letters A, B and C, for example, there are 6 ways in which you can arrange them. You can think of these as the heads positions. It could be ABC, ACB, CAB. It could be BAC, BCA, and… what's the last option I didn't name? CBA. There are 6 ways to arrange 3 different items. We divide by 6 because we don't want to count those 6 orderings separately; we treat them as equivalent. Here we are not interested in which flips, in which order, produce the heads. 5·4·3… this can be rewritten as 5!/2!, and then we divide it by 3! as well. So there it is. 3! equals 3·2·1.
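The 3! = 6 orderings of A, B and C listed above, and the resulting binomial coefficient, are easy to verify programmatically (a small sketch, not from the lesson itself):

```python
from itertools import permutations
from math import comb, factorial

# All orderings of three distinct items: 3! = 6.
orders = ["".join(p) for p in permutations("ABC")]
print(len(orders), orders)  # 6 orderings: ABC, ACB, BAC, BCA, CAB, CBA

# Dividing the 5*4*3 ordered placements of heads by 3! gives C(5, 3):
print(5 * 4 * 3 // factorial(3), comb(5, 3))  # prints 10 10
```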
The threes cancel: this becomes 2, this becomes 1. Once again, 5·2, i.e. 10. Each sequence has probability 1/32, so this is again 10/32 = 5/16. And that's interesting: the probability of getting 3 heads is the same as the probability of getting 2 heads. There are several reasons why that happens, but if you think about it, getting 3 heads is the same as getting 2 tails, so the probability of 3 heads must equal the probability of 2 heads. And it's good that the values work out this way. Good. What is the probability that X = 4? We can use the same formula we used before. It would be 5·4·3·2. So, here we write 5·4·3·2… How many different ways are there to arrange 4 objects? It's 4!. And 4! is, in fact, this part right here: 4·3·2·1. So this cancels out, leaving 5. Then each sequence has probability 1/32, so this equals 5/32. Again, note that the probability of getting heads 4 times is equal to the probability of heads coming up once. And this makes sense, because 4 heads is the same as 1 tails. You may ask: on which flip does that single tails land? Well, there are 5 different sequences for that, and each has probability 1/32. And finally, what is the probability that X = 5? That is, heads 5 times in a row: heads, heads, heads, heads, heads. Each head has probability 1/2; multiply them and you get 1/32. You can look at it the other way: if there are 32 equally likely ways the flips can come out, this is just one of them. Here there were 5 such ways out of 32; here, 10 out of 32. In any case, we have carried out the calculations and are now ready to draw the probability distribution. But my time is up. Let me continue in the next lesson. And if you're in the mood, maybe try drawing it yourself before watching the next lesson?
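Putting it all together, the whole distribution worked out in this lesson can be tabulated with `math.comb` (a sketch of the chart you might try to draw):

```python
from math import comb

n = 5  # flips of a fair coin
# P(X = k) = C(5, k) / 2**5 for k = 0..5
pmf = {k: comb(n, k) / 2**n for k in range(n + 1)}
for k, p in pmf.items():
    print(k, p)  # 1/32, 5/32, 10/32, 10/32, 5/32, 1/32

# The symmetry noted above: P(X = 3) == P(X = 2) and P(X = 4) == P(X = 1).
print(pmf[3] == pmf[2], pmf[4] == pmf[1])  # prints True True
```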
See you soon!

Binomial distribution

the probability distribution of the number of occurrences of some event in repeated independent trials. If for each trial the probability of the event occurring is p, with 0 ≤ p ≤ 1, then the number μ of occurrences of this event in n independent trials is a random variable taking the values m = 0, 1, 2, …, n with probabilities

P(μ = m) = C(n, m) p^m q^(n−m),

where q = 1 − p and C(n, m) = n! / (m!(n − m)!) are the binomial coefficients (hence the name of the distribution). The above formula is sometimes called Bernoulli's formula. The mathematical expectation and the variance of a quantity μ with a binomial distribution are M(μ) = np and D(μ) = npq, respectively. For large n, by virtue of Laplace's theorem (see Laplace's theorem), the binomial distribution is close to a normal distribution (see Normal distribution), which is what is used in practice. For small n one must use tables of the binomial distribution.
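Bernoulli's formula and the moments M(μ) = np and D(μ) = npq can be checked numerically. A sketch follows; the values n = 10 and p = 0.3 are arbitrary examples, not from the article:

```python
from math import comb

def binomial_pmf(m, n, p):
    """Bernoulli's formula: P(mu = m) = C(n, m) * p**m * (1 - p)**(n - m)."""
    q = 1 - p
    return comb(n, m) * p**m * q**(n - m)

n, p = 10, 0.3  # example parameters
probs = [binomial_pmf(m, n, p) for m in range(n + 1)]

# Mean and variance computed from the pmf match n*p and n*p*q.
mean = sum(m * pm for m, pm in enumerate(probs))
var = sum((m - mean) ** 2 * pm for m, pm in enumerate(probs))
print(round(mean, 6), round(var, 6))  # prints 3.0 2.1
```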

References: Bolshev L. N., Smirnov N. V., Tables of Mathematical Statistics, Moscow, 1965.


Great Soviet Encyclopedia. Moscow: Soviet Encyclopedia, 1969-1978.

