    Logarithmic and lognormal distributions. Distribution function of a random variable. Relationship with other distributions


    Material from Wikipedia - the free encyclopedia

    Notation: \mathrm{Log}(p)
    Parameters: 0 < p < 1
    Support: k \in \{1, 2, 3, \dots\}
    Probability function: \frac{-1}{\ln(1-p)} \; \frac{p^k}{k}
    Distribution function: 1 + \frac{\mathrm{B}_p(k+1, 0)}{\ln(1-p)}
    Expectation: \frac{-1}{\ln(1-p)} \; \frac{p}{1-p}
    Median:
    Mode: 1
    Variance: -p \; \frac{p + \ln(1-p)}{(1-p)^2 \, \ln^2(1-p)}
    Skewness:
    Kurtosis:
    Differential entropy:
    Moment-generating function: \frac{\ln(1 - p\,e^t)}{\ln(1-p)}
    Characteristic function: \frac{\ln(1 - p\,e^{it})}{\ln(1-p)}

    The logarithmic distribution, in probability theory, is a class of discrete distributions. It is used in a variety of applications, including mathematical genetics and physics.

    Definition

    Let the distribution of a random variable Y be given by the probability function:

    p_Y(k) \equiv \mathbb{P}(Y=k) = -\frac{1}{\ln(1-p)}\,\frac{p^k}{k}, \quad k=1,2,3,\ldots,

    where 0 < p < 1.

    Then Y is said to have a logarithmic distribution with parameter p; one writes Y \sim \mathrm{Log}(p).

    The distribution function of the random variable Y is piecewise constant with jumps at the natural numbers:

    F_Y(y) = \begin{cases} 0, & y < 1, \\ 1 + \dfrac{\mathrm{B}_p(k+1, 0)}{\ln(1-p)}, & y \in [k, k+1), \; k = 1, 2, \ldots, \end{cases}

    where \mathrm{B}_p is the incomplete beta function. In particular,

    \sum\limits_{k=1}^{\infty} p_Y(k) = 1.

    Moments

    The moment-generating function of a random variable Y \sim \mathrm{Log}(p) is given by the formula

    M_Y(t) = \frac{\ln(1 - p\,e^t)}{\ln(1-p)}, \quad t < -\ln p,

    whence

    \mathbb{E}[Y] = -\frac{1}{\ln(1-p)}\,\frac{p}{1-p}, \qquad \mathrm{D}[Y] = -p\,\frac{p + \ln(1-p)}{(1-p)^2\,\ln^2(1-p)}.
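    As a quick numerical check (a Python sketch, not part of the original article), one can verify that the probability function sums to one and that the numerically computed mean matches the closed-form expectation:

```python
import math

def log_pmf(k, p):
    """P(Y = k) = -1/ln(1-p) * p**k / k for the logarithmic distribution Log(p)."""
    return -1.0 / math.log(1.0 - p) * p**k / k

p = 0.5
# Truncate the series far enough that the remaining tail is negligible.
pmf_sum = sum(log_pmf(k, p) for k in range(1, 200))
mean_num = sum(k * log_pmf(k, p) for k in range(1, 200))
mean_formula = -1.0 / math.log(1.0 - p) * p / (1.0 - p)

print(round(pmf_sum, 9))                        # ≈ 1.0
print(round(mean_num, 9), round(mean_formula, 9))
```

    For p = 0.5 both mean values come out as 1/ln 2 ≈ 1.4427, in agreement with the formula above.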

    Relationship with other distributions

    A Poisson-distributed sum of independent logarithmic random variables has a negative binomial distribution. Let \{X_i\}_{i=1}^{\infty} be a sequence of independent, identically distributed random variables such that X_i \sim \mathrm{Log}(p), \; i = 1, 2, \ldots, and let N \sim \mathrm{P}(\lambda) be an independent Poisson random variable. Then

    Y = \sum\limits_{i=1}^{N} X_i \sim \mathrm{NB}.
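    This identity can be checked through moment-generating functions: for a compound Poisson sum, M_Y(t) = \exp(\lambda(M_X(t) - 1)), and this coincides with the MGF of a negative binomial law with shape r = -\lambda/\ln(1-p) and the same p. The following Python sketch verifies the coincidence numerically (the failure-counting NB parametrization is an assumption, since the article does not spell out the parameters of NB):

```python
import math

def mgf_log(t, p):
    """MGF of Log(p), valid for t < -ln p."""
    return math.log(1.0 - p * math.exp(t)) / math.log(1.0 - p)

def mgf_compound_poisson(t, lam, p):
    """MGF of Y = X_1 + ... + X_N with N ~ Poisson(lam), X_i ~ Log(p)."""
    return math.exp(lam * (mgf_log(t, p) - 1.0))

def mgf_negbin(t, r, p):
    """MGF of a negative binomial NB(r, p) counting variable."""
    return ((1.0 - p) / (1.0 - p * math.exp(t))) ** r

lam, p = 2.0, 0.3
r = -lam / math.log(1.0 - p)   # matching NB shape parameter
for t in (-1.0, -0.5, 0.0, 0.5):
    assert abs(mgf_compound_poisson(t, lam, p) - mgf_negbin(t, r, p)) < 1e-12
```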

    Applications

    The logarithmic distribution satisfactorily describes the size distribution of asteroids in the Solar System.



    If, however, there are negative or zero terms among them, then some constant can be added to each member of the series. By one of the properties of the mathematical expectation, this operation does not change the basic statistical characteristics of the series, and it allows one to pass to the lognormal distribution in this case.

    As a result of applying the logarithm operation (36) to the series under study, the spread of the data is significantly reduced, as can be seen from Fig. 9.16.

    The distribution function of the new series will be equal to

    (37)

    But then

    (38)
    (39)

    And finally

    (40)

    Formulas (37) – (40) give the connection between the lognormal and original distributions.
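    The shift-and-logarithm device described above can be sketched as follows (the data here are hypothetical, purely for illustration):

```python
import math
import statistics

# Hypothetical series containing zero and negative terms.
series = [-3.0, 0.0, 2.0, 5.0, 40.0, 120.0, 900.0]

# Add a constant so that every term becomes positive, then take logarithms.
shift = 1.0 - min(series)            # makes the smallest term equal to 1
logged = [math.log(x + shift) for x in series]

# The spread (standard deviation) of the logged series is far smaller.
print(statistics.stdev(series))
print(statistics.stdev(logged))
```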


    Fig. 9.16.

    Poisson distribution law (the law of rare phenomena)

    With a sufficiently large number of trials, most distributions tend to the normal law. If, however, the data contain rare, exceptional results, then while the bulk of the data tends to the normal law, the distribution of these rare phenomena tends to another law: the Poisson distribution. This law is characterized by the number of trials growing large while the probability of the event tends to zero. In this limiting case the binomial distribution goes over to the Poisson distribution

    (41)

    where the parameter has the same meaning as in the normal distribution.

    The Poisson distribution law given by formula (41) describes the probability of events occurring in approximately equal intervals of time, provided that all events occur independently of one another and with some constant, possibly very small, intensity. In this case the number of trials is large, and the probability of the expected event occurring in each trial is very small. The parameter then characterizes the intensity of occurrence of the expected event in the sequence of trials.
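    The rare-event limit can be illustrated numerically: for large n and small p with λ = np fixed, the binomial probabilities are close to the Poisson probabilities (a Python sketch; the values of n and p are illustrative assumptions):

```python
import math

def binom_pmf(k, n, p):
    """Binomial probability of exactly k successes in n trials."""
    return math.comb(n, k) * p**k * (1.0 - p)**(n - k)

def poisson_pmf(k, lam):
    """Poisson probability of exactly k events with intensity lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Rare-event regime: n large, p small, lam = n * p fixed.
n, p = 10_000, 0.0003
lam = n * p
for k in range(6):
    print(k, round(binom_pmf(k, n, p), 6), round(poisson_pmf(k, lam), 6))
```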

    In this case, we will try to calculate the expectation.

    A characteristic feature of this type of distribution will be the following mathematical relationships:

    Example 5. At the test site 150 samples were collected. Some of them contained a rare element:

    Determine the law of distribution of the required element.

    Solution. To answer the question posed in the problem, one should check equality (45), which is a characteristic feature of the Poisson distribution. For simplicity of calculation we take not hundredths but numbers increased 100 times, i.e.

    Because of this, we conclude that the distribution of the required element obeys the Poisson distribution law. Now, using relations (42), we calculate the theoretical frequencies and compare them with the observed ones.


    The lognormal distribution function has found wide application in analyzing the reliability of objects in technology, biology, economics, etc. For example, the function is successfully used to describe the time to failure of bearings, electronic devices and other products.

    Non-negative random values of some parameter are lognormally distributed if its logarithm is normally distributed. The distribution density for different values of σ is shown in Fig. 4.3.

    Fig. 4.3.

    The distribution density is described by the dependence

    where M_x and σ are parameters estimated from the results of n tests to failure:

    (4.4)

    For a lognormal distribution law, the reliability function

    (4.5)

    The probability of failure-free operation can be determined from tables for normal distribution (see Table A6.1 of Appendix 6) depending on the quantile value

    Mathematical expectation of time to failure

    The standard deviation and coefficient of variation, respectively, will be equal

    If ν_x does not exceed 0.3, it is assumed that ν_x = σ, and the error is no more than 1%.

    The dependencies for the lognormal distribution law are often written in decimal logarithms. In accordance with this law, the distribution density

    Estimates of the parameters lg x_0 and σ are determined from the test results:

    The expectation M_x, standard deviation σ_x, and coefficient of variation ν_x of the time to failure are, respectively, equal to

    Example 4.6

    Determine the probability of failure-free operation of the gearbox during t = 10³ hours, if the service life is lognormally distributed with parameters lg t_0 = 3.6; σ = 0.3.

    Solution

    Let's find the quantile value and determine the probability of failure-free operation:

    Answer: R(t) = 0.0228.
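    The arithmetic of the example can be reproduced with the standard normal CDF expressed through the error function (a Python sketch; it follows the worked answer in taking the tabulated value at the quantile as R(t)):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

lg_t0, sigma = 3.6, 0.3
t = 1e3                               # 10**3 hours

u = (math.log10(t) - lg_t0) / sigma   # quantile: (3 - 3.6)/0.3 = -2
print(round(u, 4))                    # -2.0
print(round(norm_cdf(u), 4))          # 0.0228, matching the answer
```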

    Weibull distribution

    The Weibull distribution function is a two-parameter distribution. The law it describes is universal, since with appropriate values of the parameters it turns into the normal, exponential, and other types of distributions. The author of this distribution law, W. Weibull, used it to describe and analyze experimentally observed variations in the fatigue strength of steel and its elastic limits. Weibull's law satisfactorily describes the time to failure of bearings and elements of electronic equipment; it is used to assess the reliability of parts and assemblies of machines, including cars, as well as to assess the reliability of machines during their running-in period. The distribution density is described by the dependence

    where α is the distribution curve shape parameter; λ – distribution curve scale parameter.

    The graph of the distribution density function is shown in Fig. 4.4.

    Fig. 4.4.

    Weibull distribution function

    Reliability function for this distribution law

    Expectation of a random variable X equals

    where Γ(x) is the gamma function.

    For continuous values of x

    For integer values of x, the gamma function is calculated using the formula

    The following formulas are also valid:

    The variance of the random variable is equal to

    The widespread use of the Weibull distribution law in the analysis and calculations of product reliability is explained by the fact that this law, generalizing the exponential distribution, contains an additional parameter α.

    By properly selecting the parameters α and λ, it is possible to obtain better agreement between the calculated values and experimental data than with the exponential law, which is one-parameter (parameter λ).

    Thus, for products that have hidden defects, but which are not used for a long time (and therefore age more slowly), the risk of failure is greatest in the initial period, and then quickly decreases. The reliability function for such a product is well described by the Weibull law with parameter α< 1.

    On the contrary, if the product is well controlled during manufacturing and has almost no hidden defects, but undergoes rapid aging, then the reliability function is described by the Weibull law with parameter α > 1. At α = 3.3, the Weibull distribution is close to normal.
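    A minimal sketch of the Weibull reliability function, assuming the common parametrization R(t) = exp(-λ t^α) with shape α and scale λ (the source's formula images are lost, so this parametrization is an assumption); for α = 1 the law collapses to the exponential law:

```python
import math

def weibull_reliability(t, alpha, lam):
    """R(t) = exp(-lam * t**alpha); shape alpha, scale lam (assumed parametrization)."""
    return math.exp(-lam * t**alpha)

def weibull_mean(alpha, lam):
    """E[X] = lam**(-1/alpha) * Gamma(1 + 1/alpha) under the same parametrization."""
    return lam ** (-1.0 / alpha) * math.gamma(1.0 + 1.0 / alpha)

# With alpha = 1 the Weibull law reduces to the exponential law with rate lam.
lam = 0.5
for t in (0.5, 1.0, 2.0):
    assert abs(weibull_reliability(t, 1.0, lam) - math.exp(-lam * t)) < 1e-12
assert abs(weibull_mean(1.0, lam) - 1.0 / lam) < 1e-12
```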

    A random variable is called lognormally distributed if its logarithm obeys the normal distribution law.

    This means, in particular, that the values of a lognormal random variable are formed under the influence of a very large number of mutually independent factors, with the influence of each individual factor "uniformly insignificant" and equally probable in sign. Moreover, in contrast to the mechanism that forms the normal law, the random factors act sequentially: the random increment caused by the action of each subsequent factor is proportional to the value of the studied quantity already reached by that moment (one then speaks of the multiplicative nature of the factors' influence). Mathematically, this can be formalized as follows. If the non-random component of the studied characteristic (that is, the "true" value in an idealized scheme, with the influence of all random factors eliminated) and the numerical effects of the random factors mentioned above are given, then the values of the studied characteristic, successively transformed by the action of these factors, will be:

    From here it is easy to obtain

    The right-hand side of (6.11) is the result of the additive action of many random factors, which, under the assumptions made above, should lead, as we know (see Section 6.1.5, as well as § 7.3, devoted to the central limit theorem), to a normal distribution of this sum.

    At the same time, taking into account the sufficiently large number of random terms and the relative insignificance of the influence of each of them, one can pass from the sum on the left-hand side of (6.11) to the integral

    This ultimately means that the logarithm of the quantity of interest (reduced by a constant value) obeys the normal law with zero mean, i.e.

    whence by differentiating with respect to x the left and right sides of this relation we obtain

    (the validity of the identity used in the calculation follows from the strict monotonicity of the transformation).

    The described scheme for generating the values of a lognormally distributed random variable turns out to be characteristic of many specific physical and socio-economic situations (the size and weight of particles formed during crushing; employee wages; family income; the sizes of cosmic formations; the durability of a product operating in a wear-and-aging mode; and others).

    Example 6.1. The average per capita monthly income (in dollars) of families from a certain population is considered as a random variable. N = 750 families were surveyed.

    Table 6.1

    Table 6.2

    Tables 6.1 and 6.2 show the results of grouping the sample data and their logarithms, respectively (the width of the grouping interval is 25 dollars). Fig. 6.1, a, b show the histograms and the densities of the lognormal and normal distribution laws, respectively.

    Fig. 6.1. Histogram and theoretical (model) density characterizing the distribution of families by average per capita monthly income (a) and by the logarithm of average per capita monthly income (b)

    Below are the results of calculating the main numerical characteristics of the lognormal distribution (in terms of the parameters a and σ of the law):

    From these expressions it is clear that the skewness and kurtosis of the lognormal distribution are always positive (and the closer σ is to zero, the closer they are to zero), and that the mode, median, and mean are arranged exactly in the order seen in Fig. 5.8; they tend to merge (and the density curve tends to symmetry) as σ tends to zero. Thus, although the values of a lognormal random variable are formed as "random distortions" of some "true value" a, the latter ultimately acts not as the mean but as the median.
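    The ordering of the mode, median, and mean described above can be checked from the standard closed forms mode = e^{a-σ²}, median = e^{a}, mean = e^{a+σ²/2} (a Python sketch, not from the source):

```python
import math

def lognormal_stats(a, sigma):
    """Mode, median, and mean of a lognormal law with log-mean a and log-std sigma."""
    mode = math.exp(a - sigma**2)
    median = math.exp(a)
    mean = math.exp(a + sigma**2 / 2.0)
    return mode, median, mean

a = 1.0
for sigma in (1.0, 0.5, 0.1, 0.01):
    mode, median, mean = lognormal_stats(a, sigma)
    assert mode < median < mean          # the ordering stated in the text
    # As sigma -> 0 all three characteristics merge toward exp(a).
    print(sigma, round(mode, 4), round(median, 4), round(mean, 4))
```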