Relationships among probability distributions
                                     


In probability theory and statistics, there are several relationships among probability distributions. These relations can be categorized in the following groups:

  • Transforms (function of a random variable);
  • Duality;
  • Approximation (limit) relationships;
  • Compound relationships (useful for Bayesian inference);
  • Combinations (function of several variables);
  • Conjugate priors;
  • One distribution is a special case of another with a broader parameter space.

1. Special case of distribution parametrization

  • A beta-binomial(n, 1, 1) random variable is a discrete uniform random variable over the values 0, 1, …, n.
  • A beta random variable with parameters α = β = 1 is a uniform random variable.
  • A chi-squared distribution with 2 degrees of freedom is an exponential distribution with mean 2, and vice versa.
  • A negative binomial distribution with n = 1 is a geometric distribution.
  • A Weibull(1, β) random variable is an exponential random variable with mean β.
  • A binomial(n, p) random variable with n = 1 is a Bernoulli(p) random variable.
  • A random variable with a t distribution with one degree of freedom is a Cauchy(0, 1) random variable.
  • A gamma(α, β) random variable with shape α = ν/2 and rate β = 1/2 is a chi-squared random variable with ν degrees of freedom.
  • A gamma distribution with shape parameter α = 1 and scale parameter θ is an exponential distribution with expected value θ (these last two identities are checked numerically in the sketch below).
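
A minimal numerical check of the last two identities, assuming NumPy and SciPy are available (scipy.stats uses a scale rather than a rate parameter, so rate 1/2 corresponds to scale 2; the parameter values are arbitrary illustrations):

```python
# Sketch: verify two special-case identities by comparing CDFs on a grid.
import numpy as np
from scipy import stats

x = np.linspace(0.01, 20, 200)

# gamma with shape nu/2 and rate 1/2 (i.e. scale 2) coincides with chi-squared(nu)
nu = 5
print(np.allclose(stats.gamma.cdf(x, a=nu / 2, scale=2), stats.chi2.cdf(x, df=nu)))

# gamma with shape 1 and scale theta coincides with the exponential of mean theta
theta = 3.0
print(np.allclose(stats.gamma.cdf(x, a=1, scale=theta), stats.expon.cdf(x, scale=theta)))
```

Both comparisons print True, since these equalities hold exactly rather than approximately.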
                                     

2.1. Transform of a variable: Multiple of a random variable

Multiplying the variable by any positive real constant yields a scaling of the original distribution. Some distribution families are self-replicating under scaling, meaning that multiplying by a constant yields a distribution from the same family, albeit with a different parameter: normal distribution, gamma distribution, Cauchy distribution, exponential distribution, Erlang distribution, Weibull distribution, logistic distribution, error distribution, power-law distribution, Rayleigh distribution.

Examples:

  • If X is a gamma random variable with shape and rate parameters (r, λ), then Y = aX is a gamma random variable with parameters (r, λ/a).
  • If X is a gamma random variable with shape and scale parameters (α, β), then Y = aX is a gamma random variable with parameters (α, aβ).
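
A simulation sketch of the first scaling rule, assuming NumPy and SciPy (the shape, rate and scaling constant are arbitrary; scipy.stats parameterizes the gamma by scale = 1/rate):

```python
# Sketch: scaling a gamma random variable stays within the gamma family.
import numpy as np
from scipy import stats

r, lam, a = 3.0, 2.0, 5.0                 # shape r, rate lam, scaling constant a
rng = np.random.default_rng(0)

x = rng.gamma(shape=r, scale=1 / lam, size=100_000)   # X ~ gamma(r, rate lam)
y = a * x                                             # Y = aX, claimed gamma(r, rate lam / a)

ks = stats.kstest(y, stats.gamma(a=r, scale=a / lam).cdf)
print(ks.statistic)   # close to 0 when the sample matches the claimed distribution
```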
                                     

2.2. Transform of a variable: Linear function of a random variable

The affine transform aX + b yields a relocation and scaling of the original distribution. The following are self-replicating: Normal distribution, Cauchy distribution, Logistic distribution, Error distribution, Power distribution, Rayleigh distribution.

Example:

  • If Z is a normal random variable with parameters μ = m, σ² = s², then X = aZ + b is a normal random variable with parameters μ = am + b, σ² = a²s².
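
A simulation sketch of this affine rule, assuming NumPy and SciPy (m, s, a, b are arbitrary illustrative values):

```python
# Sketch: an affine transform of a normal variable is again normal.
import numpy as np
from scipy import stats

m, s, a, b = 1.0, 2.0, -3.0, 0.5
rng = np.random.default_rng(1)

z = rng.normal(loc=m, scale=s, size=100_000)
x = a * z + b   # claimed: normal with mean a*m + b and standard deviation |a|*s

print(stats.kstest(x, stats.norm(loc=a * m + b, scale=abs(a) * s).cdf).statistic)
```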
                                     

2.3. Transform of a variable: Reciprocal of a random variable

The reciprocal 1/X of a random variable X is a member of the same family of distributions as X in the following cases: Cauchy distribution, F distribution, log-logistic distribution.

Examples:

  • If X is an F(ν₁, ν₂) random variable, then 1/X is an F(ν₂, ν₁) random variable.
  • If X is a Cauchy(μ, σ) random variable, then 1/X is a Cauchy(μ/C, σ/C) random variable, where C = μ² + σ².
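
A simulation sketch of the F-distribution example, assuming NumPy and SciPy (the degrees of freedom are arbitrary):

```python
# Sketch: the reciprocal of an F variable is F with the degrees of freedom swapped.
import numpy as np
from scipy import stats

d1, d2 = 5, 9
rng = np.random.default_rng(2)

x = rng.f(dfnum=d1, dfden=d2, size=100_000)
print(stats.kstest(1.0 / x, stats.f(dfn=d2, dfd=d1).cdf).statistic)   # close to 0
```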
                                     

2.4. Transform of a variable: Other cases

Other specific transformations of a random variable also yield a random variable from a named family of distributions, sometimes the same one and sometimes a different one.

Examples:

  • If X is a binomial(n, p) random variable, then n − X is a binomial(n, 1 − p) random variable.
  • If X is a normal(μ, σ²) random variable, then e^X is a lognormal(μ, σ²) random variable; conversely, if X is a lognormal(μ, σ²) random variable, then log X is a normal(μ, σ²) random variable.
  • If X is a continuous random variable with cumulative distribution function F_X, then F_X(X) is a standard uniform(0, 1) random variable (the probability integral transform).
  • If X is a beta(α, β) random variable, then 1 − X is a beta(β, α) random variable.
  • If X is a double exponential random variable with mean 0 and scale λ, then |X| is an exponential random variable with mean λ.
  • If X is a Student's t random variable with ν degrees of freedom, then X² is an F(1, ν) random variable.
  • The square of a standard normal random variable has a chi-squared distribution with one degree of freedom.
  • A geometric random variable is the floor of an exponential random variable.
  • If X is an exponential random variable with mean β, then X^(1/γ) is a Weibull(γ, β) random variable.
  • A reciprocal random variable is the exponential of a uniform random variable.
  • A rectangular (discrete uniform) random variable is the floor of a uniform random variable.
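
A simulation sketch of two of the transformations above (the lognormal identity and the probability integral transform), assuming NumPy and SciPy; the parameters and the gamma test distribution are arbitrary choices:

```python
# Sketch: exp of a normal is lognormal, and F_X(X) is standard uniform.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# exp of a normal(mu, sigma^2) variable is lognormal(mu, sigma^2);
# scipy's lognorm uses s = sigma and scale = exp(mu)
mu, sigma = 0.4, 1.3
x = rng.normal(mu, sigma, size=100_000)
print(stats.kstest(np.exp(x), stats.lognorm(s=sigma, scale=np.exp(mu)).cdf).statistic)

# probability integral transform: F_X(X) is uniform on (0, 1)
y = rng.gamma(shape=2.5, scale=1.0, size=100_000)
print(stats.kstest(stats.gamma(a=2.5).cdf(y), "uniform").statistic)
```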


                                     

3.1. Functions of several variables: Sum of variables

The distribution of the sum of independent random variables is the convolution of their distributions. Suppose Z = X₁ + X₂ + ⋯ + Xₙ is the sum of n independent random variables with probability mass functions f_{X₁}, …, f_{Xₙ}. Then Z has probability mass function

f_Z = f_{X₁} ∗ f_{X₂} ∗ ⋯ ∗ f_{Xₙ},

the convolution of the individual mass functions. If Z has a distribution from the same family of distributions as the original variables, that family of distributions is said to be closed under convolution.

Examples of such univariate distributions are: normal distributions, Poisson distributions, binomial distributions with common success probability, negative binomial distributions with common success probability, gamma distributions with common rate parameter, chi-squared distributions, Cauchy distributions, hyperexponential distributions.

Examples:

  • If X₁ is a Cauchy(μ₁, σ₁) random variable and X₂ is a Cauchy(μ₂, σ₂) random variable, then X₁ + X₂ is a Cauchy(μ₁ + μ₂, σ₁ + σ₂) random variable.
  • If X₁ and X₂ are Poisson random variables with means μ₁ and μ₂ respectively, then X₁ + X₂ is a Poisson random variable with mean μ₁ + μ₂.
  • The sum of N chi-squared(1) random variables has a chi-squared distribution with N degrees of freedom.
  • If X₁ and X₂ are chi-squared random variables with ν₁ and ν₂ degrees of freedom respectively, then X₁ + X₂ is a chi-squared random variable with ν₁ + ν₂ degrees of freedom.
  • The sum of gamma(nᵢ, β) random variables has a gamma(Σnᵢ, β) distribution.
  • If X₁ is a normal(μ₁, σ₁²) random variable and X₂ is a normal(μ₂, σ₂²) random variable, then X₁ + X₂ is a normal(μ₁ + μ₂, σ₁² + σ₂²) random variable.

Other distributions are not closed under convolution, but their sum has a known distribution:

  • The sum of n Bernoulli(p) random variables is a binomial(n, p) random variable.
  • The sum of n geometric random variables with probability of success p is a negative binomial random variable with parameters n and p.
  • If the exponential random variables have a common rate parameter, their sum has an Erlang distribution, a special case of the gamma distribution.
  • The sum of n exponential(β) random variables is a gamma(n, β) random variable.
  • The sum of the squares of N standard normal random variables has a chi-squared distribution with N degrees of freedom.
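
A simulation sketch of two of these sum rules, assuming NumPy and SciPy (sample sizes and parameters are arbitrary):

```python
# Sketch: sums of independent random variables, checked by simulation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 100_000

# Poisson(mu1) + Poisson(mu2) behaves like Poisson(mu1 + mu2)
mu1, mu2 = 1.5, 2.5
s = rng.poisson(mu1, n) + rng.poisson(mu2, n)
print(s.mean(), s.var())   # both near mu1 + mu2 = 4, as a Poisson requires

# the sum of k exponential(scale beta) variables behaves like gamma(k, beta)
k, beta = 4, 2.0
t = rng.exponential(beta, size=(n, k)).sum(axis=1)
print(stats.kstest(t, stats.gamma(a=k, scale=beta).cdf).statistic)   # close to 0
```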


                                     

3.2. Functions of several variables: Product of variables

The product of independent random variables X and Y may belong to the same family of distributions as X and Y; this holds for the Bernoulli distribution and the log-normal distribution.

Example:

  • If X₁ and X₂ are independent log-normal random variables with parameters (μ₁, σ₁²) and (μ₂, σ₂²) respectively, then X₁X₂ is a log-normal random variable with parameters (μ₁ + μ₂, σ₁² + σ₂²).
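
A simulation sketch of this product identity, assuming NumPy and SciPy (parameters are arbitrary; scipy's lognorm takes s = σ and scale = exp(μ)):

```python
# Sketch: the product of independent log-normals is log-normal.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
mu1, s1, mu2, s2 = 0.2, 0.7, -0.5, 1.1

x1 = rng.lognormal(mu1, s1, 100_000)
x2 = rng.lognormal(mu2, s2, 100_000)

target = stats.lognorm(s=np.hypot(s1, s2), scale=np.exp(mu1 + mu2))
print(stats.kstest(x1 * x2, target.cdf).statistic)   # close to 0
```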

See also Product distribution.

                                     

3.3. Functions of several variables: Minimum and maximum of independent random variables

For some distributions, the minimum value of several independent random variables is a member of the same family, with different parameters: Bernoulli distribution, Geometric distribution, Exponential distribution, Extreme value distribution, Pareto distribution, Rayleigh distribution, Weibull distribution.

Examples:

  • If X₁ and X₂ are independent geometric random variables with probability of success p₁ and p₂ respectively, then min(X₁, X₂) is a geometric random variable with probability of success p = p₁ + p₂ − p₁p₂. The relationship is simpler if expressed in terms of the probability of failure: q = q₁q₂.
  • If X₁ and X₂ are independent exponential random variables with rates μ₁ and μ₂ respectively, then min(X₁, X₂) is an exponential random variable with rate μ = μ₁ + μ₂.

Similarly, distributions for which the maximum value of several independent random variables is a member of the same family of distribution include: Bernoulli distribution, Power law distribution.
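
A simulation sketch of the exponential example above, assuming NumPy and SciPy (NumPy's exponential sampler takes a scale equal to 1/rate; the rates are arbitrary):

```python
# Sketch: the minimum of independent exponentials is exponential with summed rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
r1, r2 = 0.5, 2.0

m = np.minimum(rng.exponential(1 / r1, 100_000), rng.exponential(1 / r2, 100_000))
print(stats.kstest(m, stats.expon(scale=1 / (r1 + r2)).cdf).statistic)   # close to 0
```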

                                     

3.4. Functions of several variables: Other

  • If X and Y are independent exponential random variables with mean μ, then X − Y is a double exponential random variable with mean 0 and scale μ.
  • If X₁ is a gamma(α₁, 1) random variable and X₂ is an independent gamma(α₂, 1) random variable, then X₁/(X₁ + X₂) is a beta(α₁, α₂) random variable. More generally, if X₁ is a gamma(α₁, β₁) random variable and X₂ is an independent gamma(α₂, β₂) random variable, then β₂X₁/(β₂X₁ + β₁X₂) is a beta(α₁, α₂) random variable.
  • If X and Y are independent standard normal random variables, then X/Y is a Cauchy(0, 1) random variable.
  • If X₁ and X₂ are independent chi-squared random variables with ν₁ and ν₂ degrees of freedom respectively, then (X₁/ν₁)/(X₂/ν₂) is an F(ν₁, ν₂) random variable.
  • If X is a standard normal random variable and U is an independent chi-squared random variable with ν degrees of freedom, then X/√(U/ν) is a Student's t(ν) random variable.
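
A simulation sketch of the gamma-to-beta and Student's t constructions above, assuming NumPy and SciPy (shape parameters and degrees of freedom are arbitrary):

```python
# Sketch: X1/(X1 + X2) for unit-rate gammas is beta; normal over sqrt(chi2/nu) is t.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 100_000

a1, a2 = 2.0, 5.0
g1, g2 = rng.gamma(a1, 1.0, n), rng.gamma(a2, 1.0, n)
print(stats.kstest(g1 / (g1 + g2), stats.beta(a1, a2).cdf).statistic)   # close to 0

nu = 6
t = rng.standard_normal(n) / np.sqrt(rng.chisquare(nu, n) / nu)
print(stats.kstest(t, stats.t(df=nu).cdf).statistic)                    # close to 0
```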

See also ratio distribution.

                                     

4. Approximate (limit) relationships

An approximate or limit relationship means

  • either that the combination of an infinite number of iid random variables tends to some distribution,
  • or that the limit when a parameter tends to some value approaches a different distribution.

Combination of iid random variables:

  • Given certain conditions, the sum (hence the average) of a sufficiently large number of iid random variables, each with finite mean and variance, will be approximately normally distributed. This is the central limit theorem (CLT).

Special case of distribution parametrization:

  • X is a beta-binomial random variable with parameters (n, α, β). Let p = α/(α + β) and suppose α + β is large; then X approximately has a binomial(n, p) distribution.
  • X is a hypergeometric(m, N, n) random variable. If the sample size n is small compared to the population size N, and p = m/N is not close to 0 or 1, then X approximately has a binomial(n, p) distribution.
  • If X is a binomial(n, p) random variable and if n is large and np is small, then X approximately has a Poisson(np) distribution.
  • If X is a negative binomial random variable with r large, P near 1, and r(1 − P) = λ, then X approximately has a Poisson distribution with mean λ.
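
A numerical sketch of the binomial-to-Poisson approximation, assuming NumPy and SciPy (n and p are arbitrary illustrative values with n large and np moderate):

```python
# Sketch: compare binomial(n, p) and Poisson(np) probability mass functions.
import numpy as np
from scipy import stats

n, p = 500, 0.01            # np = 5
k = np.arange(0, 30)

binom_pmf = stats.binom.pmf(k, n, p)
pois_pmf = stats.poisson.pmf(k, n * p)

# half the summed absolute difference approximates the total variation distance
print(0.5 * np.abs(binom_pmf - pois_pmf).sum())   # small; the approximation is good here
```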

Consequences of the CLT:

  • If X is a Student's t random variable with a large number of degrees of freedom ν, then X approximately has a standard normal distribution.
  • If X is a binomial(n, p) random variable with np and n(1 − p) both large, then for integers j and k, P(j ≤ X ≤ k) approximately equals P(j − 1/2 ≤ Y ≤ k + 1/2), where Y is a normal random variable with the same mean and variance as X, i.e. np and np(1 − p).
  • If X is a beta random variable with parameters α and β equal and large, then X approximately has a normal distribution with the same mean and variance, i.e. mean α/(α + β) and variance αβ/((α + β)²(α + β + 1)).
  • If X is an F(ν, ω) random variable with ω large, then νX is approximately distributed as a chi-squared random variable with ν degrees of freedom.
  • If X is a gamma(α, β) random variable and the shape parameter α is large relative to the scale parameter β, then X approximately has a normal distribution with the same mean and variance.
  • If X is a Poisson random variable with a large mean, then for integers j and k, P(j ≤ X ≤ k) approximately equals P(j − 1/2 ≤ Y ≤ k + 1/2), where Y is a normal random variable with the same mean and variance as X.
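
A numerical sketch of the continuity-corrected normal approximation to the binomial, assuming NumPy and SciPy (n, p, j, k are arbitrary illustrative values):

```python
# Sketch: normal approximation to a binomial probability with continuity correction.
import numpy as np
from scipy import stats

n, p = 200, 0.3
mean, sd = n * p, np.sqrt(n * p * (1 - p))
j, k = 50, 70

exact = stats.binom.cdf(k, n, p) - stats.binom.cdf(j - 1, n, p)       # P(j <= X <= k)
approx = stats.norm.cdf(k + 0.5, mean, sd) - stats.norm.cdf(j - 0.5, mean, sd)
print(exact, approx)   # the two values agree closely
```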


                                     

5. Compound or Bayesian relationships

When one or more parameters of a distribution are random variables, the compound distribution is the marginal distribution of the variable.

Examples:

  • If X | N is a binomial(N, p) random variable, where the parameter N is a random variable with a Poisson(μ) distribution, then X is distributed as a Poisson(μp).
  • If X | μ is a Poisson(μ) random variable and the parameter μ is a random variable with a gamma(m, θ) distribution, where θ is the scale parameter, then X is distributed as a negative-binomial(m, θ/(1 + θ)), sometimes called the gamma-Poisson distribution.
  • If X | N is a binomial(N, p) random variable, where the parameter N is a random variable with a negative-binomial(m, r) distribution, then X is distributed as a negative-binomial(m, r/(p + qr)), where q = 1 − p.

Some distributions have been specially named as compounds: beta-binomial distribution, beta-Pascal distribution, gamma-normal distribution.

Examples:

  • If X is a negative-binomial(m, p) random variable, and the parameter p is a random variable with a beta(α, β) distribution, then X is distributed as a beta-Pascal(α, β, m).
  • If X is a binomial(n, p) random variable, and the parameter p is a random variable with a beta(α, β) distribution, then X is distributed as a beta-binomial(α, β, n).
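
A simulation sketch of the beta-binomial compound, assuming NumPy and SciPy (scipy.stats.betabinom implements the compound directly; n, α, β are arbitrary):

```python
# Sketch: drawing p from a beta prior and then X | p from a binomial
# reproduces the beta-binomial distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n, a, b = 20, 2.0, 3.0

p = rng.beta(a, b, 100_000)          # p ~ beta(a, b)
x = rng.binomial(n, p)               # X | p ~ binomial(n, p)

k = np.arange(n + 1)
emp_pmf = np.bincount(x, minlength=n + 1) / x.size
print(np.abs(emp_pmf - stats.betabinom.pmf(k, n, a, b)).max())   # close to 0
```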
                                     