In probability theory, an indecomposable distribution is a probability distribution that cannot be represented as the distribution of the sum of two or more non-constant independent random variables: Z ≠ X + Y. If it can be so expressed, it is decomposable: Z = X + Y. If, further, it can be expressed as the distribution of the sum of two or more independent identically distributed random variables, then it is divisible: Z = X1 + X2.
Examples
Indecomposable
- The simplest examples are Bernoulli distributions: if Pr(X = 1) = p and Pr(X = 0) = 1 − p for some 0 < p < 1, then the probability distribution of X is indecomposable.
- Proof: Given non-constant independent random variables U and V, so that U assumes at least two values a, b and V assumes at least two values c, d, with a < b and c < d, then U + V assumes the three distinct values a + c, a + d, b + d with positive probability (b + c may be equal to a + d, for example if both U and V take the values 0 and 1). Thus the sum of two non-constant independent random variables assumes at least three values, whereas a Bernoulli-distributed variable assumes only two, so it cannot be such a sum (a brute-force check of this support argument is given in the first sketch after this list of examples).
- Suppose a + b + c = 1, a, b, c ≥ 0, and Pr(X = 0) = a, Pr(X = 1) = b, Pr(X = 2) = c.
- This probability distribution is decomposable (as the distribution of the sum of two Bernoulli-distributed random variables) if √a + √c ≤ 1, and otherwise indecomposable. To see this, suppose U and V are independent random variables and U + V has this probability distribution. Then we must have Pr(U = 0) = p, Pr(U = 1) = 1 − p and Pr(V = 0) = q, Pr(V = 1) = 1 − q for some p, q ∈ [0, 1], by similar reasoning to the Bernoulli case (otherwise the sum U + V would assume more than three values). It follows that a = pq and c = (1 − p)(1 − q).
- This system of two quadratic equations in the two variables p and q has a solution (p, q) ∈ [0, 1]² if and only if √a + √c ≤ 1.
- Thus, for example, the discrete uniform distribution on the set {0, 1, 2} (a = b = c = 1/3, so that √a + √c ≈ 1.15 > 1) is indecomposable, but the binomial distribution for two trials each with success probability 1/2 (a = 1/4, b = 1/2, c = 1/4, so that √a + √c = 1) is decomposable; the second sketch after this list searches numerically for the Bernoulli factors in both cases.
- An absolutely continuous indecomposable distribution also exists: it can be shown that there is a distribution, given by an explicit density function, that is indecomposable.
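The following is a minimal Python sketch (my own illustration, not from the article; the function and variable names are arbitrary) of the support-counting argument in the Bernoulli example: the sum of two independent non-constant, finitely supported variables always has at least three support points, so it can never have a two-point Bernoulli distribution.

    from itertools import product

    def support_of_sum(support_u, support_v):
        """Support of U + V for independent U, V with the given finite supports."""
        return sorted({u + v for u, v in product(support_u, support_v)})

    # Any non-constant variable has at least two support points; in every case
    # below, the support of the sum has at least three points, so it cannot
    # match the two-point support {0, 1} of a Bernoulli variable.
    examples = [
        ({0, 1}, {0, 1}),      # both factors take the values 0 and 1
        ({0, 1}, {0, 0.5}),    # values need not be integers
        ({-1, 2}, {0, 1, 7}),  # one factor may take more than two values
    ]
    for sup_u, sup_v in examples:
        s = support_of_sum(sup_u, sup_v)
        print(f"{sorted(sup_u)} + {sorted(sup_v)} -> support {s} ({len(s)} values)")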
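The second sketch (again my own, not from the article) searches for Bernoulli factors of a distribution on {0, 1, 2} with probabilities (a, b, c) by solving the system a = pq, c = (1 − p)(1 − q) directly, and compares the outcome with the criterion √a + √c ≤ 1 derived above.

    import math

    def bernoulli_factors(a, b, c, tol=1e-12):
        """Try to write Pr(0)=a, Pr(1)=b, Pr(2)=c as the sum of two independent
        Bernoulli-type variables with Pr(U=0)=p, Pr(V=0)=q.
        Eliminating q = a/p from pq = a and (1-p)(1-q) = c gives
        p^2 - (1 + a - c) p + a = 0; the middle probability b is then automatic."""
        disc = (1 + a - c) ** 2 - 4 * a
        if disc < -tol:
            return None
        disc = max(disc, 0.0)
        for p in ((1 + a - c + math.sqrt(disc)) / 2,
                  (1 + a - c - math.sqrt(disc)) / 2):
            if tol < p <= 1:
                q = a / p
                if -tol <= q <= 1 + tol:
                    return p, min(max(q, 0.0), 1.0)
        return None

    for a, b, c in [(1/3, 1/3, 1/3),   # uniform on {0, 1, 2}
                    (1/4, 1/2, 1/4)]:  # binomial, two trials, probability 1/2
        criterion = math.sqrt(a) + math.sqrt(c) <= 1 + 1e-12
        print((a, b, c), "criterion says decomposable:", criterion,
              "factors found:", bernoulli_factors(a, b, c))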
Decomposable
- All infinitely divisible distributions are a fortiori decomposable; in particular, this includes the stable distributions, such as the normal distribution.
- The uniform distribution on the interval [0, 1] is decomposable, since it is the sum of the Bernoulli variable that assumes 0 or 1/2 with equal probabilities and the uniform distribution on [0, 1/2]. Iterating this yields the infinite decomposition U = X1/2 + X2/4 + X3/8 + ⋯, where the independent random variables Xn are each equal to 0 or 1 with equal probabilities; the Xn are simply the digits of the binary expansion of U, each determined by a fair Bernoulli trial (simulated in the first sketch after this list).
- A sum of indecomposable random variables is decomposable into its original summands, but it may even turn out to be infinitely divisible. Suppose a random variable Y has the geometric distribution Pr(Y = n) = (1/2)^(n+1) on {0, 1, 2, ...}.
- For any positive integer k, there is a sequence of negative-binomially distributed random variables Yj, j = 1, ..., k, such that Y1 + ... + Yk has this geometric distribution: one may take each Yj to be negative binomial with success probability 1/2 and (non-integer) size parameter 1/k, since the size parameter of the negative binomial distribution is additive under convolution of independent copies with a common success probability. Therefore, this distribution is infinitely divisible (checked numerically in the second sketch after this list).
- On the other hand, let Dn be the nth binary digit of Y, for n ≥ 0. Then the Dn's are independent, because Pr(Y = n) = (1/2)^(n+1) factors into a product over the binary digits of n, and Y = D0 + 2D1 + 4D2 + 8D3 + ⋯, where each term 2^n Dn assumes only the two values 0 and 2^n and is therefore indecomposable (the independence of the digits is checked in the third sketch after this list).
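A small simulation (my own illustration; names and the sample size are arbitrary) of the decomposition of the uniform distribution on [0, 1] into fair binary digits: truncating the sum X1/2 + X2/4 + ⋯ at a modest depth already reproduces the uniform distribution to within the truncation error.

    import random

    def uniform_from_coin_flips(depth=32, rng=random):
        """Approximate a Uniform[0, 1] draw as sum_{n=1}^{depth} Xn / 2**n
        with Xn independent fair Bernoulli(1/2) digits."""
        return sum(rng.randint(0, 1) / 2 ** n for n in range(1, depth + 1))

    random.seed(0)
    samples = [uniform_from_coin_flips() for _ in range(50_000)]
    # Crude check against Uniform[0, 1]: mean near 1/2, variance near 1/12.
    mean = sum(samples) / len(samples)
    var = sum((x - mean) ** 2 for x in samples) / len(samples)
    print(f"mean ≈ {mean:.4f} (expect 0.5), variance ≈ {var:.4f} (expect {1/12:.4f})")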
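The infinite-divisibility claim can also be checked numerically. The sketch below (my own; it hard-codes the generalized negative binomial probability mass function with real size parameter r = 1/k and success probability 1/2) convolves k copies of that component distribution and compares the result with Pr(Y = n) = (1/2)^(n+1).

    import math
    import numpy as np

    def nbinom_pmf(j, r, p):
        """Generalized negative binomial pmf on j = 0, 1, 2, ...:
        Gamma(j + r) / (Gamma(r) * j!) * p**r * (1 - p)**j, for any real r > 0."""
        log_pmf = (math.lgamma(j + r) - math.lgamma(r) - math.lgamma(j + 1)
                   + r * math.log(p) + j * math.log(1 - p))
        return math.exp(log_pmf)

    k, p, n_max = 4, 0.5, 40
    component = np.array([nbinom_pmf(j, 1.0 / k, p) for j in range(n_max + 1)])

    # Convolve k independent copies; entries up to n_max stay exact under truncation.
    total = np.array([1.0])
    for _ in range(k):
        total = np.convolve(total, component)[: n_max + 1]

    geometric = np.array([0.5 ** (n + 1) for n in range(n_max + 1)])
    print("max |convolution - geometric| on {0,...,%d}:" % n_max,
          float(np.abs(total - geometric).max()))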
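Finally, a sketch (my own) of the independence of the binary digits of Y: it tabulates the joint distribution of the first few digits directly from Pr(Y = n) = (1/2)^(n+1) and compares it with the product of the marginals.

    import math
    from itertools import product

    N_MAX = 200     # truncation point; the neglected tail mass is about 2**-(N_MAX + 1)
    DIGITS = 3      # compare the joint law of (D0, D1, D2)

    pmf = {n: 0.5 ** (n + 1) for n in range(N_MAX + 1)}

    def digit(n, k):
        """k-th binary digit of the integer n."""
        return (n >> k) & 1

    # Joint distribution of the first DIGITS binary digits of Y.
    joint = {bits: 0.0 for bits in product((0, 1), repeat=DIGITS)}
    for n, p in pmf.items():
        joint[tuple(digit(n, k) for k in range(DIGITS))] += p

    # Marginal probability that each digit equals 1.
    marginal = [sum(p for n, p in pmf.items() if digit(n, k) == 1) for k in range(DIGITS)]

    worst = max(
        abs(joint[bits] - math.prod(m if b else 1 - m for b, m in zip(bits, marginal)))
        for bits in joint
    )
    print("P(Dk = 1) for k = 0, 1, 2:", [round(m, 6) for m in marginal])
    print("max |joint - product of marginals|:", worst)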
Related concepts
At the other extreme from indecomposability is infinite divisibility.
- Cramér's theorem shows that while the normal distribution is infinitely divisible, it can only be decomposed into normal distributions.
- Cochran's theorem shows that the terms in a decomposition of a sum of squares of independent standard normal random variables into sums of squares of linear combinations of these variables always have independent chi-squared distributions, provided the ranks of the corresponding quadratic forms add up to the number of variables.
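A simulation sketch of the simplest case behind Cochran's theorem (my own illustration, not from the article): for X1, ..., Xn i.i.d. standard normal, the identity ΣXi² = n·X̄² + Σ(Xi − X̄)² splits a chi-squared variable with n degrees of freedom into independent chi-squared parts with 1 and n − 1 degrees of freedom, which the sample moments below reflect.

    import numpy as np

    rng = np.random.default_rng(0)
    n, reps = 10, 200_000
    x = rng.standard_normal((reps, n))

    xbar = x.mean(axis=1)
    q1 = n * xbar ** 2                            # rank-1 quadratic form
    q2 = ((x - xbar[:, None]) ** 2).sum(axis=1)   # rank n-1 quadratic form

    # A chi-squared variable with d degrees of freedom has mean d and variance 2d.
    print(f"q1: mean ≈ {q1.mean():.3f} (expect 1), variance ≈ {q1.var():.3f} (expect 2)")
    print(f"q2: mean ≈ {q2.mean():.3f} (expect {n - 1}), variance ≈ {q2.var():.3f} (expect {2 * (n - 1)})")
    print(f"sample correlation(q1, q2) ≈ {np.corrcoef(q1, q2)[0, 1]:.4f} (expect 0)")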