# Normally distributed and uncorrelated does not imply independent

In probability theory, although simple examples illustrate that linear uncorrelatedness of two random variables does not in general imply their independence, it is sometimes mistakenly thought that it does when the two random variables are normally distributed. This article demonstrates that the assumption of normal marginal distributions does not have that consequence, although joint normality (the multivariate normal distribution, including the bivariate normal distribution) does.

To say that the pair $(X, Y)$ of random variables has a bivariate normal distribution means that every linear combination $aX + bY$ of $X$ and $Y$ for constant (i.e., not random) coefficients $a$ and $b$ has a univariate normal distribution. In that case, if $X$ and $Y$ are uncorrelated then they are independent. However, it is possible for two random variables $X$ and $Y$ to be so distributed jointly that each one alone is marginally normally distributed, and they are uncorrelated, but they are not independent; examples are given below.

## Examples

### A symmetric example

Suppose $X$ has a normal distribution with expected value 0 and variance 1. Let $W$ have the Rademacher distribution, so that $W = 1$ or $W = -1$, each with probability 1/2, and assume $W$ is independent of $X$. Let $Y = WX$. Then

• $X$ and $Y$ both have the same normal distribution;
• $X$ and $Y$ are uncorrelated; and
• $X$ and $Y$ are not independent.

To see that $X$ and $Y$ are uncorrelated, one may consider the covariance $\operatorname{cov}(X, Y)$: by definition, it is

$$\operatorname{cov}(X, Y) = \operatorname{E}[XY] - \operatorname{E}[X]\operatorname{E}[Y].$$

Then by the definition of the random variables $X$, $Y$, and $W$, and the independence of $W$ from $X$, one has

$$\operatorname{cov}(X, Y) = \operatorname{E}[XY] - 0 = \operatorname{E}[X^2 W] = \operatorname{E}[X^2]\operatorname{E}[W] = \operatorname{E}[X^2] \cdot 0 = 0.$$

To see that $Y$ has the same normal distribution as $X$, consider

$$\begin{aligned}\Pr(Y \leq x) &= \operatorname{E}[\Pr(Y \leq x \mid W)] \\ &= \Pr(X \leq x)\Pr(W = 1) + \Pr(-X \leq x)\Pr(W = -1) \\ &= \Phi(x) \cdot \tfrac{1}{2} + \Phi(x) \cdot \tfrac{1}{2} = \Phi(x)\end{aligned}$$

since $X$ and $-X$ both have the same normal distribution, where $\Phi(x)$ is the cumulative distribution function of the standard normal distribution.

To see that $X$ and $Y$ are not independent, observe that $|Y| = |X|$, so the value of $X$ determines $Y$ up to sign. For instance, $\Pr(Y > 1 \mid X = 1/2) = \Pr(X > 1 \mid X = 1/2) = 0$, whereas the unconditional probability $\Pr(Y > 1) = 1 - \Phi(1) > 0$; independence would require these to be equal.

Finally, the distribution of the simple linear combination $X + Y$ concentrates positive probability at 0: $\Pr(X + Y = 0) = 1/2$, since $X + Y = 0$ exactly on the event $W = -1$, in which case $Y = -X$. Therefore, the random variable $X + Y$ is not normally distributed, and so $X$ and $Y$ are not jointly normally distributed, by the definition above.
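The point mass at 0 can also be confirmed by simulation. The sketch below (illustrative names, not from the article) counts how often $X + Y$ lands exactly on 0; since $Y = WX$, the sum is exactly 0.0 in floating point whenever $W = -1$:

```python
import random

random.seed(1)
n = 100_000

# Y = W * X with W an independent Rademacher sign.  X + Y = 0 exactly
# when W = -1, since then Y = -X and x + (-x) == 0.0 in floating point.
hits = 0
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    w = random.choice((-1.0, 1.0))
    if x + w * x == 0.0:
        hits += 1

print(hits / n)  # close to 1/2
```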

### An asymmetric example

Suppose X {\displaystyle X} has a normal distribution with expected value 0 and variance 1. Let

$$Y = \begin{cases} X & \text{if } |X| \leq c \\ -X & \text{if } |X| > c \end{cases}$$

where $c$ is a positive number to be specified below. If $c$ is very small, then $\operatorname{corr}(X, Y)$ is near $-1$ (since $Y = -X$ most of the time); if $c$ is very large, then it is near 1. Since the correlation is a continuous function of $c$, the intermediate value theorem implies there is some particular value of $c$ that makes the correlation 0; that value is approximately 1.54. For that choice of $c$, the variables $X$ and $Y$ are uncorrelated, and $Y$ has the same standard normal distribution as $X$ (flipping the sign of $X$ on the event $|X| > c$ leaves the distribution unchanged, by symmetry). But $X$ and $Y$ are clearly not independent, since $X$ completely determines $Y$.
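The required value of $c$ can be pinned down numerically. For $X \sim N(0,1)$, integration by parts gives $\operatorname{E}[X^2 \mathbf{1}_{|X| \leq c}] = 2\Phi(c) - 1 - 2c\varphi(c)$, and since $\operatorname{E}[X] = \operatorname{E}[Y] = 0$ and $\operatorname{E}[X^2] = 1$, the covariance is $\operatorname{cov}(X, Y) = 2\left(2\Phi(c) - 1 - 2c\varphi(c)\right) - 1$. A bisection sketch (not from the article) using only Python's standard library:

```python
import math

def phi(x: float) -> float:
    """Standard normal density."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def Phi(x: float) -> float:
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def cov_xy(c: float) -> float:
    # cov(X, Y) = E[X^2; |X| <= c] - E[X^2; |X| > c]
    #           = 2 * E[X^2; |X| <= c] - 1,   since E[X^2] = 1.
    inside = 2.0 * Phi(c) - 1.0 - 2.0 * c * phi(c)
    return 2.0 * inside - 1.0

# cov_xy is -1 at c = 0 and tends to 1 as c grows, and it is increasing,
# so bisection finds the unique root.
lo, hi = 0.0, 5.0
for _ in range(100):
    mid = (lo + hi) / 2.0
    if cov_xy(mid) < 0.0:
        lo = mid
    else:
        hi = mid

c = (lo + hi) / 2.0
print(c)  # approximately 1.54
```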

