In section 2 of Chapter 3 we defined the notion of a series of independent trials. In this section we define the notion of independent random variables. This notion plays the same role in the theory of jointly distributed random variables that the notion of independent trials plays in the theory of sample description spaces consisting of \(n\) trials. We consider first the case of two jointly distributed random variables.
Let \(X_{1}\) and \(X_{2}\) be jointly distributed random variables, with individual distribution functions \(F_{X_{1}}(\cdot)\) and \(F_{X_{2}}(\cdot)\), respectively, and joint distribution function \(F_{X_{1}, X_{2}}(\cdot, \cdot)\). We say that the random variables \(X_{1}\) and \(X_{2}\) are independent if for any two Borel sets of real numbers \(B_{1}\) and \(B_{2}\) the events [\(X_{1}\) is in \(B_{1}\)] and [\(X_{2}\) is in \(B_{2}\)] are independent; that is, \[P\left[X_{1} \text{ is in } B_{1} \text{ and } X_{2} \text{ is in } B_{2}\right]=P\left[X_{1} \text{ is in } B_{1}\right] P\left[X_{2} \text{ is in } B_{2}\right]. \tag{6.1}\]
The foregoing definition may be expressed equivalently: the random variables \(X_{1}\) and \(X_{2}\) are independent if, for any event \(A_{1}\) depending only on the random variable \(X_{1}\) and any event \(A_{2}\) depending only on the random variable \(X_{2}\), the events \(A_{1}\) and \(A_{2}\) are independent; that is, \(P\left[A_{1} A_{2}\right]=P\left[A_{1}\right] P\left[A_{2}\right]\).
It may be shown that if (6.1) holds for all sets \(B_{1}\) and \(B_{2}\) that are infinitely extended intervals of the form \(B_{1}=\left\{x_{1}^{\prime}: x_{1}^{\prime} \leq x_{1}\right\}\) and \(B_{2}=\left\{x_{2}^{\prime}: x_{2}^{\prime} \leq x_{2}\right\}\), for any real numbers \(x_{1}\) and \(x_{2}\), then (6.1) holds for any Borel sets \(B_{1}\) and \(B_{2}\) of real numbers. We therefore have the following equivalent formulation of the notion of the independence of two jointly distributed random variables \(X_{1}\) and \(X_{2}\).
Two jointly distributed random variables \(X_{1}\) and \(X_{2}\) are independent if their joint distribution function \(F_{X_{1}, X_{2}}(\cdot, \cdot)\) may be written as the product of their individual distribution functions \(F_{X_{1}}(\cdot)\) and \(F_{X_{2}}(\cdot)\) in the sense that, for any real numbers \(x_{1}\) and \(x_{2}\),
\[F_{X_{1}, X_{2}}\left(x_{1}, x_{2}\right)=F_{X_{1}}\left(x_{1}\right) F_{X_{2}}\left(x_{2}\right). \tag{6.2}\]
Similarly, two jointly continuous random variables \(X_{1}\) and \(X_{2}\) are independent if their joint probability density function \(f_{X_{1}, X_{2}}(\cdot, \cdot)\) may be written as the product of their individual probability density functions \(f_{X_{1}}(\cdot)\) and \(f_{X_{2}}(\cdot)\) in the sense that, for any real numbers \(x_{1}\) and \(x_{2}\),
\[f_{X_{1}, X_{2}}\left(x_{1}, x_{2}\right)=f_{X_{1}}\left(x_{1}\right) f_{X_{2}}\left(x_{2}\right). \tag{6.3}\]
Equation (6.3) follows from (6.2) by differentiating both sides of (6.2), first with respect to \(x_{1}\) and then with respect to \(x_{2}\). Equation (6.2) follows from (6.3) by integrating both sides of (6.3) over the region \(\left\{\left(x_{1}^{\prime}, x_{2}^{\prime}\right): x_{1}^{\prime} \leq x_{1}, x_{2}^{\prime} \leq x_{2}\right\}\).
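In detail, substituting the factored density into the definition of the joint distribution function gives
\[F_{X_{1}, X_{2}}\left(x_{1}, x_{2}\right)=\int_{-\infty}^{x_{1}} \int_{-\infty}^{x_{2}} f_{X_{1}}\left(x_{1}^{\prime}\right) f_{X_{2}}\left(x_{2}^{\prime}\right) d x_{2}^{\prime} d x_{1}^{\prime}=\int_{-\infty}^{x_{1}} f_{X_{1}}\left(x_{1}^{\prime}\right) d x_{1}^{\prime} \int_{-\infty}^{x_{2}} f_{X_{2}}\left(x_{2}^{\prime}\right) d x_{2}^{\prime}=F_{X_{1}}\left(x_{1}\right) F_{X_{2}}\left(x_{2}\right),\]
which is (6.2).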
Similarly, two jointly discrete random variables \(X_{1}\) and \(X_{2}\) are independent if their joint probability mass function \(p_{X_{1}, X_{2}}(\cdot, \cdot)\) may be written as the product of their individual probability mass functions \(p_{X_{1}}(\cdot)\) and \(p_{X_{2}}(\cdot)\) in the sense that, for all real numbers \(x_{1}\) and \(x_{2}\),
\[p_{X_{1}, X_{2}}\left(x_{1}, x_{2}\right)=p_{X_{1}}\left(x_{1}\right) p_{X_{2}}\left(x_{2}\right). \tag{6.4}\]
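For jointly discrete random variables, condition (6.4) can be checked directly from a table of the joint probability mass function. The following minimal sketch (in Python; the joint table is a hypothetical example, not one drawn from this book) compares each joint probability with the product of the marginal probabilities.

```python
# Sketch: check condition (6.4) for a jointly discrete pair (X1, X2).
# The joint probability mass function below is a hypothetical example;
# its marginals are p_X1 = (0.2, 0.8) and p_X2 = (0.5, 0.3, 0.2).
joint = {
    (0, 0): 0.10, (0, 1): 0.06, (0, 2): 0.04,
    (1, 0): 0.40, (1, 1): 0.24, (1, 2): 0.16,
}

# Marginal probability mass functions, obtained by summing the joint table.
p1, p2 = {}, {}
for (x1, x2), p in joint.items():
    p1[x1] = p1.get(x1, 0.0) + p
    p2[x2] = p2.get(x2, 0.0) + p

# X1 and X2 are independent iff every joint entry equals the product
# of its marginals, to within rounding error.
independent = all(
    abs(p - p1[x1] * p2[x2]) < 1e-12 for (x1, x2), p in joint.items()
)
print(independent)  # True for this table
```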
Two random variables \(X_{1}\) and \(X_{2}\) that do not satisfy the foregoing relations are said to be dependent, or nonindependent.
Example 6A . Independent and dependent random variables . In example 5A the random variables \(X_{1}\) and \(X_{2}\) are independent in the case of sampling with replacement but are dependent in the case of sampling without replacement. In either case, the random variables \(X_{1}\) and \(X_{2}\) are identically distributed. In example 5B the random variables \(X_{1}\) and \(X_{2}\) are independent and identically distributed. It may be seen from the definitions given at the end of the section that the random variables \(X_{1}, X_{2}, \ldots, X_{5}\) considered in example 5C are independent and identically distributed.
Independent random variables have the following exceedingly important property:
Theorem 6A . Let the random variables \(Y_{1}\) and \(Y_{2}\) be obtained from the random variables \(X_{1}\) and \(X_{2}\) by some functional transformation, so that \(Y_{1}=g_{1}\left(X_{1}\right)\) and \(Y_{2}=g_{2}\left(X_{2}\right)\) for some Borel functions \(g_{1}(\cdot)\) and \(g_{2}(\cdot)\) of a real variable. Independence of the random variables \(X_{1}\) and \(X_{2}\) implies independence of the random variables \(Y_{1}\) and \(Y_{2}\) .
This assertion is proved as follows. First, for any Borel set \(B_{1}\) of real numbers, write \(g_{1}^{-1}\left(B_{1}\right)=\left\{\text{real numbers } x: g_{1}(x) \text{ is in } B_{1}\right\}\). It is clear that the event that \(Y_{1}\) is in \(B_{1}\) occurs if and only if the event that \(X_{1}\) is in \(g_{1}^{-1}\left(B_{1}\right)\) occurs. Similarly, for any Borel set \(B_{2}\) the events that \(Y_{2}\) is in \(B_{2}\) and that \(X_{2}\) is in \(g_{2}^{-1}\left(B_{2}\right)\) occur, or fail to occur, together. Consequently, by (6.1),
\begin{align} P\left[Y_{1} \text { is in } B_{1}, Y_{2} \text { is in } B_{2}\right] & =P\left[X_{1} \text { is in } g_{1}^{-1}\left(B_{1}\right), X_{2} \text { is in } g_{2}^{-1}\left(B_{2}\right)\right] \tag{6.5}\\ & =P\left[X_{1} \text { is in } g_{1}^{-1}\left(B_{1}\right)\right] P\left[X_{2} \text { is in } g_{2}^{-1}\left(B_{2}\right)\right] \\ & =P\left[g_{1}\left(X_{1}\right) \text { is in } B_{1}\right] P\left[g_{2}\left(X_{2}\right) \text { is in } B_{2}\right] \\ & =P\left[Y_{1} \text { is in } B_{1}\right] P\left[Y_{2} \text { is in } B_{2}\right], \end{align}
and the proof of theorem 6A is concluded.
Example 6B . Sound intensity is often measured in decibels. A reference level of intensity \(I_{0}\) is adopted. Then a sound of intensity \(X\) is reported as having \(Y\) decibels:
\[Y=10 \log _{10} \frac{X}{I_{0}}.\]
Now if \(X_{1}\) and \(X_{2}\) are the sound intensities at two different points on a city street, let \(Y_{1}\) and \(Y_{2}\) be the corresponding sound intensities measured in decibels. If the original sound intensities \(X_{1}\) and \(X_{2}\) are independent random variables, then it follows from theorem 6A that \(Y_{1}\) and \(Y_{2}\) are independent random variables.
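A small simulation illustrates theorem 6A in this setting. The exponential intensity law and the reference level \(I_{0}=1\) below are assumptions made only for the sketch; the point is that the empirical joint relative frequency of two events determined by \(Y_{1}\) and \(Y_{2}\) agrees with the product of the individual relative frequencies, as (6.1) requires.

```python
import math
import random

# Sketch of example 6B: independent intensities X1, X2 are transformed
# to decibels, Y = 10 * log10(X / I0). The exponential intensity law
# and the reference level I0 = 1.0 are assumptions made only for this
# illustration; any independent, positive intensities would do.
random.seed(1)
I0 = 1.0
n = 200_000

x1 = [random.expovariate(1.0) for _ in range(n)]
x2 = [random.expovariate(1.0) for _ in range(n)]
y1 = [10 * math.log10(x / I0) for x in x1]
y2 = [10 * math.log10(x / I0) for x in x2]

# Empirical check of (6.1) for the events [Y1 > 0] and [Y2 > 0]: the
# joint relative frequency should be close to the product of the
# individual relative frequencies.
joint = sum(1 for a, b in zip(y1, y2) if a > 0 and b > 0) / n
prod = (sum(1 for a in y1 if a > 0) / n) * (sum(1 for b in y2 if b > 0) / n)
print(round(joint, 3), round(prod, 3))  # the two numbers nearly agree
```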
The foregoing notions extend at once to several jointly distributed random variables. We define \(n\) jointly distributed random variables \(X_{1}, X_{2}, \ldots, X_{n}\) as independent if any one of the following equivalent conditions holds: (i) for any \(n\) Borel sets \(B_{1}, B_{2}, \ldots, B_{n}\) of real numbers \begin{align} & P\left[X_{1} \text { is in } B_{1}, X_{2} \text { is in } B_{2}, \ldots, X_{n} \text { is in } B_{n}\right] \tag{6.6}\\ & \quad=P\left[X_{1} \text { is in } B_{1}\right] P\left[X_{2} \text { is in } B_{2}\right] \cdots P\left[X_{n} \text { is in } B_{n}\right]; \end{align} (ii) for any real numbers \(x_{1}, x_{2}, \ldots, x_{n}\)
\[F_{X_{1}, X_{2}, \ldots, X_{n}}\left(x_{1}, x_{2}, \ldots, x_{n}\right)=F_{X_{1}}\left(x_{1}\right) F_{X_{2}}\left(x_{2}\right) \cdots F_{X_{n}}\left(x_{n}\right); \tag{6.7}\]
(iii) if the random variables are jointly continuous, then for any real numbers \(x_{1}, x_{2}, \ldots, x_{n}\)
\[f_{X_{1}, X_{2}, \ldots, X_{n}}\left(x_{1}, x_{2}, \ldots, x_{n}\right)=f_{X_{1}}\left(x_{1}\right) f_{X_{2}}\left(x_{2}\right) \cdots f_{X_{n}}\left(x_{n}\right); \tag{6.8}\]
(iv) if the random variables are jointly discrete, then for any real numbers \(x_{1}, x_{2}, \ldots, x_{n}\)
\[p_{X_{1}, X_{2}, \ldots, X_{n}}\left(x_{1}, x_{2}, \ldots, x_{n}\right)=p_{X_{1}}\left(x_{1}\right) p_{X_{2}}\left(x_{2}\right) \cdots p_{X_{n}}\left(x_{n}\right). \tag{6.9}\]
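A consequence of these conditions that is used repeatedly (for instance, in exercise 6.1 below) concerns the maximum of independent random variables: the event that \(\max \left(X_{1}, \ldots, X_{n}\right) \leq x\) occurs if and only if every one of the events \(X_{i} \leq x\) occurs, so that by (6.6)
\[P\left[\max \left(X_{1}, \ldots, X_{n}\right) \leq x\right]=F_{X_{1}}(x) F_{X_{2}}(x) \cdots F_{X_{n}}(x).\]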
Theoretical Exercises
6.1 . Give an example of three random variables, \(X_{1}, X_{2}, X_{3}\), which are independent when taken two at a time but not independent when taken together. Hint: Let \(A_{1}, A_{2}, A_{3}\) be events that are independent in pairs but not mutually independent; see example 1C of Chapter 3. Define \(X_{i}=1\) or 0, depending on whether the event \(A_{i}\) has or has not occurred.
6.2 . Give an example of two random variables, \(X_{1}\) and \(X_{2}\), which are not independent, but such that \(X_{1}^{2}\) and \(X_{2}^{2}\) are independent. Does such an example prove that the converse of theorem 6A is false?
6.3 . Factorization rule for the probability density function of independent random variables . Show that \(n\) jointly continuous random variables \(X_{1}, X_{2}, \ldots, X_{n}\) are independent if and only if their joint probability density function may be written, for all real numbers \(x_{1}, x_{2}, \ldots, x_{n}\), as \[f_{X_{1}, X_{2}, \ldots, X_{n}}\left(x_{1}, x_{2}, \ldots, x_{n}\right)=h_{1}\left(x_{1}\right) h_{2}\left(x_{2}\right) \cdots h_{n}\left(x_{n}\right)\] in terms of some Borel functions \(h_{1}(\cdot), h_{2}(\cdot), \ldots\), and \(h_{n}(\cdot)\).
Exercises
6.1 . The output of a certain electronic apparatus is measured at 5 different times. Let \(X_{1}, X_{2}, \ldots, X_{5}\) be the observations obtained. Assume that \(X_{1}, X_{2}, \ldots, X_{5}\) are independent random variables, each Rayleigh distributed with parameter \(\alpha=2\). Find the probability that \(\max \left(X_{1}, X_{2}, X_{3}, X_{4}, X_{5}\right)>4\). (Recall that \(f_{X_{i}}(x)=\frac{x}{4} e^{-x^{2} / 8}\) for \(x>0\) and is equal to 0 elsewhere.)
Answer
\(1-\left(1-e^{-2}\right)^{5}\) .
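As a check, the stated answer may be verified by simulation; a minimal Monte Carlo sketch, sampling the Rayleigh law by the inverse-transform method \(X=\sqrt{-8 \ln (1-U)}\):

```python
import math
import random

# Monte Carlo sketch for exercise 6.1. Each X_i has distribution
# function F(x) = 1 - exp(-x**2 / 8), so inverse-transform sampling
# gives X = sqrt(-8 * log(1 - U)) with U uniform on (0, 1).
random.seed(2)
n = 200_000
count = 0
for _ in range(n):
    xs = [math.sqrt(-8 * math.log(1 - random.random())) for _ in range(5)]
    if max(xs) > 4:
        count += 1

exact = 1 - (1 - math.exp(-2)) ** 5
print(round(count / n, 3), round(exact, 3))  # both near 0.517
```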
6.2 . Suppose 10 identical radar sets have a failure law following the exponential distribution. The sets operate independently of one another and have a failure rate of \(\lambda=1\) set per \(10^{3}\) hours. For what length of time will all 10 radar sets operate satisfactorily with probability 0.99?
6.3 . Let \(X\) and \(Y\) be jointly continuous random variables, with a probability density function
\[f_{X, Y}(x, y)=\frac{1}{2 \pi} \exp \left[-\frac{1}{2}\left(x^{2}+y^{2}\right)\right].\]
(i) Are \(X\) and \(Y\) independent random variables?
(ii) Are \(X\) and \(Y\) identically distributed random variables?
(iii) Are \(X\) and \(Y\) normally distributed random variables?
(iv) Find \(P\left[X^{2}+Y^{2} \leq 4\right]\) . Hint : Use polar coordinates.
(v) Are \(X^{2}\) and \(Y^{2}\) independent random variables? Hint : Use theorem 6A.
(vi) Find \(P\left[X^{2} \leq 2\right], P\left[Y^{2} \leq 2\right]\) .
(vii) Find the individual probability density functions of \(X^{2}\) and \(Y^{2}\) . [Use (8.8) .]
(viii) Find the joint probability density function of \(X^{2}\) and \(Y^{2}\) . [Use (6.3) .]
(ix) Would you expect that \(P\left[X^{2} Y^{2} \leq 4\right] \geq P\left[X^{2} \leq 2\right] P\left[Y^{2} \leq 2\right]\) ?
(x) Would you expect that \(P\left[X^{2} Y^{2} \leq 4\right]=P\left[X^{2} \leq 2\right] P\left[Y^{2} \leq 2\right]\) ?
Answer
(i) Yes; (ii) yes; (iii) yes; (iv) \(1-e^{-2}\); (v) yes; (vi) 0.8427; (vii) \(f_{X^{2}}(y)=\frac{1}{\sqrt{2 \pi y}} e^{-y / 2}\) for \(y>0\), \(=0\) otherwise; (viii) \(f_{X^{2}, Y^{2}}(u, v)=\frac{1}{2 \pi \sqrt{u v}} e^{-(u+v) / 2}\) for \(u, v>0\), \(=0\) otherwise; (ix) yes; (x) no.
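Parts (iv) and (vi) can be verified numerically; a minimal sketch, using the identity \(P\left[X^{2} \leq 2\right]=2 \Phi(\sqrt{2})-1=\operatorname{erf}(1)\) for a standard normal \(X\):

```python
import math

# Check of (iv): by the polar-coordinate computation,
# P[X**2 + Y**2 <= 4] = 1 - exp(-2).
print(round(1 - math.exp(-2), 4))  # 0.8647

# Check of (vi): X is standard normal, so
# P[X**2 <= 2] = P[|X| <= sqrt(2)] = erf(1), and likewise for Y**2.
print(round(math.erf(1.0), 4))  # 0.8427
```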
6.4 . Let \(X_{1}, X_{2}\) , and \(X_{3}\) be independent random variables, each uniformly distributed on the interval 0 to 1. Determine the number \(a\) such that
(i) \(P\) [at least one of the numbers \(X_{1}, X_{2}, X_{3}\) is greater than \(\left.a\right]=0.9\) .
(ii) \(P\) [at least 2 of the numbers \(X_{1}, X_{2}, X_{3}\) are greater than \(\left.a\right]=0.9\) .
Hint : To obtain a numerical answer, use the table of binomial probabilities.
6.5 . Consider two events \(A\) and \(B\) such that \(P[A]=\frac{1}{4}, P[B \mid A]=\frac{1}{2}\), and \(P[A \mid B]=\frac{1}{4}\). Let the random variables \(X\) and \(Y\) be defined as \(X=1\) or 0, depending on whether the event \(A\) has or has not occurred, and \(Y=1\) or 0, depending on whether the event \(B\) has or has not occurred. State whether each of the following statements is true or false:
(i) The random variables \(X\) and \(Y\) are independent;
(ii) \(P\left[X^{2}+Y^{2}=1\right]=\frac{1}{4}\) ;
(iii) \(P\left[X Y=X^{2} Y^{2}\right]=1\) ;
(iv) The random variable \(X\) is uniformly distributed on the interval 0 to 1;
(v) The random variables \(X\) and \(Y\) are identically distributed.
Answer
(i) True; (ii) false; (iii) true; (iv) false; (v) false.
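These answers follow from the joint distribution of \((X, Y)\), which is recovered from \(P[A B]=P[A] P[B \mid A]=\frac{1}{8}\) and \(P[B]=P[A B] / P[A \mid B]=\frac{1}{2}\). A short sketch that tabulates the four outcomes and checks statements (i) and (ii):

```python
from fractions import Fraction as F

# Exercise 6.5: recover the joint distribution of (X, Y) from
# P[A] = 1/4, P[B|A] = 1/2, P[A|B] = 1/4.
pA = F(1, 4)
pAB = pA * F(1, 2)      # P[AB] = P[A] * P[B|A] = 1/8
pB = pAB / F(1, 4)      # P[B]  = P[AB] / P[A|B] = 1/2

# Joint probabilities of (X, Y) = (1,1), (1,0), (0,1), (0,0).
p11, p10 = pAB, pA - pAB
p01, p00 = pB - pAB, 1 - pA - pB + pAB

# (i) For indicator variables, independence is equivalent to
# P[AB] = P[A] P[B].
print(p11 == pA * pB)          # True
# (ii) P[X**2 + Y**2 = 1] is the probability of exactly one of A, B.
print(p10 + p01)               # 1/2, not 1/4, so (ii) is false
print(p11 + p10 + p01 + p00)   # sanity check: totals 1
```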
6.6 . Show that the two random variables \(X_{1}\) and \(X_{2}\) considered in exercise 5.7 are independent if their joint probability mass function is given by Table 5A, and are dependent if their joint probability mass function is given by Table 5B.
In exercises 6.7 to 6.9 let \(X_{1}\) and \(X_{2}\) be independent random variables, uniformly distributed over the interval 0 to 1.
6.7 . Find (i) \(P\left[X_{1}+X_{2}<0.5\right]\) , (ii) \(P\left[X_{1}-X_{2}<0.5\right]\) .
Answer
(i) 0.125; (ii) 0.875.
6.8 . Find (i) \(P\left[X_{1} X_{2}<0.5\right]\) , (ii) \(P\left[X_{1} / X_{2}<0.5\right]\) , (iii) \(P\left[X_{1}^{2}<0.5\right]\) .
6.9 . Find (i) \(P\left[X_{1}^{2}+X_{2}^{2}<0.5\right]\) , (ii) \(P\left[e^{-X_{1}}<0.5\right]\) , (iii) \(P\left[\cos \pi X_{2}<0.5\right]\) .
Answer
(i) 0.393; (ii) \(1-\ln 2 \doteq 0.307\) ; (iii) \(\frac{2}{3}\) .
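The answers to exercises 6.7 and 6.9 can be checked by simulation; a minimal Monte Carlo sketch:

```python
import math
import random

# Monte Carlo check of exercises 6.7 and 6.9, with X1 and X2
# independent and uniform on (0, 1).
random.seed(3)
n = 400_000
pairs = [(random.random(), random.random()) for _ in range(n)]

def est(event):
    """Relative frequency of the event over the simulated pairs."""
    return sum(1 for x1, x2 in pairs if event(x1, x2)) / n

print(round(est(lambda a, b: a + b < 0.5), 3))                  # 6.7(i):   0.125
print(round(est(lambda a, b: a - b < 0.5), 3))                  # 6.7(ii):  0.875
print(round(est(lambda a, b: a**2 + b**2 < 0.5), 3))            # 6.9(i):   pi/8 = 0.393
print(round(est(lambda a, b: math.exp(-a) < 0.5), 3))           # 6.9(ii):  1 - ln 2 = 0.307
print(round(est(lambda a, b: math.cos(math.pi * b) < 0.5), 3))  # 6.9(iii): 2/3 = 0.667
```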