In this section we give various inversion formulas for the distribution function, probability mass function, and probability density function of a random variable in terms of its characteristic function. As a consequence of these formulas, it follows that to describe the probability law of a random variable it suffices to specify its characteristic function.
We first prove a theorem that gives, in terms of characteristic functions, an explicit formula for \(E[g(X)]\) for a fairly large class of functions \(g(\cdot)\).
Theorem 3A. Let \(g(\cdot)\) be a bounded Borel function of a real variable that at every point \(x\) possesses a limit from the right \(g(x+0)\) and a limit from the left \(g(x-0)\). Let \[g^{*}(x)=\frac{g(x+0)+g(x-0)}{2} \tag{3.1}\] be the arithmetic mean of these limits. Assume further that \(g(x)\) is absolutely integrable; that is, \[\int_{-\infty}^{\infty}|g(x)| d x<\infty. \tag{3.2}\] Define \(\gamma(\cdot)\) as the Fourier integral (or transform) of \(g(\cdot)\); that is, for every real number \(u\), \[\gamma(u)=\frac{1}{2 \pi} \int_{-\infty}^{\infty} e^{-i u x} g(x) d x. \tag{3.3}\] Then, for any random variable \(X\), the expectation \(E\left[g^{*}(X)\right]\) may be expressed in terms of the characteristic function \(\phi_{X}(\cdot)\): \[E\left[g^{*}(X)\right]=\int_{-\infty}^{\infty} g^{*}(x) d F_{X}(x)=\lim _{U \rightarrow \infty} \int_{-U}^{U}\left(1-\frac{|u|}{U}\right) \gamma(u) \phi_{X}(u) d u. \tag{3.4}\]
The proof of this important theorem is given in section 5. In this section we discuss its consequences.
If the product \(\gamma(u) \phi_{X}(u)\) is absolutely integrable, that is, \[\int_{-\infty}^{\infty}\left|\gamma(u) \phi_{X}(u)\right| d u<\infty, \tag{3.5}\] then (3.4) may be written \[E\left[g^{*}(X)\right]=\int_{-\infty}^{\infty} \gamma(u) \phi_{X}(u) du.\tag{3.6}\] Without imposing the condition (3.5), it is incorrect to write (3.6). Indeed, in order even to write (3.6) the integral on the right-hand side of (3.6) must exist; this is equivalent to (3.5) being true.
We next take for \(g(\cdot)\) a function defined as follows, for some finite numbers \(a\) and \(b\) (with \(a<b\)): \begin{align} g(x) & = \begin{cases} 1, & \text{if } a < x < b \tag{3.7} \\ \frac{1}{2}, & \text{if } x = a \text{ or } x = b \\ 0, & \text{if } x < a \text{ or } x > b. \end{cases} \end{align} The function \(g(\cdot)\) defined by (3.7) fulfills the hypotheses of theorem 3A: it is bounded, absolutely integrable, and possesses right-hand and left-hand limits at every point \(x\). Further, \(g^{*}(x)=g(x)\) for every \(x\). Now, if \(a\) and \(b\) are points at which the distribution function \(F_{X}(\cdot)\) is continuous, then \[\int_{-\infty}^{\infty} g(x) d F_{X}(x)=F_{X}(b)-F_{X}(a). \tag{3.8}\] Further, \[\gamma(u)=\frac{1}{2 \pi} \frac{e^{-i u b}-e^{-i u a}}{-i u}. \tag{3.9}\] Consequently, with this choice of function \(g(\cdot)\), theorem 3A yields an expression for the distribution function of a random variable in terms of its characteristic function.
Theorem 3B. If \(a\) and \(b\), where \(a<b\), are finite real numbers at which the distribution function \(F_{X}(\cdot)\) is continuous, then \[F_{X}(b)-F_{X}(a)=\lim _{U \rightarrow \infty} \frac{1}{2 \pi} \int_{-U}^{U}\left(1-\frac{|u|}{U}\right) \frac{e^{-i u b}-e^{-i u a}}{-i u} \phi_{X}(u) du. \tag{3.10}\]
Equation (3.10) constitutes an inversion formula, whereby, from a knowledge of the characteristic function \(\phi_{X}(\cdot)\), a knowledge of the distribution function \(F_{X}(\cdot)\) may be obtained.
An explicit inversion formula for \(F_{X}(x)\) in terms of \(\phi_{X}(\cdot)\) may be written in various ways. Since \(\lim_{a \rightarrow -\infty} F_{X}(a)=0\), we determine from (3.10) that at any point \(x\) at which \(F_{X}(\cdot)\) is continuous \[F_{X}(x)=\lim _{a \rightarrow-\infty} \lim _{U \rightarrow \infty} \frac{1}{2 \pi} \int_{-U}^{U}\left(1-\frac{|u|}{U}\right) \frac{e^{-i u x}-e^{-i u a}}{-i u} \phi_{X}(u) du. \tag{3.11}\] The limit is taken as \(a\) tends to \(-\infty\) through continuity points of \(F_{X}(\cdot)\).
A more useful inversion formula, the proof of which is given in section 5, is the following: at any point \(x\) at which \(F_{X}(\cdot)\) is continuous, \[F_{X}(x)=\frac{1}{2}-\frac{1}{\pi} \int_{0}^{\infty} \frac{\operatorname{Im}\left[e^{-i u x} \phi_{X}(u)\right]}{u} du. \tag{3.12}\] The integral is an improper Riemann integral, defined as \[\lim _{U \rightarrow \infty} \int_{1 / U}^{U} \frac{\operatorname{Im}\left[e^{-i u x} \phi_{X}(u)\right]}{u} du.\] Equations (3.11) and (3.12) lead immediately to the uniqueness theorem, which states that there is a one-to-one correspondence between distribution functions and characteristic functions: two characteristic functions that are equal at all points (or at all except a countable number of points) are characteristic functions of the same distribution function, and two distribution functions that are equal at all except a countable number of points give rise to the same characteristic function.
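The inversion formula (3.12) lends itself to direct numerical verification. The sketch below (the helper names \verb|F_from_phi| and \verb|phi_normal| are illustrative, not from the text) truncates the improper integral at a finite \(U\) and evaluates it by a simple Riemann sum, taking for \(\phi_{X}\) the standard normal characteristic function \(e^{-u^{2}/2}\); the result is then compared with the known normal distribution function.

```python
import cmath
import math

def phi_normal(u):
    # characteristic function of the standard normal law: e^{-u^2/2}
    return math.exp(-u * u / 2.0)

def F_from_phi(x, phi, U=12.0, n=20000):
    # Approximate (3.12): F(x) = 1/2 - (1/pi) int_0^inf Im[e^{-iux} phi(u)]/u du.
    # The integrand has a finite limit as u -> 0, so the sum may start at u = h;
    # the tail beyond U is negligible because phi decays rapidly here.
    h = U / n
    total = 0.0
    for k in range(1, n + 1):
        u = k * h
        total += (cmath.exp(-1j * u * x) * phi(u)).imag / u
    return 0.5 - (h / math.pi) * total

# compare with the normal distribution function Phi(1) = (1 + erf(1/sqrt 2))/2
print(abs(F_from_phi(1.0, phi_normal) - 0.5 * (1 + math.erf(1 / math.sqrt(2)))))
```

The printed discrepancy is of the order of the discretization error, illustrating that knowledge of \(\phi_{X}\) alone recovers \(F_{X}\).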
We may express the probability mass function \(p_{X}(\cdot)\) of the random variable \(X\) in terms of its characteristic function; for any real number \(x\) \begin{align} p_{X}(x) & =P[X=x]=F_{X}(x+0)-F_{X}(x-0) \tag{3.13}\\[5mm] & =\lim _{U \rightarrow \infty} \frac{1}{2 U} \int_{-U}^{U} e^{-i u x} \phi_{X}(u) du. \end{align} The proof of (3.13) is given in section 5 .
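Formula (3.13) can likewise be illustrated numerically with a fixed large \(U\) in place of the limit. A minimal sketch (helper names are illustrative): for a random variable with \(P[X=1]=P[X=-1]=\tfrac{1}{2}\), so that \(\phi_{X}(u)=\cos u\), the averaged integral returns the mass \(\tfrac{1}{2}\) at \(x=1\) and (approximately) zero at a point carrying no mass.

```python
import cmath
import math

def phi_rademacher(u):
    # characteristic function of X with P[X = 1] = P[X = -1] = 1/2
    return math.cos(u)

def mass_at(x, phi, U=100 * math.pi, n=100000):
    # Approximate (3.13): p(x) = lim_{U->inf} (1/2U) int_{-U}^{U} e^{-iux} phi(u) du,
    # using the trapezoid rule at a fixed large U (a multiple of pi works well here)
    h = 2 * U / n
    s = 0.0
    for k in range(n + 1):
        u = -U + k * h
        w = 0.5 if k in (0, n) else 1.0  # trapezoid endpoint weights
        s += w * (cmath.exp(-1j * u * x) * phi(u)).real
    return s * h / (2 * U)

print(mass_at(1.0, phi_rademacher))  # close to 1/2 = P[X = 1]
print(mass_at(0.5, phi_rademacher))  # close to 0, since P[X = 1/2] = 0
```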
It is possible to give a criterion in terms of characteristic functions that a random variable \(X\) has an absolutely continuous probability law.\(^{1}\) If the characteristic function \(\phi_{X}(\cdot)\) is absolutely integrable, that is, \[\int_{-\infty}^{\infty}\left|\phi_{X}(u)\right| d u<\infty, \tag{3.14}\] then the random variable \(X\) obeys the absolutely continuous probability law specified by the probability density function \(f_{X}(\cdot)\), given for any real number \(x\) by \[f_{X}(x)=\frac{1}{2 \pi} \int_{-\infty}^{\infty} e^{-i u x} \phi_{X}(u) du. \tag{3.15}\] One expresses (3.15) in words by saying that \(f_{X}(\cdot)\) is the Fourier transform, or Fourier integral, of \(\phi_{X}(\cdot)\).
The proof of (3.15) follows immediately from the fact that at any continuity points \(x\) and \(a\) of \(F_{X}(\cdot)\) \[F_{X}(x)-F_{X}(a)=\frac{1}{2 \pi} \int_{-\infty}^{\infty} \frac{e^{-i u x}-e^{-i u a}}{-i u} \phi_{X}(u) d u . \tag{3.16}\] Equation (3.16) follows from (3.6) in the same way that (3.10) followed from (3.4). It may be proved from (3.16) that (i) \(F_{X}(\cdot)\) is continuous at every point \(x\) , (ii) \(f_{X}(x)=(d / d x) F_{X}(x)\) exists at every real number \(x\) and is given by (3.15), (iii) for any numbers \(a\) and \(b, F_{X}(b)-F_{X}(a)=\int_{a}^{b} f_{X}(x) d x\) . From these facts it follows that \(F_{X}(\cdot)\) is specified by \(f_{X}(\cdot)\) and that \(f_{X}(x)\) is given by (3.15).
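As a quick numerical check of (3.15), the following sketch (helper names are illustrative) truncates the Fourier integral to a finite interval and recovers the standard normal density, whose characteristic function \(e^{-u^{2}/2}\) is absolutely integrable.

```python
import cmath
import math

def density_from_phi(x, phi, U=12.0, n=20000):
    # (3.15): f(x) = (1/2 pi) int e^{-iux} phi(u) du, truncated to [-U, U]
    h = 2 * U / n
    s = 0.0
    for k in range(n + 1):
        u = -U + k * h
        w = 0.5 if k in (0, n) else 1.0  # trapezoid endpoint weights
        s += w * (cmath.exp(-1j * u * x) * phi(u)).real
    return s * h / (2 * math.pi)

def phi_normal(u):
    # standard normal characteristic function, absolutely integrable
    return math.exp(-u * u / 2)

# the recovered density at 0 should be 1/sqrt(2 pi), approximately 0.3989
print(abs(density_from_phi(0.0, phi_normal) - 1 / math.sqrt(2 * math.pi)))
```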
The inversion formula (3.15) provides a powerful method of calculating Fourier transforms and characteristic functions. Thus, for example, from a knowledge that \[\left(\frac{\sin (u / 2)}{u / 2}\right)^{2}=\int_{-\infty}^{\infty} e^{i u x} f(x) dx, \tag{3.17}\] where \(f(\cdot)\) is defined by \begin{align} f(x) & = \begin{cases} 1 - |x|, & \text{for } |x| \leq 1 \tag{3.18} \\ 0, & \text{otherwise,} \end{cases} \end{align} it follows by (3.15) that \[\int_{-\infty}^{\infty} e^{-i u x} \frac{1}{2 \pi}\left(\frac{\sin (x / 2)}{x / 2}\right)^{2} d x=f(u). \tag{3.19}\] Similarly, from \[\frac{1}{1+u^{2}}=\int_{-\infty}^{\infty} e^{i u x} \frac{1}{2} e^{-|x|} dx \tag{3.20}\] it follows that \[e^{-|u|}=\int_{-\infty}^{\infty} e^{-i u x} \frac{1}{\pi} \frac{1}{1+x^{2}} dx. \tag{3.21}\]
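The Fourier pair (3.21), relating the Cauchy density \(1/\pi(1+x^{2})\) to \(e^{-|u|}\), can be verified numerically. The sketch below (helper name illustrative) exploits the symmetry of the integrand, which makes the imaginary part vanish, and truncates the slowly decaying integral at a large cutoff.

```python
import math

def fourier_cauchy(u, X=2000.0, n=400000):
    # right-hand side of (3.21): int e^{-iux} (1/pi) dx/(1 + x^2); the imaginary
    # part cancels by symmetry, so integrate cos(ux)/(pi (1 + x^2)) over [-X, X]
    h = 2 * X / n
    s = 0.0
    for k in range(n + 1):
        x = -X + k * h
        w = 0.5 if k in (0, n) else 1.0  # trapezoid endpoint weights
        s += w * math.cos(u * x) / (math.pi * (1 + x * x))
    return s * h

# should be close to e^{-1}, as (3.21) asserts
print(abs(fourier_cauchy(1.0) - math.exp(-1.0)))
```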
We note finally the following important formulas concerning sums of independent random variables, convolutions of distribution functions, and products of characteristic functions. Let \(X_{1}\) and \(X_{2}\) be two independent random variables, with respective distribution functions \(F_{X_{1}}(\cdot)\) and \(F_{X_{2}}(\cdot)\) and respective characteristic functions \(\phi_{X_{1}}(\cdot)\) and \(\phi_{X_{2}}(\cdot)\). It may be proved (see section 9 of Chapter 7) that the distribution function of the sum \(X_{1}+X_{2}\) at any real number \(z\) is given by \[F_{X_{1}+X_{2}}(z)=\int_{-\infty}^{\infty} F_{X_{1}}(z-x) dF_{X_{2}}(x). \tag{3.22}\]
On the other hand, it is clear that the characteristic function of the sum for any real number \(u\) is given by \[\phi_{X_{1}+X_{2}}(u)=\phi_{X_{1}}(u) \phi_{X_{2}}(u), \tag{3.23}\] since, by independence of \(X_{1}\) and \(X_{2}, E\left[e^{i u\left(X_{1}+X_{2}\right)}\right]=E\left[e^{i u X_{1}}\right] E\left[e^{i u X_{2}}\right]\) . The distribution function \(F_{X_{1}+X_{2}}(\cdot)\) , given by (3.22), is said to be the convolution of the distribution functions \(F_{X_{1}}(\cdot)\) and \(F_{X_{2}}(\cdot)\) ; in symbols, one writes \(F_{X_{1}+X_{2}}=F_{X_{1}} * F_{X_{2}}\) .
Exercises
3.1. Verify (3.17), (3.19), (3.20), and (3.21).
3.2. Prove that if \(f_{1}(\cdot)\) and \(f_{2}(\cdot)\) are probability density functions, whose corresponding characteristic functions \(\phi_{1}(\cdot)\) and \(\phi_{2}(\cdot)\) are absolutely integrable, then
\[\int_{-\infty}^{\infty} f_{1}(y-x) f_{2}(x) d x=\frac{1}{2 \pi} \int_{-\infty}^{\infty} e^{-i u y} \phi_{1}(u) \phi_{2}(u) d u. \tag{3.24}\]
3.3. Use (3.15), (3.17), and (3.24) to prove that
\[\frac{1}{2 \pi} \int_{-\infty}^{\infty} e^{-i u y}\left(\frac{\sin (u / 2)}{u / 2}\right)^{4} d u=\int_{-\infty}^{\infty} f(y-x) f(x) dx. \tag{3.25}\]
Evaluate the integral on the right-hand side of (3.25).
Answer
\(\frac{2}{3}-y^{2}+\frac{1}{2}|y|^{3}\) for \(|y| \leq 1 ; \frac{4}{3}-2|y|+y^{2}-\frac{1}{6}|y|^{3}\) for \(1 \leq|y| \leq 2; 0\) otherwise.
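The closed-form answer above can be checked numerically against the convolution integral on the right-hand side of (3.25). A minimal sketch, using the triangular density \(f(\cdot)\) of (3.18):

```python
def f(x):
    # triangular density from (3.18)
    return max(0.0, 1.0 - abs(x))

def conv(y, n=20000):
    # (f * f)(y) = int f(y - x) f(x) dx; the integrand vanishes outside [-1, 1]
    h = 2.0 / n
    s = 0.0
    for k in range(n + 1):
        x = -1.0 + k * h
        w = 0.5 if k in (0, n) else 1.0  # trapezoid endpoint weights
        s += w * f(y - x) * f(x)
    return s * h

# compare with the stated answer: 2/3 - y^2 + |y|^3/2 for |y| <= 1
print(abs(conv(0.0) - 2.0 / 3.0))  # small
```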
3.4. Let \(X\) be uniformly distributed over the interval 0 to \(\pi\) . Let \(Y=A \cos X\) . Show directly that the probability density function of \(Y\) for any real number \(y\) is given by \begin{align} f_{Y}(y) & = \begin{cases} \frac{1}{\pi \sqrt{A^{2}-y^{2}}}, & \text{for } |y| < A \tag{3.26} \\ 0, & \text{otherwise.} \end{cases} \end{align} The characteristic function of \(Y\) may be written \[\phi_{Y}(u)=\frac{1}{\pi} \int_{0}^{\pi} e^{i u A \cos \theta} d u=J_{0}(A u), \tag{3.27}\] in which \(J_{0}(\cdot)\) is the Bessel function of order 0, defined for our purposes by the integral in (3.27). Is it true or false that
\begin{align} \frac{1}{2 \pi} \int_{-\infty}^{\infty} e^{-i u y} J_{0}(A u) \, du & = \begin{cases} \frac{1}{\pi \sqrt{A^{2}-y^{2}}}, & \text{if } |y| < A \tag{3.28} \\ 0, & \text{otherwise?} \end{cases} \end{align}
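The integral defining \(J_{0}\) in (3.27) can be checked against the standard power series \(J_{0}(z)=\sum_{k}(-1)^{k}(z/2)^{2k}/(k!)^{2}\). A minimal sketch (helper names illustrative):

```python
import math

def J0_integral(z, n=10000):
    # (3.27): J0(z) = (1/pi) int_0^pi cos(z cos(theta)) d(theta), trapezoid rule
    # (the sine part of e^{i z cos(theta)} integrates to zero over [0, pi])
    h = math.pi / n
    s = 0.0
    for k in range(n + 1):
        w = 0.5 if k in (0, n) else 1.0  # trapezoid endpoint weights
        s += w * math.cos(z * math.cos(k * h))
    return s * h / math.pi

def J0_series(z, terms=30):
    # power series: J0(z) = sum_k (-1)^k (z/2)^{2k} / (k!)^2
    s, t = 0.0, 1.0
    for k in range(terms):
        s += t
        t *= -(z / 2) ** 2 / ((k + 1) ** 2)
    return s

print(abs(J0_integral(2.5) - J0_series(2.5)))  # tiny
```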
3.5. The image interference distribution. The amplitude \(a\) of a signal received at a distance from a transmitter may fluctuate because the signal is both directly received and reflected (reflected either from the ionosphere or the ocean floor, depending on whether it is being transmitted through the air or the ocean). Assume that the amplitude of the direct signal is a constant \(a_{1}\) and the amplitude of the reflected signal is a constant \(a_{2}\) but that the phase difference \(\theta\) between the two signals changes randomly and is uniformly distributed over the interval 0 to \(\pi\) . The amplitude \(a\) of the received signal is then given by \(a^{2}=a_{1}^{2}+a_{2}^{2}+2 a_{1} a_{2} \cos \theta\) . Assuming these facts, show that the characteristic function of \(a^{2}\) is given by
\[\phi_{a^{2}}(u)=e^{i u\left(a_{1}^{2}+a_{2}^{2}\right)} J_{0}\left(2 a_{1} a_{2} u\right). \tag{3.29}\]
Use this result and the preceding exercise to deduce the probability density function of \(a^{2}\) .
Answer
\(\left(\pi^{2}\left[4 a_{1}^{2} a_{2}^{2}-x^{2}\right]\right)^{-1 / 2}\) for \(\left|x-a_{1}^{2}-a_{2}^{2}\right|<2 a_{1} a_{2}; 0\) otherwise.
\(^{1}\) In this section we use the terminology “an absolutely continuous probability law” for what has previously been called in this book “a continuous probability law”. This is to call the reader’s attention to the fact that in advanced probability theory it is customary to use the expression “absolutely continuous” rather than “continuous”. A continuous probability law is then defined as one corresponding to a continuous distribution function.