Let \(\lambda > 0\). The independent random variables \(X_1, X_2, \ldots, X_n\) all have probability density function
$$f(t) = \begin{cases} \lambda e^{-\lambda t} & t \geq 0 \\ 0 & t < 0 \end{cases}$$
and cumulative distribution function \(F(x)\).
The value of random variable \(Y\) is the largest of the values \(X_1, X_2, \ldots, X_n\).
Show that the cumulative distribution function of \(Y\) is given, for \(y \geq 0\), by
$$G(y) = (1 - e^{-\lambda y})^n$$
The values \(L(\alpha)\) and \(U(\alpha)\), where \(0 < \alpha \leq \frac{1}{2}\), are such that
$$P(Y < L(\alpha)) = \alpha \quad\text{and}\quad P(Y > U(\alpha)) = \alpha$$
Show that
$$L(\alpha) = -\frac{1}{\lambda}\ln(1 - \alpha^{1/n})$$
and write down a similar expression for \(U(\alpha)\).
Use the approximation \(e^t \approx 1 + t\), for \(|t|\) small, to show that, for sufficiently large \(n\),
$$\lambda L(\alpha) \approx \ln(n) - \ln\left(\ln\left(\frac{1}{\alpha}\right)\right)$$
Hence show that the median of \(Y\) tends to infinity as \(n\) increases, but that the width \(U(\alpha) - L(\alpha)\) of the interval tends to a value which is independent of \(n\).
You are given that, for \(|t|\) small, \(\ln(1 + t) \approx t\) and that \(e^3 \approx 20\).
Show that, for sufficiently large \(n\), there is an interval of width approximately \(4\lambda^{-1}\) in which \(Y\) lies with probability \(0.9\).
Solution:
Note that, for \(y \geq 0\), \(\displaystyle F(y) = \mathbb{P}(X_i < y) = \int_0^y \lambda e^{-\lambda t} \,\mathrm{d}t = 1-e^{-\lambda y}\).
Notice also that
\begin{align*}
G(y) &= \mathbb{P}(Y < y) \\
&= \mathbb{P}(\max_i(X_i) < y) \\
&= \mathbb{P}(X_i < y \text{ for all }i) \\
&= \prod_{i=1}^n \mathbb{P}(X_i < y) \\
&= \prod_{i=1}^n (1-e^{-\lambda y})\\
&= (1-e^{-\lambda y})^n
\end{align*}
as required.
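These closed forms can be sanity-checked numerically (this is not part of the required argument). The values \(\lambda = 1\), \(n = 100\) and \(\alpha = 0.05\) below are arbitrary choices, and `G`, `L`, `U` are names of convenience for the CDF of \(Y\) and its lower and upper quantiles:

```python
import math

lam, n = 1.0, 100  # arbitrary check values, not from the problem

def G(y):
    """CDF of Y = max of n iid Exp(lam) variables: (1 - e^{-lam*y})^n."""
    return (1.0 - math.exp(-lam * y)) ** n

def L(alpha):
    """Lower quantile: solves G(L) = alpha."""
    return -math.log(1.0 - alpha ** (1.0 / n)) / lam

def U(alpha):
    """Upper quantile: solves G(U) = 1 - alpha, i.e. P(Y > U) = alpha."""
    return -math.log(1.0 - (1.0 - alpha) ** (1.0 / n)) / lam

alpha = 0.05
# G(L(alpha)) should be alpha, and G(U(alpha)) should be 1 - alpha.
print(abs(G(L(alpha)) - alpha))        # small
print(abs(G(U(alpha)) - (1 - alpha)))  # small
# The approximation lam*L(alpha) ~ ln(n) - ln(ln(1/alpha)):
print(lam * L(alpha), math.log(n) - math.log(math.log(1 / alpha)))
# The width U - L is close to 4/lam for alpha = 0.05 (probability-0.9 interval).
print(lam * (U(alpha) - L(alpha)))
```

With \(\alpha = 0.05\), the interval \((L, U)\) contains \(Y\) with probability \(0.9\), and its scaled width comes out close to \(4\), matching the \(e^3 \approx 20\) estimate.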
Show that, for any functions \(f\) and \(g\), and for any \(m \geq 0\),
$$\sum_{r=1}^{m+1} f(r)\sum_{s=r-1}^m g(s) = \sum_{s=0}^m g(s)\sum_{r=1}^{s+1} f(r)$$
The random variables \(X_0, X_1, X_2, \ldots\) are defined as follows:
\(X_0\) takes the value \(0\) with probability \(1\);
\(X_{n+1}\) takes the values \(0, 1, \ldots, X_n + 1\) with equal probability, for \(n = 0, 1, \ldots\).
Write down \(E(X_1)\).
Find \(P(X_2 = 0)\) and \(P(X_2 = 1)\) and show that \(P(X_2 = 2) = \frac{1}{6}\).
Hence calculate \(E(X_2)\).
For \(n \geq 1\), show that
$$P(X_n = 0) = \sum_{s=0}^{n-1} \frac{P(X_{n-1} = s)}{s+2}$$
and find a similar expression for \(P(X_n = r)\), for \(r = 1, 2, \ldots, n\).
Hence show that \(E(X_n) = \frac{1}{2}(1 + E(X_{n-1}))\).
Find an expression for \(E(X_n)\) in terms of \(n\), for \(n = 1, 2, \ldots\)
Solution:
\begin{align*}
\sum_{r=1}^{m+1} \left (f(r) \sum_{s=r-1}^m g(s) \right) &= \sum_{r=1}^{m+1} \sum_{s=r-1}^m f(r)g(s) \\
&= \sum_{(r,s) \in \{(r,s) : 1 \leq r \leq m+1, 0 \leq s \leq m, s \geq r-1\}} f(r)g(s) \\
&= \sum_{(r,s) \in \{(r,s) : 0 \leq s \leq m, 1 \leq r \leq m+1, r \leq s+1\}} f(r)g(s) \\
&= \sum_{s=0}^m \sum_{r=1}^{s+1} f(r)g(s) \\
&= \sum_{s=0}^m \left ( g(s) \sum_{r=1}^{s+1} f(r) \right)
\end{align*}
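The index swap above can be spot-checked numerically; the quadratic `f` and affine `g` below are arbitrary test functions, not taken from the problem:

```python
def f(r):
    # arbitrary test function
    return r * r + 1

def g(s):
    # arbitrary test function
    return 2 * s - 3

# Both orders of summation range over the same pairs (r, s) with
# 1 <= r <= m+1, 0 <= s <= m and r <= s+1, so the totals must agree.
for m in range(6):
    lhs = sum(f(r) * sum(g(s) for s in range(r - 1, m + 1)) for r in range(1, m + 2))
    rhs = sum(g(s) * sum(f(r) for r in range(1, s + 2)) for s in range(m + 1))
    assert lhs == rhs
print("index swap verified for m = 0..5")
```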
\(X_1\) takes the values \(0, 1\) with equal probability (since \(X_0 = 0\), the possible values are \(0, 1, \ldots, X_0 + 1 = 1\)). Therefore \(\mathbb{E}(X_1) = \frac12\).
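The remaining probabilities and expectations can be checked exactly by propagating the full distribution of \(X_n\) with rational arithmetic. This is a minimal sketch using only the definitions above; `dist_next` is a name of my choosing:

```python
from fractions import Fraction

def dist_next(dist):
    """Given dist[k] = P(X_n = k), return the distribution of X_{n+1}:
    conditional on X_n = k, X_{n+1} is uniform on {0, 1, ..., k+1}."""
    new = [Fraction(0)] * (len(dist) + 1)
    for k, p in enumerate(dist):
        for j in range(k + 2):
            new[j] += p / (k + 2)
    return new

dist = [Fraction(1)]  # X_0 = 0 with probability 1
for n in range(1, 6):
    dist = dist_next(dist)
    mean = sum(k * p for k, p in enumerate(dist))
    print(n, dist, mean)  # mean should be 1 - 2^{-n}
```

For \(n = 2\) this gives \(P(X_2 = 0) = P(X_2 = 1) = \frac{5}{12}\), \(P(X_2 = 2) = \frac16\) and \(\mathbb{E}(X_2) = \frac34\), consistent with the recurrence \(\mathbb{E}(X_n) = \frac12(1 + \mathbb{E}(X_{n-1}))\) and the closed form \(\mathbb{E}(X_n) = 1 - 2^{-n}\).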