Each of the independent random variables \(X_1, X_2, \ldots, X_n\) has the probability density function \(\mathrm{f}(x) = \frac{1}{2}\sin x\) for \(0 \leqslant x \leqslant \pi\) (and zero otherwise). Let \(Y\) be the random variable whose value is the maximum of the values of \(X_1, X_2, \ldots, X_n\).
\(A\) and \(B\) both toss the same biased coin. The probability that the coin shows heads is \(p\), where \(0 < p < 1\), and the probability that it shows tails is \(q = 1 - p\). Let \(X\) be the number of times \(A\) tosses the coin until it shows heads. Let \(Y\) be the number of times \(B\) tosses the coin until it shows heads.
A random process generates, independently, \(n\) numbers each of which is drawn from a uniform (rectangular) distribution on the interval 0 to 1. The random variable \(Y_k\) is defined to be the \(k\)th smallest number (so there are \(k-1\) smaller numbers).
An internet tester sends \(n\) e-mails simultaneously at time \(t=0\). Their arrival times at their destinations are independent random variables each having probability density function \(\lambda \e^{-\lambda t}\) (\(0\le t<\infty\), \( \lambda >0\)).
In this question, you may use without proof the results: \[ \sum_{r=1}^n r = \tfrac12 n(n+1) \qquad\text{and}\qquad \sum_{r=1}^n r^2 = \tfrac1 6 n(n+1)(2n+1)\,. \] The independent random variables \(X_1\) and \(X_2\) each take values \(1\), \(2\), \(\ldots\), \(N\), each value being equally likely. The random variable \(X\) is defined by \[ X= \begin{cases} X_1 & \text { if } X_1\ge X_2\\ X_2 & \text { if } X_2\ge X_1\;. \end{cases} \]
Solution: \begin{align*} \P(X = r) &= \P(X_1 = r, X_2 \leq r) + \P(X_2 = r, X_1 < r) \\ &= \P(X_1 = r) \P(X_2 \leq r) + \P(X_2 = r)\P( X_1 < r) \\ &= \frac{1}{N} \cdot \frac{r}{N} + \frac{1}{N} \cdot \frac{r-1}{N} \\ &= \frac{2r-1}{N^2} \end{align*} \begin{align*} \E(X) &= \sum_{r=1}^N r \P(X = r) \\ &= \sum_{r=1}^N \frac{2r^2 - r}{N^2} \\ &= \frac{1}{N^2} \l \frac{N(N+1)(2N+1)}{3} - \frac{N(N+1)}{2} \r \\ &= \frac{N+1}{N} \l \frac{4N-1}{6} \r \end{align*} When \(N = 100\), this is equal to \(\frac{101 \cdot 399}{6 \cdot 100} = \frac{101 \cdot 133}{200} = 67.165\). The median is the smallest \(m\) for which \(\P(X \leq m) \geq \frac12\): \begin{align*} &&\frac12 &\leq \P(X \leq m) \\ &&&=\sum_{r=1}^m \P(X=r) \\ &&&=\sum_{r=1}^m \frac{2r-1}{N^2} \\ &&&= \frac{1}{N^2} \l m(m+1) - m \r \\ &&&= \frac{m^2}{N^2} \\ \Rightarrow && m^2 &\geq \frac{N^2}{2} \\ \Rightarrow && m &\geq \frac{N}{\sqrt{2}} \\ \Rightarrow && m &= \left \lceil \frac{N}{\sqrt{2}} \right \rceil \end{align*} When \(N = 100\), we need \(\lceil 50\sqrt{2} \rceil\). Since \(\sqrt{2} > 1.4\) we have \(50\sqrt{2} > 70\), and since \(\sqrt{2} < 1.42\) we have \(50 \sqrt{2} < 71\); therefore \(\displaystyle \left \lceil \frac{100}{\sqrt{2}} \right \rceil = 71\). For the ratio of mean to median, \begin{align*} \lim_{N \to \infty} \frac{\frac{(N+1)(4N-1)}{6N}}{ \left \lceil\frac{N}{\sqrt{2}} \right \rceil} &= \lim_{N \to \infty} \frac{\sqrt{2}}{3}\l \frac{4N^2 +3N - 1}{2N^2} \r \tag{the ceiling does not affect the limit}\\ &= \lim_{N \to \infty} \frac{\sqrt{2}}{3}\l 2 + \frac{3}{2N} - \frac{1}{2N^2} \r \\ &= \frac{2\sqrt{2}}{3} \end{align*}
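As a sanity check, the values \(67.165\) and \(71\) can be confirmed by simulation. Here is a minimal Python sketch (the trial count of \(200\,000\) is an arbitrary choice):

\begin{verbatim}
import random

N = 100
trials = 200_000
samples = sorted(max(random.randint(1, N), random.randint(1, N))
                 for _ in range(trials))

print(sum(samples) / trials)   # sample mean, close to 67.165
print(samples[trials // 2])    # sample median, should print 71
\end{verbatim}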
Solution: \begin{align*} \P(X \leq 0.8) &= \P(X_1 \leq 0.8,X_2 \leq 0.8,X_3 \leq 0.8) \\ &= 0.8^3 \\ &= 0.512 \end{align*} \begin{align*} && \P(X < c) &= c^3 \\ \Rightarrow && f_X(x) &= 3x^2 \\ \Rightarrow && \E[X] &= \int_0^1 x \cdot (3x^2) \, \d x \\ && &= \left [ \frac{3}{4}x^4 \right]_0^1 \\ &&&= \frac{3}{4} \end{align*} Now \(X\) is distributed as the maximum of \(N\) numbers drawn uniformly from \([0,a]\), and we test \begin{align*} H_0 : {} & a= 1 \\ H_1 : {} & a < 1 \end{align*} Under \(H_0\), \begin{align*} &&\P(X < c) &= c^N \\ &&&= \frac1{20} \\ \Rightarrow && N &= -\frac{\log(20)}{\log(c)} \end{align*} With \(c = 0.8\), we have \begin{align*} N &= \frac{\log(20)}{\log(5/4)} \\ &= \frac{\log(5)+\log(4)}{\log(5)-\log(4)} \\ &= \frac{ \frac{\log(5)}{\log(4)}+1}{\frac{\log(5)}{\log(4)} - 1} \end{align*} Using \(2^{10} \approx 10^{3}\), \begin{align*} && 10\log(2) &\approx 3 (\log(5) + \log(2)) \\ && 7\log(2) &\approx 3 \log(5) \\ && \frac{\log(5)}{2\log(2)} &\approx \frac{7}{6} \end{align*} so that \begin{align*} N \approx \frac{\frac{7}{6} + 1}{\frac{7}{6} -1} = 13\,. \end{align*} Since \(2^{10} > 10^3\), in fact \(\frac{\log(5)}{\log(4)} < \frac{7}{6}\) and so \(N > 13\); therefore \(N=14\) is the value we seek. Finally, \(\P(X < 0.8 \mid a= 0.8) = 1\) and \(\P(X < 0.8 \mid a= 0.9, N=14) = \frac{8^{14}}{9^{14}}\).
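The arithmetic above is easy to confirm numerically; a short Python sketch (using only the standard math module) should reproduce \(0.512\), \(N = 14\) and the final probability:

\begin{verbatim}
import math

print(0.8 ** 3)                                # 0.512

# smallest N with 0.8^N <= 1/20
N = math.ceil(math.log(20) / math.log(5 / 4))
print(N, 0.8 ** N)                             # 14, about 0.044

print((8 / 9) ** 14)                           # about 0.19
\end{verbatim}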
The lifetimes of a large batch of electric light bulbs are independently and identically distributed. The probability that the lifetime, \(T\) hours, of a given light bulb is greater than \(t\) hours is given by \[ \P(T>t) \; = \; \frac{1}{(1+kt)^\alpha}\;, \] where \(\alpha\) and \(k\) are constants, and \(\alpha >1\). Find the median \(M\) and the mean \(m\) of \(T\) in terms of \(\alpha\) and \(k\). Nine randomly selected bulbs are switched on simultaneously and are left until all have failed. The fifth failure occurs at 1000 hours and the mean lifetime of all the bulbs is found to be 2400 hours. Show that \(\alpha\approx2\) and find the approximate value of \(k\). Hence estimate the probability that, if a randomly selected bulb is found to last \(M\) hours, it will last a further \(m-M\) hours.
Solution: The median \(M\) is the value such that \begin{align*} && \frac12 &= \mathbb{P}(T > M) \\ &&&= \frac1{(1+kM)^\alpha} \\ \Rightarrow && 2 &= (1+kM)^{\alpha} \\ \Rightarrow && M &= \frac{2^{1/\alpha}-1}{k} \end{align*} The density of \(T\) is \(f_T(t) = \frac{k \alpha}{(1+kt)^{\alpha+1}}\) and so \begin{align*} && m &= \int_0^\infty t f_T(t) \d t \\ &&&= \int_0^\infty \frac{tk \alpha}{(1+kt)^{\alpha+1}} \d t \\ &&&= \int_0^\infty \frac{\alpha(1+kt)-\alpha}{(1+kt)^{\alpha+1}} \d t \\ &&&= \alpha \int_0^\infty (1+kt)^{-\alpha} \d t - \alpha \int_0^\infty (1+kt)^{-(\alpha+1)} \d t \\ &&&= \alpha \left [ -\frac1{k(\alpha-1)}(1+kt)^{-\alpha+1}\right]_0^\infty- \alpha \left [ -\frac1{k\alpha}(1+kt)^{-\alpha}\right]_0^\infty \\ &&&= \frac{\alpha}{k(\alpha-1)} - \frac{1}{k} \\ &&&= \frac{1}{k(\alpha-1)} \end{align*} The fifth of the nine failures estimates the median, and the sample mean estimates \(m\), so \begin{align*} && \frac{2^{1/\alpha}-1}{k} &= 1000 \\ && \frac{1}{k(\alpha-1)} &= 2400 \\ \Rightarrow && \frac{1}{(\alpha-1)(2^{1/\alpha}-1)} &= 2.4 \end{align*} Trying \(\alpha = 2\): \(\frac{1}{(2-1)(\sqrt2-1)} = \sqrt{2}+1 \approx 2.4\), so \(\alpha \approx 2\), and then \(k = \frac{1}{2400(\alpha-1)} = \frac{1}{2400}\). The required probability is \(\mathbb{P}(T > M + (m-M) \mid T > M)\), and since \(m > M\), \begin{align*} && \mathbb{P}(T > m \mid T > M) &= \frac{\mathbb{P}(T > m)}{\mathbb{P}(T > M)} \\ &&&= \frac{2}{(1+km)^{\alpha}} \\ &&&= \frac{2}{(1 + \frac{1}{\alpha-1})^\alpha} \\ &&&\approx \frac{2}{4} =\frac12 \end{align*}
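With the fitted values \(\alpha = 2\) and \(k = 1/2400\), the figures can be checked numerically; a minimal Python sketch:

\begin{verbatim}
alpha, k = 2, 1 / 2400

M = (2 ** (1 / alpha) - 1) / k   # median: 2400*(sqrt(2)-1), about 994 hours
m = 1 / (k * (alpha - 1))        # mean: 2400 hours

def survival(t):
    # P(T > t) = 1 / (1 + k t)^alpha
    return (1 + k * t) ** (-alpha)

print(M, m)
print(survival(m) / survival(M))  # P(T > m | T > M), exactly 0.5 here
\end{verbatim}

The median comes out near \(994\) rather than exactly \(1000\), reflecting that \(\alpha = 2\) is only an approximate fit.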
A random variable \(X\) is distributed uniformly on \([\, 0\, , \, a\,]\). Show that the variance of \(X\) is \({1 \over 12} a^2\). A sample, \(X_1\) and \(X_2\), of two independent values of the random variable is drawn, and the variance \(V\) of the sample is determined. Show that \(V = {1 \over 4} \l X_1 -X_2 \r ^2\), and hence prove that \(2 V\) is an unbiased estimator of the variance of \(X\). Find an exact expression for the probability that the value of \(V\) is less than \({1 \over 12} a^2\) and estimate the value of this probability correct to one significant figure.
Solution: \begin{align*} && \E[X] &= \frac{a}{2}\tag{by symmetry} \\ &&\E[X^2] &= \int_0^a \frac{1}{a} x^2 \d x \\ &&&= \frac{a^3}{3a} = \frac{a^2}{3} \\ \Rightarrow && \var[X] &= \frac{a^2}{3} - \frac{a^2}{4} = \frac{a^2}{12} \end{align*} \begin{align*} && V &=\frac{1}{2} \left ( \left ( X_1 - \frac{X_1+X_2}{2} \right )^2+\left ( X_2- \frac{X_1+X_2}{2} \right )^2 \right ) \\ &&&= \frac{1}{8} ((X_1 - X_2)^2 + (X_2 - X_1)^2 ) \\ &&&= \frac14 (X_1-X_2)^2 \end{align*} By independence, \(\E[X_1X_2] = \E[X_1]\E[X_2] = \frac{a^2}{4}\), so \begin{align*} && \E[2V] &= \E \left [ \frac12 (X_1 - X_2)^2 \right] \\ &&&= \frac12 \E[X_1^2] - \E[X_1X_2] + \frac12 \E[X_2^2] \\ &&&= \frac{a^2}{3} - \frac{a^2}{4} = \frac{a^2}{12} \end{align*} Therefore \(2V\) is an unbiased estimator of the variance of \(X\). Finally, \(V < \frac1{12}a^2\) exactly when \(|X_1 - X_2| < \frac{a}{\sqrt3}\). Regarding \((X_1, X_2)\) as a point distributed uniformly over the square \([0,a]^2\), the complementary region \(|x_1 - x_2| \geq \frac{a}{\sqrt3}\) consists of two right-angled triangles of total area \(\l a - \frac{a}{\sqrt3} \r^2\), so \[ \P\l V < \tfrac1{12}a^2 \r = 1 - \l 1 - \tfrac1{\sqrt3} \r^2 = \frac{2}{\sqrt3} - \frac13 = \frac{2\sqrt3-1}{3} \approx 0.8\,. \]
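Both the unbiasedness of \(2V\) and the final probability can be checked by simulation; a minimal Python sketch (with \(a = 1\) and an arbitrary trial count):

\begin{verbatim}
import random

a, trials = 1.0, 200_000
sum_2v, count = 0.0, 0
for _ in range(trials):
    x1, x2 = random.uniform(0, a), random.uniform(0, a)
    v = (x1 - x2) ** 2 / 4       # sample variance of (x1, x2)
    sum_2v += 2 * v
    count += v < a * a / 12

print(sum_2v / trials)   # close to a^2/12 = 0.0833...
print(count / trials)    # close to (2*sqrt(3) - 1)/3 = 0.821...
\end{verbatim}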
The random variables \(X_1\), \(X_2\), \(\ldots\) , \(X_{2n+1}\) are independently and uniformly distributed on the interval \(0 \le x \le 1\). The random variable \(Y\) is defined to be the median of \(X_1\), \(X_2\), \(\ldots\) , \(X_{2n+1}\). Given that the probability density function of \(Y\) is \(\mathrm{g}(y)\), where \[ \mathrm{g}(y)=\begin{cases} ky^{n}(1-y)^{n} & \mbox{ if }0\leqslant y\leqslant1\\ 0 & \mbox{ otherwise} \end{cases} \] use the result $$ \int_0^1 {y^{r}}{{(1-y)}^{s}}\,\d y = \frac{r!s!}{(r+s+1)!} $$ to show that \(k={(2n+1)!}/{{(n!)}^2}\), and evaluate \(\E(Y)\) and \({\rm Var}\,(Y)\). Hence show that, for any given positive number \(d\), the inequality $$ {\P\left({\vert {Y - 1/2} \vert} < {d/{\sqrt {n}}} \right)} < {\P\left({\vert {{\bar X} - 1/2} \vert} < {d/{\sqrt {n}}} \right)} $$ holds provided \(n\) is large enough, where \({\bar X}\) is the mean of \(X_1\), \(X_2\), \(\ldots\) , \(X_{2n+1}\). [You may assume that \(Y\) and \(\bar X\) are normally distributed for large \(n\).]
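No solution is worked here, but the final inequality can be illustrated by a rough Monte Carlo sketch in Python (the choices \(n = 50\), \(d = 0.2\) and the trial count are arbitrary): the sample median \(Y\) should land within \(d/\sqrt{n}\) of \(\frac12\) less often than the sample mean \(\bar X\).

\begin{verbatim}
import random

n, d, trials = 50, 0.2, 20_000
hits_y = hits_xbar = 0
for _ in range(trials):
    xs = sorted(random.random() for _ in range(2 * n + 1))
    y = xs[n]                       # sample median
    xbar = sum(xs) / (2 * n + 1)    # sample mean
    hits_y += abs(y - 0.5) < d / n ** 0.5
    hits_xbar += abs(xbar - 0.5) < d / n ** 0.5

print(hits_y / trials, hits_xbar / trials)   # first value smaller
\end{verbatim}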
A set of \(n\) dice is rolled repeatedly. For each die the probability of showing a six is \(p\). Show that the probability that the first of the dice to show a six does so on the \(r\)th roll is $$q^{n r } ( q^{-n} - 1 )$$ where \(q = 1 - p\). Determine, and simplify, an expression for the probability generating function for this distribution, in terms of \(q\) and \(n\). The first of the dice to show a six does so on the \(R\)th roll. Find the expected value of \(R\) and show that, in the case \(n = 2\), \(p=1/6\), this value is \(36/11\). Show that the probability that the last of the dice to show a six does so on the \(r\)th roll is \[ \big(1-q^r\big)^n-\big(1-q^{r-1}\big)^n. \] Find, for the case \(n = 2\), the probability generating function. The last of the dice to show a six does so on the \(S\)th roll. Find the expected value of \(S\) and evaluate this when \(p=1/6\).
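No solution is worked here, but the quoted value \(36/11\) is easy to check by simulation; a minimal Python sketch for the case \(n = 2\), \(p = 1/6\):

\begin{verbatim}
import random

p, trials = 1 / 6, 200_000

def rolls_until_six():
    # rolls of a single die up to and including its first six
    r = 1
    while random.random() >= p:
        r += 1
    return r

total_r = total_s = 0
for _ in range(trials):
    a, b = rolls_until_six(), rolls_until_six()
    total_r += min(a, b)   # roll on which the first die shows a six
    total_s += max(a, b)   # roll on which the last die shows a six

print(total_r / trials)    # close to 36/11 = 3.2727...
print(total_s / trials)    # Monte Carlo estimate of E(S)
\end{verbatim}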
A random variable \(X\) has the probability density function \[ \mathrm{f}(x)=\begin{cases} \lambda\mathrm{e}^{-\lambda x} & x\geqslant0,\\ 0 & x<0. \end{cases} \] Show that $${\rm P}(X>s+t\,\vert X>t) = {\rm P}(X>s).$$ The time it takes an assistant to serve a customer in a certain shop is a random variable with the above distribution and the times for different customers are independent. If, when I enter the shop, the only two assistants are serving one customer each, what is the probability that these customers are both still being served at time \(t\) after I arrive? One of the assistants finishes serving his customer and immediately starts serving me. What is the probability that I am still being served when the other customer has finished being served?
Solution: \begin{align*} && \mathbb{P}(X > t) &= \int_t^{\infty} \lambda e^{-\lambda x} \d x\\ &&&= \left[ -e^{-\lambda x} \right]_t^\infty \\ &&&= e^{-\lambda t}\\ \\ && \mathbb{P}(X > s + t \mid X > t) &= \frac{\mathbb{P}(X > s + t)}{\mathbb{P}(X > t)} \\ &&&= \frac{e^{-(s+t)\lambda}}{e^{-t\lambda}} \\ &&&= e^{-s\lambda} = \mathbb{P}(X > s) \end{align*} Since the two service times are independent, the probability that both customers are still being served at time \(t\) is \(\mathbb{P}(X > t)^2 = e^{-2\lambda t}\). For the final part, the property proved above shows that the distribution is memoryless: when my service begins, the remaining service time of the other customer has the same distribution as my full service time. The two times are independent and identically distributed, so each is equally likely to finish first, and the required probability is exactly \(\frac12\).
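The answer \(\frac12\) can also be seen by simulation; a minimal Python sketch using the memoryless property (the other customer's residual time is drawn afresh from the same exponential distribution):

\begin{verbatim}
import random

lam, trials = 1.0, 200_000
still_served = 0
for _ in range(trials):
    other = random.expovariate(lam)   # residual time of the other customer
    mine = random.expovariate(lam)    # my full service time
    still_served += mine > other

print(still_served / trials)          # close to 0.5
\end{verbatim}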
A hostile naval power possesses a large, unknown number \(N\) of submarines. Interception of radio signals yields a small number \(n\) of their identification numbers \(X_i\) (\(i=1,2,...,n\)), which are taken to be independent and uniformly distributed over the continuous range from \(0\) to \(N\). Show that \(Z_1\) and \(Z_2\), defined by $$ Z_1 = {n+1\over n} {\max}\{X_1,X_2,...,X_n\} \hspace{0.3in} {\rm and} \hspace{0.3in} Z_2 = {2\over n} \sum_{i=1}^n X_i \;, $$ both have means equal to \(N\). Calculate the variance of \(Z_1\) and of \(Z_2\). Which estimator do you prefer, and why?
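No solution is worked here, but a quick Monte Carlo sketch in Python (with assumed values \(N = 1000\), \(n = 10\)) illustrates the comparison: both estimators come out unbiased, while the maximum-based estimator \(Z_1\) shows a markedly smaller variance.

\begin{verbatim}
import random

N_true, n, trials = 1000.0, 10, 50_000
z1, z2 = [], []
for _ in range(trials):
    xs = [random.uniform(0, N_true) for _ in range(n)]
    z1.append((n + 1) / n * max(xs))   # scaled maximum
    z2.append(2 / n * sum(xs))         # twice the sample mean

def mean_var(vals):
    m = sum(vals) / len(vals)
    return m, sum((v - m) ** 2 for v in vals) / len(vals)

print(mean_var(z1))   # mean near 1000, variance near 8300
print(mean_var(z2))   # mean near 1000, variance near 33000
\end{verbatim}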
An examiner has to assign a mark between 1 and \(m\) inclusive to each of \(n\) examination scripts (\(n\leqslant m\)). He does this randomly, but never assigns the same mark twice. If \(K\) is the highest mark that he assigns, explain why \[ \mathrm{P}(K=k)=\left.\binom{k-1}{n-1}\right/\binom{m}{n} \] for \(n\leqslant k\leqslant m,\) and deduce that \[ \sum_{k=n}^{m}\binom{k-1}{n-1}=\binom{m}{n}\,. \] Find the expected value of \(K\).
Solution: If the highest mark is \(k\), then the \(n-1\) remaining marks must be chosen from the numbers \(1, 2, \ldots, k-1\), which can be done in \(\binom{k-1}{n-1}\) ways. In total, the \(n\) marks are chosen from \(1, 2, \ldots, m\), giving \(\binom{m}{n}\) equally likely possibilities, and therefore \(\displaystyle \mathbb{P}(K=k) = \left.\binom{k-1}{n-1} \right/ \binom{m}{n}\). Since \(K\) must take one of the values \(n, \ldots, m\), we have \begin{align*} && 1 &= \sum_{k=n}^m \mathbb{P}(K=k) \\ &&&= \sum_{k=n}^m \left.\binom{k-1}{n-1} \right/ \binom{m}{n} \\ \Rightarrow && \binom{m}{n} &= \sum_{k=n}^m \binom{k-1}{n-1} \end{align*} Using \(\frac{k}{n}\binom{k-1}{n-1} = \binom{k}{n}\) and then the identity just deduced with \(m\), \(n\) replaced by \(m+1\), \(n+1\): \begin{align*} && \mathbb{E}(K) &= \sum_{k=n}^m k \cdot \mathbb{P}(K=k) \\ &&&= \sum_{k=n}^m k \cdot \left.\binom{k-1}{n-1} \right/ \binom{m}{n} \\ &&&= n\binom{m}{n}^{-1} \sum_{k=n}^m \binom{k}{n} \\ &&&= n\binom{m}{n}^{-1} \sum_{k=n+1}^{m+1} \binom{k-1}{n} \\ &&&= n\binom{m}{n}^{-1} \binom{m+1}{n+1} \\ &&&= \frac{n(m+1)}{n+1} \end{align*}
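For a small case the result can be verified by exhaustive enumeration; a minimal Python sketch (with assumed values \(m = 8\), \(n = 3\)):

\begin{verbatim}
from fractions import Fraction
from itertools import combinations

m, n = 8, 3
# every equally likely set of n distinct marks, and its maximum
tops = [max(c) for c in combinations(range(1, m + 1), n)]

print(Fraction(sum(tops), len(tops)))   # E(K) by enumeration: 27/4
print(Fraction(n * (m + 1), n + 1))     # formula n(m+1)/(n+1): 27/4
\end{verbatim}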