Problems


1996 Paper 2 Q12
D: 1600.0 B: 1500.0

  1. Let \(X_{1}, X_{2}, \dots, X_{n}\) be independent random variables each of which is uniformly distributed on \([0,1]\). Let \(Y\) be the largest of \(X_{1}, X_{2}, \dots, X_{n}\). By using the fact that \(Y<\lambda\) if and only if \(X_{j}<\lambda\) for \(1\leqslant j\leqslant n\), find the probability density function of \(Y\). Show that the variance of \(Y\) is \[\frac{n}{(n+2)(n+1)^{2}}.\]
  2. The probability that a neon light switched on at time \(0\) will have failed by a time \(t>0\) is \(1-\mathrm{e}^{-t/\lambda}\) where \(\lambda>0\). I switch on \(n\) independent neon lights at time zero. Show that the expected time until the first failure is \(\lambda/n\).


Solution:

  1. \(\,\) \begin{align*} && F_Y(\lambda) &= \mathbb{P}(Y < \lambda) \\ &&&= \prod_i \mathbb{P}(X_i < \lambda) \\ &&&= \lambda^n \\ \Rightarrow && f_Y(\lambda) &= \begin{cases} n \lambda^{n-1} & \text{if } 0 \leq \lambda \leq 1 \\ 0 & \text{otherwise} \end{cases} \\ \\ && \E[Y] &= \int_0^1 \lambda f_Y(\lambda) \d \lambda \\ &&&= \int_0^1 n \lambda^n \d \lambda \\ &&&= \frac{n}{n+1} \\ && \E[Y^2] &= \int_0^1 \lambda^2 f_Y(\lambda) \d \lambda \\ &&&= \int_0^1 n \lambda^{n+1} \d \lambda \\ &&&= \frac{n}{n+2} \\ \Rightarrow && \var[Y] &= \E[Y^2]-(\E[Y])^2 \\ &&&= \frac{n}{n+2} - \frac{n^2}{(n+1)^2} \\ &&&= \frac{(n+1)^2n-n^2(n+2)}{(n+2)(n+1)^2} \\ &&&= \frac{n[(n^2+2n+1)-(n^2+2n)]}{(n+2)(n+1)^2} \\ &&&= \frac{n}{(n+2)(n+1)^2} \end{align*}
  2. Using the same reasoning, let \(Z\) be the time of the first failure; then \begin{align*} && 1-F_Z(t) &= \mathbb{P}(\text{all lights still on after } t) \\ &&&= \prod_i e^{-t/\lambda} \\ &&&= e^{-nt/\lambda} \\ \\ \Rightarrow && F_Z(t) &= 1-e^{-nt/\lambda} \end{align*} Therefore \(Z \sim \textrm{Exp}(\frac{n}{\lambda})\) and the expected time until the first failure is \(\lambda/n\).
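Both results lend themselves to a quick Monte Carlo sanity check. The sketch below is not part of the exam solution; the values of \(n\), \(\lambda\) and the trial count are arbitrary choices.

```python
import random
import statistics

random.seed(0)
n, lam, trials = 5, 2.0, 200_000

# Part 1: Y = max of n independent U[0,1] samples
ys = [max(random.random() for _ in range(n)) for _ in range(trials)]
var_y = statistics.pvariance(ys)
var_theory = n / ((n + 2) * (n + 1) ** 2)   # n / ((n+2)(n+1)^2)

# Part 2: first failure among n lights, each lifetime exponential with mean lam
firsts = [min(random.expovariate(1 / lam) for _ in range(n)) for _ in range(trials)]
mean_first = statistics.fmean(firsts)       # theory: lam / n
```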

1995 Paper 1 Q13
D: 1500.0 B: 1484.0

A scientist is checking a sequence of microscope slides for cancerous cells, marking each cancerous cell that she detects with a red dye. The number of cancerous cells on a slide is random and has a Poisson distribution with mean \(\mu.\) The probability that the scientist spots any one cancerous cell is \(p\), and is independent of the probability that she spots any other one.

  1. Show that the number of cancerous cells which she marks on a single slide has a Poisson distribution of mean \(p\mu.\)
  2. Show that the probability \(Q\) that the second cancerous cell which she marks is on the \(k\)th slide is given by \[ Q=\mathrm{e}^{-\mu p(k-1)}\left\{ (1+k\mu p)(1-\mathrm{e}^{-\mu p})-\mu p\right\} . \]
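Part 1 is the thinning property of the Poisson distribution, and it can be checked numerically. This simulation is an illustration only, not a proof; \(\mu\), \(p\) and the trial count are arbitrary choices, and the marked count should have mean and variance both equal to \(p\mu\).

```python
import math
import random
import statistics

random.seed(1)
mu, p, trials = 3.0, 0.4, 200_000

def poisson(mean: float) -> int:
    """Sample a Poisson variate by Knuth's product-of-uniforms method."""
    limit = math.exp(-mean)
    k, prod = 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

marked = []
for _ in range(trials):
    cells = poisson(mu)  # cancerous cells on the slide
    # each cell is spotted (marked) independently with probability p
    marked.append(sum(random.random() < p for _ in range(cells)))

mean_marked = statistics.fmean(marked)      # theory: p * mu
var_marked = statistics.pvariance(marked)   # theory: p * mu (Poisson)
```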

1994 Paper 1 Q14
D: 1500.0 B: 1532.7

Each of my \(n\) students has to hand in an essay to me. Let \(T_{i}\) be the time at which the \(i\)th essay is handed in and suppose that \(T_{1},T_{2},\ldots,T_{n}\) are independent, each with probability density function \(\lambda\mathrm{e}^{-\lambda t}\) (\(t\geqslant0\)). Let \(T\) be the time I receive the first essay to be handed in and let \(U\) be the time I receive the last one.

  1. Find the mean and variance of \(T_{i}.\)
  2. Show that \(\mathrm{P}(U\leqslant u)=(1-\mathrm{e}^{-\lambda u})^{n}\) for \(u\geqslant0,\) and hence find the probability density function of \(U\).
  3. Obtain \(\mathrm{P}(T>t),\) and hence find the probability density function of \(T\).
  4. Write down the mean and variance of \(T\).


Solution:

  1. \(T_i \sim \textrm{Exp}(\lambda)\) so \(\E[T_i] = \lambda^{-1}, \var[T_i] = \lambda^{-2}\)
  2. \(\,\) \begin{align*} && \mathbb{P}(U \leq u) &= \mathbb{P}(T_i \leq u\quad \forall i) \\ &&&= \prod \mathbb{P}(T_i \leq u) \\ &&&= \prod \int_0^u \lambda e^{-\lambda t} \d t \\ &&&= (1-e^{-\lambda u})^n \\ \\ \Rightarrow && f_U(u) &= n\lambda e^{-\lambda u}(1-e^{-\lambda u})^{n-1} \end{align*}
  3. \(\,\) \begin{align*} && \mathbb{P}(T > t) &= \mathbb{P}(T_i > t \quad \forall i) \\ &&&= \prod \mathbb{P}(T_i > t) \\ &&&= e^{-n\lambda t} \\ \Rightarrow && f_T(t) &= n\lambda e^{-n\lambda t} \end{align*}
  4. Therefore \(\E[T] = \frac{1}{n\lambda}, \var[T] = \frac{1}{(n\lambda)^2}\)
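The distributions of \(T\) and \(U\) above can be sanity-checked by simulation. This is a sketch with arbitrary parameter choices (\(n\), \(\lambda\), and the evaluation point \(u_0\) are not from the problem).

```python
import math
import random
import statistics

random.seed(5)
n, lam, u0, trials = 4, 1.5, 1.0, 200_000

firsts, last_by_u0 = [], 0
for _ in range(trials):
    times = [random.expovariate(lam) for _ in range(n)]  # essay hand-in times
    firsts.append(min(times))                            # T, first essay
    last_by_u0 += max(times) <= u0                       # U, last essay

mean_T = statistics.fmean(firsts)       # theory: 1/(n*lam)
var_T = statistics.pvariance(firsts)    # theory: 1/(n*lam)^2
cdf_U = last_by_u0 / trials             # theory: (1 - exp(-lam*u0))^n
```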

1994 Paper 3 Q13
D: 1700.0 B: 1484.0

During his performance a trapeze artist is supported by two identical ropes, either of which can bear his weight. Each rope is such that the time, in hours of performance, before it fails is exponentially distributed, independently of the other, with probability density function \(\lambda\exp(-\lambda t)\) for \(t\geqslant0\) (and 0 for \(t < 0\)), for some \(\lambda > 0.\) A particular rope has already been in use for \(t_{0}\) hours of performance. Find the distribution for the length of time the artist can continue to use it before it fails. Interpret and comment upon your result. Before going on tour the artist insists that the management purchase two new ropes of the above type. Show that the probability density function of the time until both ropes fail is \[ \mathrm{f}(t)=\begin{cases} 2\lambda\mathrm{e}^{-\lambda t}(1-\mathrm{e}^{-\lambda t}) & \text{ if }t\geqslant0,\\ 0 & \text{ otherwise.} \end{cases} \] If each performance lasts for \(h\) hours, find the probability that both ropes fail during the \(n\)th performance. Show that the probability that both ropes fail during the same performance is \(\tanh(\lambda h/2)\).


Solution: Let \(T\) be the time at which the rope fails. By the memoryless property of the exponential distribution, the remaining lifetime has the same distribution as at \(t = 0\): for \(t \geq t_0\), \begin{align*} && \mathbb{P}(T > t | T > t_0) &= \frac{\mathbb{P}(T > t)}{\mathbb{P}(T > t_0)} \\ &&&= \frac{e^{-\lambda t}}{e^{-\lambda t_0}} \\ &&&= e^{-\lambda(t-t_0)} \end{align*} This means that each rope (as long as it hasn't broken) can be considered "as good as new". Suppose \(T_1, T_2 \sim \textrm{Exp}(\lambda)\) are the times to failure of the two ropes; then \begin{align*} && \mathbb{P}(\max(T_1, T_2) < t) &= \mathbb{P}(T_1 < t, T_2 < t) \\ &&&= (1-e^{-\lambda t})^2 \\ \Rightarrow && f(t) &= 2(1-e^{-\lambda t}) \cdot (\lambda e^{-\lambda t}) \\ &&&= 2\lambda e^{-\lambda t}(1-e^{-\lambda t}) \end{align*} which is the pdf required. \begin{align*} && \mathbb{P}(\text{both fail during the }n\text{th}) &= \left ( \int_{(n-1)h}^{nh} \lambda e^{-\lambda t} \d t \right)^2 \\ &&&=\left (\left [ -e^{-\lambda t}\right]_{(n-1)h}^{nh} \right)^2 \\ &&&= \left ( e^{-\lambda (n-1)h}( 1-e^{-\lambda h}) \right)^2 \\ &&&= e^{-2(n-1)h\lambda}(1-e^{-\lambda h})^2 \\ \\ && \mathbb{P}(\text{both fail in same performance}) &= \sum_{n=1}^{\infty} \mathbb{P}(\text{both fail during the }n\text{th}) \\ &&&= \sum_{n=1}^{\infty}e^{-2(n-1)h\lambda}(1-e^{-\lambda h})^2 \\ &&&= (1-e^{-\lambda h})^2 \frac{1}{1-e^{-2h\lambda}} \\ &&&= \frac{1-e^{-\lambda h}}{1+e^{-h\lambda}} \\ &&&= \tanh(\lambda h/2) \end{align*}
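The final identity can be checked by simulation: a performance lasts \(h\) hours, so the rope failing at time \(t\) fails during performance number \(\lfloor t/h \rfloor\) (0-indexed). The parameter values below are arbitrary choices, not from the problem.

```python
import math
import random

random.seed(2)
lam, h, trials = 0.8, 1.5, 200_000

same = 0
for _ in range(trials):
    t1 = random.expovariate(lam)   # failure time of rope 1
    t2 = random.expovariate(lam)   # failure time of rope 2
    # both fail in the same performance iff they fall in the same interval of length h
    if int(t1 // h) == int(t2 // h):
        same += 1

p_same = same / trials
p_theory = math.tanh(lam * h / 2)
```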

1990 Paper 2 Q15
D: 1600.0 B: 1500.0

A target consists of a disc of unit radius and centre \(O\). A certain marksman never misses the target, and the probability of any given shot hitting the target within a distance \(t\) from \(O\) is \(t^{2}\), where \(0\leqslant t\leqslant1\). The marksman fires \(n\) shots independently. The random variable \(Y\) is the radius of the smallest circle, with centre \(O\), which encloses all the shots. Show that the probability density function of \(Y\) is \(2ny^{2n-1}\) and find the expected area of the circle. The shot which is furthest from \(O\) is rejected. Show that the expected area of the smallest circle, with centre \(O\), which encloses the remaining \((n-1)\) shots is \[ \left(\frac{n-1}{n+1}\right)\pi. \]


Solution: Another way to describe \(Y\) is as the maximum distance of any shot from \(O\). Let \(X_i\), \(1 \leq i \leq n\), be the distances of the \(n\) shots from \(O\); then \begin{align*} F_Y(y) &= \mathbb{P}(Y \leq y) \\ &= \mathbb{P}(X_i \leq y \text{ for all } i) \\ &= \prod_{i=1}^n \mathbb{P}(X_i \leq y) \tag{each shot independent}\\ &= \prod_{i=1}^n y^2\\ &= y^{2n} \end{align*} Therefore \(f_Y(y) = \frac{\d}{\d y} (y^{2n}) = 2n y^{2n-1}\). \begin{align*} \mathbb{E}(\pi Y^2) &= \int_0^1\pi y^2 f_Y(y) \d y \\ &=\pi \int_0^1 2n y^{2n+1} \d y \\ &=\left ( \frac{n}{n+1} \right )\pi \end{align*} Let \(Z\) be the distance of the second-furthest shot; then: \begin{align*} && F_Z(z) &= \mathbb{P}(Z \leq z) \\ &&&= \mathbb{P}(X_i \leq z \text{ for at least } n - 1\text{ different } i) \\ &&&= n\mathbb{P}(X_i \leq z \text{ for all but exactly 1}) + \mathbb{P}(X_i \leq z \text{ for all } i) \\ &&&= n \left ( \prod_{i=1}^{n-1} \mathbb{P}(X_i \leq z) \right) \mathbb{P}(X_n > z) + z^{2n} \\ &&&= nz^{2n-2}(1-z^2) + z^{2n} \\ &&&= nz^{2n-2} -(n-1)z^{2n} \\ \Rightarrow && f_Z(z) &= n(2n-2)z^{2n-3}-2n(n-1)z^{2n-1} \\ \Rightarrow && \mathbb{E}(\pi Z^2) &= \int_0^1 \pi z^2 \left (n(2n-2)z^{2n-3}-2n(n-1)z^{2n-1} \right) \d z \\ &&&= \pi \left ( \frac{n(2n-2)}{2n} - \frac{2n(n-1)}{2n+2}\right) \\ &&&= \left ( \frac{n-1}{n+1} \right) \pi \end{align*}
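Both expected areas can be sanity-checked by simulation. Since \(\mathbb{P}(\text{distance} \leq t) = t^2\), a shot's distance can be sampled as \(\sqrt{U}\) for \(U \sim U[0,1]\) (inverse-CDF sampling). The value of \(n\) below is an arbitrary choice.

```python
import math
import random
import statistics

random.seed(3)
n, trials = 6, 200_000

areas_all, areas_rest = [], []
for _ in range(trials):
    # P(distance <= t) = t^2, so distance = sqrt(U) for U ~ U[0,1]
    dists = sorted(math.sqrt(random.random()) for _ in range(n))
    areas_all.append(math.pi * dists[-1] ** 2)   # circle enclosing all n shots
    areas_rest.append(math.pi * dists[-2] ** 2)  # furthest shot rejected

mean_all = statistics.fmean(areas_all)    # theory: (n/(n+1)) * pi
mean_rest = statistics.fmean(areas_rest)  # theory: ((n-1)/(n+1)) * pi
```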

1989 Paper 3 Q15
D: 1700.0 B: 1503.8

The continuous random variable \(X\) is uniformly distributed over the interval \([-c,c].\) Write down expressions for the probabilities that:

  1. \(n\) independently selected values of \(X\) are all greater than \(k\),
  2. \(n\) independently selected values of \(X\) are all less than \(k\),
where \(k\) lies in \([-c,c]\). A sample of \(2n+1\) values of \(X\) is selected at random and \(Z\) is the median of the sample. Show that \(Z\) is distributed over \([-c,c]\) with probability density function \[ \frac{(2n+1)!}{(n!)^{2}(2c)^{2n+1}}(c^{2}-z^{2})^{n}. \] Deduce the value of \({\displaystyle \int_{-c}^{c}(c^{2}-z^{2})^{n}\,\mathrm{d}z.}\) Evaluate \(\mathrm{E}(Z)\) and \(\mathrm{var}(Z).\)


Solution:

  1. \begin{align*} \mathbb{P}(n\text{ independent values of }X > k) &= \prod_{i=1}^n \mathbb{P}(X > k) \\ &= \left ( \frac{c-k}{2c}\right)^n \end{align*}
  2. \begin{align*} \mathbb{P}(n\text{ independent values of }X < k) &= \prod_{i=1}^n \mathbb{P}(X < k) \\ &= \left ( \frac{k+c}{2c}\right)^n \end{align*}
The median lies in \((z-\delta, z+\delta)\) precisely when \(n\) values fall below \(z-\delta\), \(n\) values fall above \(z+\delta\), and one value falls in \((z-\delta, z+\delta)\) (to first order in \(\delta\)). \begin{align*} &&\mathbb{P}(z - \delta < \text{median} < z+\delta) &= \mathbb{P}(n\text{ values } < z - \delta,\ n \text{ values} > z + \delta,\text{ one value in }(z-\delta, z+\delta)) \\ &&&= \binom{2n+1}{n,n,1} \left ( \frac{c-(z+\delta)}{2c}\right)^n\left ( \frac{(z-\delta)+c}{2c}\right)^n \frac{2 \delta}{2 c} \\ &&&= \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}}((c-(z+\delta))(c+(z-\delta)))^n\, 2\delta \\ \Rightarrow && \lim_{\delta \to 0} \frac{\mathbb{P}(z - \delta < \text{median} < z+\delta)}{2 \delta} &= \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}}((c-z)(c+z))^n \\ &&&= \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}}(c^2-z^2)^n \end{align*} \begin{align*} && 1 &= \int_{-c}^c \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}}(c^2-z^2)^n \d z \\ \Rightarrow && \frac{(n!)^2 (2c)^{2n+1}}{(2n+1)!} &= \int_{-c}^c (c^2-z^2)^n \d z \end{align*} \begin{align*} \mathbb{E}(Z) &= \int_{-c}^c z \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}}(c^2-z^2)^n \d z \\ &=\frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}} \int_{-c}^c z (c^2-z^2)^n \d z \\ &= 0 \end{align*} since the integrand is odd. \begin{align*} \mathrm{Var}(Z) &= \mathbb{E}(Z^2) - \mathbb{E}(Z)^2 \\ &= \mathbb{E}(Z^2) \\ &= \int_{-c}^c z^2 \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}}(c^2-z^2)^n \d z \\ &=\frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}} \int_{-c}^c z^2 (c^2-z^2)^n \d z \\ &=\frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}} \left ( \left [ -\frac{1}{2(n+1)}z(c^2-z^2)^{n+1} \right]_{-c}^c + \frac{1}{2(n+1)}\int_{-c}^c (c^2-z^2)^{n+1} \d z \right) \\ &= \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}} \cdot \frac{1}{2(n+1)} \cdot \frac{((n+1)!)^2 (2c)^{2n+3}}{(2n+3)!} \\ &= \frac{(n+1)^2(2c)^2}{2(n+1)(2n+2)(2n+3)} \\ &= \frac{c^2}{2n+3} \end{align*}
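The moments of the sample median can be checked empirically: integrating against the pdf above gives \(\mathrm{E}(Z) = 0\) and \(\mathrm{Var}(Z) = c^2/(2n+3)\). This simulation is a sanity check only; \(n\), \(c\) and the trial count are arbitrary choices.

```python
import random
import statistics

random.seed(4)
n, c, trials = 3, 2.0, 200_000

medians = []
for _ in range(trials):
    sample = sorted(random.uniform(-c, c) for _ in range(2 * n + 1))
    medians.append(sample[n])   # middle value of the 2n+1 sorted samples

mean_z = statistics.fmean(medians)      # theory: 0
var_z = statistics.pvariance(medians)   # theory: c^2 / (2n + 3)
```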

1987 Paper 3 Q16
D: 1500.0 B: 1500.0

  1. \(X_{1},X_{2},\ldots,X_{n}\) are independent identically distributed random variables drawn from a uniform distribution on \([0,1].\) The random variables \(A\) and \(B\) are defined by \[ A=\min(X_{1},\ldots,X_{n}),\qquad B=\max(X_{1},\ldots,X_{n}). \] For any fixed \(k\), such that \(0< k< \frac{1}{2},\) let \[ p_{n}=\mathrm{P}(A\leqslant k\mbox{ and }B\geqslant1-k). \] What happens to \(p_{n}\) as \(n\rightarrow\infty\)? Comment briefly on this result.
  2. Lord Copper, the celebrated and imperious newspaper proprietor, has decided to run a lottery in which each of the \(4,000,000\) readers of his newspaper will have an equal probability \(p\) of winning \(\pounds 1,000,000\) and their chances of winning will be independent. He has fixed all the details leaving to you, his subordinate, only the task of choosing \(p\). If nobody wins \(\pounds 1,000,000\), you will be sacked, and if more than two readers win \(\pounds 1,000,000,\) you will also be sacked. Explaining your reasoning, show that however you choose \(p,\) you will have less than a 60\% chance of keeping your job.


Solution:

  1. \begin{align*} && p_n &= \mathrm{P}(A\leqslant k\mbox{ and }B\geqslant1-k) \\ &&&= \mathrm{P}(A\leqslant k) +\P(B\geqslant1-k) - \mathrm{P}(A\leqslant k\mbox{ or }B\geqslant1-k)\\ &&&= 1-\mathrm{P}(A> k) +1-\P(B < 1-k) - \l 1- \mathrm{P}(A> k\mbox{ and }B< 1-k)\r\\ &&&= 1 - \prod_i \P(X_i > k) - \prod_i \P(X_i < 1-k) + \prod_i \P(k < X_i < 1-k) \\ &&&= 1 - (1-k)^n - (1-k)^n + (1-2k)^n \end{align*} Therefore \(p_n \to 1\) as \(n \to \infty\), since \(1-k\) and \(1-2k\) both lie in \((0,1)\) (because \(0 < k < \frac12\)), so their \(n\)th powers tend to \(0\). This is to be expected: as the sample grows, its minimum and maximum approach the endpoints \(0\) and \(1\).
  2. Let \(N = 4\,000\,000\). The probability that exactly one person wins is \(Np(1-p)^{N-1}\), and the probability that exactly two people win is \(\binom{N}{2} p^2 (1-p)^{N-2}\). We wish to maximise the sum of these probabilities. To find this maximum, differentiate with respect to \(p\). \begin{align*} \frac{\d}{\d p} : && \small N(1-p)^{N-1}-N(N-1)p(1-p)^{N-2} + N(N-1)p(1-p)^{N-2} - \frac12 N(N-1)(N-2)p^2(1-p)^{N-3} \\ &&= N(1-p)^{N-3} \l (1-p)^2 - \frac12(N-1)(N-2)p^2\r \\ \Rightarrow && \frac{(1-p)}{p} = \sqrt{\frac{(N-1)(N-2)}{2}} \\ \Rightarrow && p = \frac{1}{1+ \sqrt{\frac{(N-1)(N-2)}{2}}} \end{align*} This is a maximum, since the objective is increasing at \(p=0\), decreasing at \(p=1\), and has only one stationary point. Note that \(\frac{\sqrt{2}}{N-1+\sqrt{2}} < p < \frac{\sqrt{2}}{N-2}\), and so: \begin{align*} Np(1-p)^{N-1} &< \frac{\sqrt{2}N}{N-2}\left(1-\frac{\sqrt{2}}{N-1+\sqrt{2}}\right)^{N-1} \\ &\approx \sqrt{2} e^{-\sqrt{2}} \end{align*} \begin{align*} \frac{N(N-1)}{2}p^2(1-p)^{N-2} &< \frac{N(N-1)}{(N-2)^2}\left(1-\frac{\sqrt{2}}{N-1+\sqrt{2}}\right)^{N-2} \\ &\approx e^{-\sqrt{2}} \end{align*} Alternatively, we can use a Poisson approximation. The number of winners is \(B(N, p)\), where we are hoping \(Np\) is small but not zero, so it is reasonable to approximate \(B(N,p)\) by \(\mathrm{Po}(Np)\). (Call this mean \(\lambda\).)
Then we wish to maximise the probability \(q\) of keeping the job: \begin{align*} && q &= e^{-\lambda} \l \lambda + \frac{\lambda^2}{2} \r \\ &&&= e^{-\lambda} \lambda \l 1+ \frac{\lambda}{2} \r \\ \Rightarrow && \ln q &= -\lambda + \ln \lambda + \ln(1+\tfrac12 \lambda) \\ \frac{\d}{\d \lambda}: && \frac{q'}{q} &= -1 + \frac{1}{\lambda} + \frac{1}{2+\lambda} \\ &&&= \frac{-(2+\lambda)\lambda+2+2\lambda}{\lambda(2+\lambda)} \\ &&&= \frac{2-\lambda^2}{\lambda(2+\lambda)} \\ \Rightarrow && \lambda &= \sqrt{2} \end{align*} Either way, the maximum probability of keeping the job is approximately \(e^{-\sqrt{2}}(1+\sqrt{2})\). Using \(e^{\sqrt{2}} > 1+\sqrt{2}+1+\frac{1}{3}\sqrt{2}+\frac{1}{6}\) (the first five terms of the exponential series), \begin{align*} \frac{\sqrt{2}+1}{e^{\sqrt{2}}} &< \frac{\sqrt{2}+1}{1+\sqrt{2}+1+\frac{1}{3}\sqrt{2}+\frac{1}{6}} \\ &= \frac{30\sqrt{2}-18}{41} \\ &< 0.6 \end{align*} so however \(p\) is chosen, the chance of keeping your job is less than 60\%.
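The exact binomial probability at the stationary point can be evaluated numerically and compared with the Poisson approximation \(e^{-\sqrt{2}}(1+\sqrt{2}) \approx 0.587\). This is a sanity check, not part of the solution; only \(N\) comes from the problem.

```python
import math

N = 4_000_000
# stationary point found above: (1-p)/p = sqrt((N-1)(N-2)/2)
p = 1 / (1 + math.sqrt((N - 1) * (N - 2) / 2))

log_q = math.log1p(-p)   # log(1-p), numerically stable for tiny p
p_keep = (N * p * math.exp((N - 1) * log_q)                         # exactly one winner
          + 0.5 * N * (N - 1) * p * p * math.exp((N - 2) * log_q))  # exactly two winners

poisson_approx = (1 + math.sqrt(2)) * math.exp(-math.sqrt(2))
bound = (30 * math.sqrt(2) - 18) / 41   # the rational bound derived above
```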