Problems


1998 Paper 2 Q13

A random variable \(X\) has the probability density function \[ \mathrm{f}(x)=\begin{cases} \lambda\mathrm{e}^{-\lambda x} & x\geqslant0,\\ 0 & x<0. \end{cases} \] Show that $${\rm P}(X>s+t\,\vert X>t) = {\rm P}(X>s).$$ The time it takes an assistant to serve a customer in a certain shop is a random variable with the above distribution and the times for different customers are independent. If, when I enter the shop, the only two assistants are serving one customer each, what is the probability that these customers are both still being served at time \(t\) after I arrive? One of the assistants finishes serving his customer and immediately starts serving me. What is the probability that I am still being served when the other customer has finished being served?


Solution: \begin{align*} && \mathbb{P}(X > t) &= \int_t^{\infty} \lambda e^{-\lambda x} \d x\\ &&&= \left[ -e^{-\lambda x} \right]_t^\infty \\ &&&= e^{-\lambda t}\\ \\ && \mathbb{P}(X > s + t | X > t) &= \frac{\mathbb{P}(X > s + t)}{\mathbb{P}(X > t)} \\ &&&= \frac{e^{-(s+t)\lambda}}{e^{-t\lambda}} \\ &&&= e^{-s\lambda} = \mathbb{P}(X > s) \end{align*} (The first conditional step uses the fact that \(X > s+t\) implies \(X > t\).) The probability that both customers are still being served at time \(t\) (by independence) is \(\mathbb{P}(X > t)^2 = e^{-2\lambda t}\). The final probability is exactly \(\frac12\): the property proved in the first part of the question shows the distribution is memoryless, so once I start being served, my service time and the other customer's remaining service time are independent samples from the same distribution, and each of us is equally likely to finish first.

1997 Paper 3 Q13

Let \(X\) and \(Y\) be independent standard normal random variables: the probability density function, \(\f\), of each is therefore given by \[ \f(x)=\left(2\pi\right)^{-\frac{1}{2}}\e^{-\frac{1}{2}x^{2}}. \]

  1. Find the moment generating function \(\mathrm{E}(\e^{\theta X})\) of \(X\).
  2. Find the moment generating function of \(aX+bY\) and hence obtain the condition on \(a\) and \(b\) which ensures that \(aX+bY\) has the same distribution as \(X\) and \(Y\).
  3. Let \(Z=\e^{\mu+\sigma X}\). Show that \[ \mathrm{E}(Z^{\theta})=\e^{\mu\theta+\frac{1}{2}\sigma^{2}\theta^{2}}, \] and hence find the expectation and variance of \(Z\).


Solution:

  1. \(\,\) \begin{align*} && \E[e^{\theta X}] &= \int_{-\infty}^{\infty} e^{\theta x} \frac{1}{\sqrt{2\pi}} e^{-\frac12 x^2 } \d x\\ &&&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac12 x^2+\theta x} \d x\\ &&&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac12 (x^2-2\theta x)} \d x\\ &&&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac12 (x-\theta )^2+\frac12\theta^2 } \d x\\ &&&= e^{\frac12\theta^2 }\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac12 (x-\theta )^2 } \d x\\ &&&=e^{\frac12\theta^2 } \end{align*}
  2. \begin{align*} && M_{aX+bY} (\theta) &= \mathbb{E}[e^{\theta (aX+bY)}] \\ &&&= \mathbb{E}[e^{a\theta X}]\,\mathbb{E}[e^{b\theta Y}] \quad \text{(by independence)} \\ &&&= e^{\frac12(a\theta)^2} \cdot e^{\frac12(b\theta)^2} \\ &&&= e^{\frac12(a^2+b^2)\theta^2} \end{align*} This is the moment generating function of a standard normal if and only if \(a^2+b^2 = 1\), which is therefore the required condition.
  3. \(\,\) \begin{align*} && \E[Z^\theta] &= \E[e^{\mu \theta + \sigma \theta X}] \\ &&&= e^{\mu \theta}e^{\frac12 \sigma^2 \theta^2} \\ &&&=e^{\mu \theta + \frac12 \sigma^2 \theta^2} \\ \end{align*} \begin{align*} \mathbb{E}(Z) &= \mathbb{E}[Z^1] \\ &= e^{\mu + \frac12 \sigma^2} \\ \var[Z] &= \E[Z^2] - \left ( \E[Z] \right)^2 \\ &= e^{2 \mu+ 2\sigma^2} - e^{2\mu + \sigma^2} \\ &= e^{2\mu+\sigma^2} \left (e^{\sigma^2}-1 \right) \end{align*} [NB: This is the lognormal distribution]
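A small simulation sketch (with illustrative \(\mu\), \(\sigma\); not part of the original solution) comparing sample moments of \(Z\) against the lognormal formulas just derived:

```python
import math
import random

# Sample Z = e^(mu + sigma*X) with X standard normal and compare the sample
# mean and variance against the closed forms derived above.
# mu and sigma are illustrative values.
random.seed(1)
mu, sigma, n = 0.3, 0.5, 400_000

zs = [math.exp(mu + sigma * random.gauss(0.0, 1.0)) for _ in range(n)]
mean = sum(zs) / n
var = sum((z - mean) ** 2 for z in zs) / n

mean_exact = math.exp(mu + sigma ** 2 / 2)
var_exact = math.exp(2 * mu + sigma ** 2) * (math.exp(sigma ** 2) - 1)
```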

1997 Paper 3 Q14

An industrial process produces rectangular plates of mean length \(\mu_{1}\) and mean breadth \(\mu_{2}\). The length and breadth vary independently with non-zero standard deviations \(\sigma_{1}\) and \(\sigma_{2}\) respectively. Find the means and standard deviations of the perimeter and of the area of the plates. Show that the perimeter and area are not independent.


Solution: Let \(L \sim N(\mu_1, \sigma_1^2)\) and \(B \sim N(\mu_2, \sigma_2^2)\) be the length and breadth (normality is for concreteness; only the means and variances are used below), so \begin{align*} && \mathbb{E}(\text{perimeter}) &= \E(2(L+B)) \\ &&&= 2\E[L]+2\E[B] \\ &&&= 2(\mu_1+\mu_2) \\ &&\var[\text{perimeter}] &= \E\left [ (2(L+B))^2 \right] - \left ( \E[2(L+B)] \right)^2 \\ &&&= 4\E[L^2+2LB+B^2] - 4(\mu_1+\mu_2)^2 \\ &&&= 4(\sigma_1^2+\mu_1^2+2\mu_1\mu_2+\sigma_2^2+\mu_2^2) - 4(\mu_1+\mu_2)^2\\ &&&= 4(\sigma_1^2+\sigma_2^2) \\ &&\text{sd}[\text{perimeter}] &= 2\sqrt{\sigma_1^2+\sigma_2^2} \\ \\ && \E[\text{area}] &= \E[LB] \\ &&&= \E[L]\E[B] \\ &&&= \mu_1\mu_2 \\ && \var[\text{area}] &= \E[(LB)^2] - \left (\E[LB] \right)^2 \\ &&&= \E[L^2]\E[B^2]-\mu_1^2\mu_2^2 \\ &&&= (\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) -\mu_1^2\mu_2^2 \\ &&&= \sigma_1^2\mu_2^2 + \sigma_2^2\mu_1^2 + \sigma_1^2\sigma_2^2\\ && \text{sd}(\text{area}) &= \sqrt{\sigma_1^2\mu_2^2 + \sigma_2^2\mu_1^2 + \sigma_1^2\sigma_2^2} \\ \\ && \E[\text{perimeter} \cdot \text{area}] &= \E[2(L+B)LB] \\ &&&= 2\E[L^2]\E[B] + 2\E[L]\E[B^2] \\ &&&= 2(\sigma_1^2+\mu_1^2)\mu_2 + 2(\sigma_2^2+\mu_2^2)\mu_1 \\ && \E[\text{perimeter}] \E[\text{area}] &= 2(\mu_1+\mu_2) \cdot \mu_1\mu_2 \end{align*} Since the latter does not depend on the \(\sigma_i\) but the former does, they cannot be equal in general (recall \(\sigma_1, \sigma_2 \neq 0\)); therefore the perimeter and area cannot be independent. [See also STEP 2006 Paper 3 Q14]
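A simulation sketch of the same quantities (normal \(L\) and \(B\) are assumed for concreteness, and all parameter values are illustrative):

```python
import random

# Simulate plates, then compare sample sd's of perimeter and area with the
# formulas above, and estimate Cov(perimeter, area), which should be nonzero.
random.seed(2)
mu1, mu2, s1, s2, n = 3.0, 2.0, 0.4, 0.3, 300_000

per, area = [], []
for _ in range(n):
    L = random.gauss(mu1, s1)
    B = random.gauss(mu2, s2)
    per.append(2 * (L + B))
    area.append(L * B)

mean_p = sum(per) / n
mean_a = sum(area) / n
sd_p = (sum((p - mean_p) ** 2 for p in per) / n) ** 0.5
sd_a = (sum((a - mean_a) ** 2 for a in area) / n) ** 0.5
cov = sum((p - mean_p) * (a - mean_a) for p, a in zip(per, area)) / n
```

A strictly positive sample covariance is consistent with the dependence argument above (with these parameters the exact covariance is \(2\sigma_1^2\mu_2 + 2\sigma_2^2\mu_1 = 1.18\)).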

1994 Paper 1 Q14

Each of my \(n\) students has to hand in an essay to me. Let \(T_{i}\) be the time at which the \(i\)th essay is handed in and suppose that \(T_{1},T_{2},\ldots,T_{n}\) are independent, each with probability density function \(\lambda\mathrm{e}^{-\lambda t}\) (\(t\geqslant0\)). Let \(T\) be the time I receive the first essay to be handed in and let \(U\) be the time I receive the last one.

  1. Find the mean and variance of \(T_{i}.\)
  2. Show that \(\mathrm{P}(U\leqslant u)=(1-\mathrm{e}^{-\lambda u})^{n}\) for \(u\geqslant0,\) and hence find the probability density function of \(U\).
  3. Obtain \(\mathrm{P}(T>t),\) and hence find the probability density function of \(T\).
  4. Write down the mean and variance of \(T\).


Solution:

  1. \(T_i \sim \textrm{Exp}(\lambda)\) so \(\E[T_i] = \lambda^{-1}, \var[T_i] = \lambda^{-2}\)
  2. \(\,\) \begin{align*} && \mathbb{P}(U \leq u) &= \mathbb{P}(T_i \leq u\quad \forall i) \\ &&&= \prod \mathbb{P}(T_i \leq u) \\ &&&= \prod \int_0^u \lambda e^{-\lambda t} \d t \\ &&&= (1-e^{-\lambda u})^n \\ \\ \Rightarrow && f_U(u) &= n\lambda e^{-\lambda u}(1-e^{-\lambda u})^{n-1} \end{align*}
  3. \(\,\) \begin{align*} && \mathbb{P}(T > t) &= \mathbb{P}(T_i > t \quad \forall i) \\ &&&= \prod \mathbb{P}(T_i > t) \\ &&&= e^{-n\lambda t} \\ \Rightarrow && f_T(t) &= n\lambda e^{-n\lambda t} \end{align*}
  4. Therefore \(\E[T] = \frac{1}{n\lambda}, \var[T] = \frac{1}{(n\lambda)^2}\)

1993 Paper 3 Q16

The time taken for me to set an acceptable examination question is \(T\) hours. The distribution of \(T\) is a truncated normal distribution with probability density \(\f\) where \[ \mathrm{f}(t)=\begin{cases} \dfrac{1}{k\sigma\sqrt{2\pi}}\exp\left(-\dfrac{1}{2}\left(\dfrac{t-\sigma}{\sigma}\right)^{2}\right) & \mbox{ for }t\geqslant0\\ 0 & \mbox{ for }t<0. \end{cases} \] Sketch the graph of \(\f(t)\). Show that \(k\) is approximately \(0.841\) and obtain the mean of \(T\) as a multiple of \(\sigma\). Over a period of years, I find that the mean setting time is 3 hours.

  1. Find the approximate probability that none of the 16 questions on next year's paper will take more than 4 hours to set.
  2. Given that a particular question is unsatisfactory after 2 hours work, find the probability that it will still be unacceptable after a further 2 hours work.
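The value of \(k\) stated in the question can be checked numerically: for \(\f\) to integrate to 1, normalisation forces \(k = \mathrm{P}(N(\sigma,\sigma^{2})\geqslant 0) = \Phi(1)\), whatever \(\sigma\) is. A one-line sketch:

```python
import math

# k = Phi(1), the standard normal CDF at 1, via the error function.
k = 0.5 * (1 + math.erf(1 / math.sqrt(2)))
```

\(\Phi(1) = 0.8413\ldots\), agreeing with the stated \(k \approx 0.841\).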

1991 Paper 1 Q16

At any instant the probability that it is safe to cross a busy road is \(0.1\). A toad is waiting to cross this road. Every minute she looks at the road. If it is safe, she will cross; if it is not safe, she will wait for a minute before attempting to cross again. Find the probability that she eventually crosses the road without mishap. Later on, a frog is also trying to cross the same road. He also inspects the traffic at one minute intervals and crosses if it is safe. Being more impatient than the toad, he may also attempt to cross when it is not safe. The probability that he will attempt to cross when it is not safe is \(n/3\) if \(n\leqslant3,\) where \(n\) minutes have elapsed since he first inspected the road. If he attempts to cross when it is not safe, he is run over with probability \(0.8,\) but otherwise he reaches the other side safely. Find the probability that he eventually crosses the road without mishap. What is the probability that both reptiles safely cross the road with the frog taking less time than the toad? If the frog has not arrived at the other side 2 minutes after he began his attempt to cross, what is the probability that the frog is run over (at some stage) in his attempt to cross? \textit{[Once moving, the reptiles spend a negligible time on their attempt to cross the road.]}


Solution: Since the toad never crosses when it's not safe, she is certain eventually to cross without mishap. (The probability she hasn't crossed after the \(n\)th minute is \(0.9^n \to 0\).) \begin{array}{c|c|c|c|c|c|c|c} \text{will try dangerously} & \text{is safe} & \text{has tried} & \text{tries safely} & \text{tries unsafely} & \text{succeeds} & \text{succeeds unsafely} & \text{fails} \\ \hline 0 & 0.1 & 0 & 0.1 & 0 & 0.1 & 0 & 0\\ \frac13 & 0.1 & 0.1 & 0.09 & 0.27 & 0.144 & 0.054 & 0.216\\ \frac23 & 0.1 & 0.46 & 0.054 & 0.324 & 0.1188 & 0.0648 & 0.2592\\ 1 & 0.1 & 0.838 & 0.0162 & 0.1458 & 0.04536 & 0.02916 & 0.11664\\ \hline & & & & & 0.40816 & 0.14796 & \\ \hline \end{array} So \(\mathbb{P}(\text{frog crosses safely}) = 0.40816\) and \(\mathbb{P}(\text{frog beats toad across}) = 0.14796\). For the final part, note that being run over (at minute 1, 2 or 3) always implies not having crossed within the first 2 minutes, so \begin{align*} \mathbb{P}(\text{frog run over} | \text{frog not crossed after 2 minutes}) &= \frac{\mathbb{P}(\text{frog run over and frog not crossed after 2 minutes})}{\mathbb{P}(\text{frog not crossed after 2 minutes})} \\ &= \frac{\mathbb{P}(\text{frog run over at some stage})}{1-\mathbb{P}(\text{frog crossed within 2 minutes})} \\ &= \frac{0.216+0.2592+0.11664}{1-(0.1+0.144+0.1188)} \\ &= \frac{0.59184}{0.6372} \\ &= 0.9288\ldots \end{align*}
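The table can be reproduced by a short exact enumeration (a sketch of the same minute-by-minute computation, not a new method):

```python
# Enumerate the frog's minutes 0..3 exactly, tracking the probability he is
# still waiting; p(attempt when unsafe) at minute n is n/3 (reaching 1 at n=3).
p_cross_safe = 0.0       # reaches the far side unharmed (safely or by a dash)
p_unsafe_success = 0.0   # reaches the far side via an unsafe dash
p_run_over = 0.0
waiting = 1.0

for minute in range(4):
    p_dash = min(minute / 3, 1.0)
    p_cross_safe += waiting * 0.1             # road happens to be safe
    dash = waiting * 0.9 * p_dash             # unsafe attempt
    p_cross_safe += dash * 0.2                # survives the dash
    p_unsafe_success += dash * 0.2
    p_run_over += dash * 0.8
    waiting *= 0.9 * (1 - p_dash)             # neither safe nor attempted: waits
```

By minute 3 the frog attempts for certain, so `waiting` reaches 0 and the totals match the table's 0.40816 and 0.14796.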

1990 Paper 3 Q15

An unbiased twelve-sided die has its faces marked \(A,A,A,B,B,B,B,B,B,B,B,B.\) In a series of throws of the die the first \(M\) throws show \(A,\) the next \(N\) throws show \(B\) and the \((M+N+1)\)th throw shows \(A\). Write down the probability that \(M=m\) and \(N=n\), where \(m\geqslant0\) and \(n\geqslant1.\) Find

  1. the marginal distributions of \(M\) and \(N\),
  2. the mean values of \(M\) and \(N\).
Investigate whether \(M\) and \(N\) are independent. Find the probability that \(N\) is greater than a given integer \(k\), where \(k\geqslant1,\) and find \(\mathrm{P}(N > M).\) Find also \(\mathrm{P}(N=M)\) and show that \(\mathrm{P}(N < M)=\frac{1}{52}.\)


Solution: \begin{align*} \mathbb{P}(M = m, N = n) &= \left ( \frac{3}{12} \right)^m \left ( \frac{9}{12} \right)^n \frac{3}{12} \\ &= \frac{3^n}{4^{m+n+1}} \end{align*}

  1. \begin{align*} \mathbb{P}(M = m) &= \sum_{n = 1}^{\infty} \mathbb{P}(M=m,N=n) \\ &= \sum_{n = 1}^{\infty} \frac{3^n}{4^{m+n+1}} \\ &= \frac{1}{4^{m+1}} \sum_{n = 1}^{\infty} \left ( \frac34\right)^n \\ &= \frac{1}{4^{m+1}} \frac{3/4}{1/4} \\ &= \frac{3}{4^{m+1}} \\ \\ \mathbb{P}(N = n) &= \sum_{m = 0}^{\infty} \mathbb{P}(M=m,N=n) \\ &= \sum_{m = 0}^{\infty} \frac{3^n}{4^{m+n+1}} \\ &= \frac{3^n}{4^{n+1}} \sum_{m = 0}^{\infty} \left ( \frac14\right)^m \\ &= \frac{3^n}{4^{n+1}} \frac{1}{3/4} \\ &= \frac{3^{n-1}}{4^{n}} \\ \end{align*}
  2. \(M+1 \sim Geo(\frac34) \Rightarrow \mathbb{E}(M) = \frac43 -1 = \frac13\) \(N \sim Geo(\frac14) \Rightarrow \mathbb{E}(N) = 4\)
\(M,N\) are independent since \(\mathbb{P}(M = m, N =n ) = \mathbb{P}(M=m)\mathbb{P}(N=n)\) \begin{align*} \mathbb{P}(N > k) &= \sum_{n=k+1}^{\infty} \mathbb{P}(N = n) \\ &= \sum_{n=k+1}^{\infty} \frac{3^{n-1}}{4^{n}} \\ &= \frac{3^k}{4^{k+1}} \sum_{n = 0}^{\infty} \left ( \frac34\right)^n \\ &= \frac{3^k}{4^{k+1}} \frac{1}{1/4} \\ &= \frac{3^k}{4^k} \end{align*} \begin{align*} \mathbb{P}(N > M) &= \sum_{m=0}^{\infty} \mathbb{P}(N > m) \mathbb{P}(M = m) \\ &= \sum_{m=0}^{\infty} \left (\frac34 \right)^m \frac{3}{4^{m+1}}\\ &=\sum_{m=0}^{\infty} \frac{3^{m+1}}{4^{2m+1}}\\ &= \frac{3}{4} \frac{1}{13/16} \\ &= \frac{12}{13} \\ \\ \mathbb{P}(N=M) &= \sum_{m=1}^{\infty} \mathbb{P}(N=m, M=m) \\ &= \sum_{m=1}^{\infty} \frac{3^m}{4^{2m+1}} \\ &= \frac{3}{64} \sum_{m=0}^{\infty} \left ( \frac{3}{16} \right)^m \\ &= \frac{3}{64} \frac{1}{13/16} \\ &= \frac{3}{52}\\ \\ \mathbb{P}(N < M) &= 1 - \frac34 - \frac3{52} \\ &= 1 - \frac{48}{52} - \frac{3}{52} \\ &= 1 - \frac{51}{52} \\ &= \frac{1}{52} \end{align*}
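The closing answers can be sketched by truncated summation of the joint law (a numerical check, not part of the original solution; the tail beyond the cutoff is negligible):

```python
# Sum P(M=m, N=n) = (1/4)^(m+1) * (3/4)^n over a large finite grid and
# compare with the exact answers 12/13, 3/52 and 1/52 derived above.
N_MAX = 400

def joint(m, n):
    return 0.25 ** (m + 1) * 0.75 ** n

p_gt = sum(joint(m, n) for m in range(N_MAX)
           for n in range(1, N_MAX) if n > m)
p_eq = sum(joint(m, m) for m in range(1, N_MAX))
p_lt = sum(joint(m, n) for m in range(N_MAX)
           for n in range(1, N_MAX) if n < m)
```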