Problems


2001 Paper 3 Q12
D: 1700.0 B: 1518.2

A bag contains \(b\) black balls and \(w\) white balls. Balls are drawn at random from the bag and when a white ball is drawn it is put aside.

  1. If the black balls drawn are also put aside, find an expression for the expected number of black balls that have been drawn when the last white ball is removed.
  2. If instead the black balls drawn are put back into the bag, prove that the expected number of times a black ball has been drawn when the first white ball is removed is \(b/w\,\). Hence write down, in the form of a sum, an expression for the expected number of times a black ball has been drawn when the last white ball is removed.
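
This is not part of the original question, but the claim \(b/w\) in part (ii) is easy to sanity-check by simulation. A minimal Python sketch; the values of \(b\), \(w\), the trial count and the seed are arbitrary choices:

```python
import random

def black_draws_until_first_white(b, w):
    """Bag with b black and w white balls; blacks drawn are put back,
    whites are set aside.  Count black draws before the first white."""
    count = 0
    while True:
        if random.random() < w / (b + w):  # a white ball is drawn
            return count
        count += 1                         # a black ball is drawn and replaced

random.seed(1)
b, w, trials = 5, 3, 200_000
est = sum(black_draws_until_first_white(b, w) for _ in range(trials)) / trials
print(est, b / w)  # Monte Carlo estimate vs the claimed b/w
```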

2000 Paper 1 Q14
D: 1484.0 B: 1528.4

The random variable \(X\) is uniformly distributed on the interval \([-1,1]\). Find \(\E(X^2)\) and \(\var (X^2)\). A second random variable \(Y\), independent of \(X\), is also uniformly distributed on \([-1,1]\), and \(Z=Y-X\). Find \(\E(Z^2)\) and show that \(\var (Z^2) = 7 \var (X^2)\).


Solution: \(X \sim U(-1,1)\) \begin{align*} \E[X^2] &= \int_{-1}^1 \frac12 x^2 \, dx \\ &= \frac{1}{6} \left [ x^3 \right]_{-1}^1 \\ &= \frac{1}{3} \end{align*} \begin{align*} \E[X^4] &= \int_{-1}^1 \frac12 x^4 \, dx \\ &= \frac{1}{10} \left [ x^5 \right]_{-1}^1 \\ &= \frac{1}{5} \end{align*} \begin{align*} \var[X^2] &=\E[X^4] - \E[X^2]^2 \\ &= \frac{1}{5} - \frac{1}{9} \\ &= \frac{4}{45} \end{align*} \begin{align*} \E(Z^2) &= \E(Y^2 - 2XY+X^2) \\ &= \E(Y^2) - 2\E(X)\E(Y)+\E(X^2) \\ &= \frac{1}{3} - 0 + \frac{1}{3} \\ &= \frac{2}{3} \end{align*} \begin{align*} \E[Z^4] &= \E[Y^4 -4Y^3X+6Y^2X^2-4YX^3+X^4] \\ &= \E[Y^4]-4\E[Y^3]\E[X]+6\E[Y^2]\E[X^2]-4\E[Y]\E[X^3]+\E[X^4] \\ &= \frac{1}{5}+6 \cdot \frac{1}{3} \cdot \frac13 + \frac{1}{5} \\ &= \frac{2}{5} + \frac{2}{3} \\ &= \frac{16}{15} \end{align*} \begin{align*} \var(Z^2) &= \E(Z^4) - \E(Z^2)^2 \\ &= \frac{16}{15} - \frac{4}{9} \\ &= \frac{28}{45} \\ &= 7 \var(X^2) \end{align*}
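
A quick Monte Carlo check of the two results (the sample size and seed are arbitrary):

```python
import random

random.seed(0)
n = 500_000
z2 = [(random.uniform(-1, 1) - random.uniform(-1, 1)) ** 2 for _ in range(n)]
mean = sum(z2) / n
var = sum(v * v for v in z2) / n - mean ** 2
print(mean, 2 / 3)    # E(Z^2) should be 2/3
print(var, 28 / 45)   # Var(Z^2) should be 28/45 = 7 * Var(X^2)
```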

2000 Paper 2 Q14
D: 1600.0 B: 1484.0

The random variables \(X_1\), \(X_2\), \(\ldots\) , \(X_{2n+1}\) are independently and uniformly distributed on the interval \(0 \le x \le 1\). The random variable \(Y\) is defined to be the median of \(X_1\), \(X_2\), \(\ldots\) , \(X_{2n+1}\). Given that the probability density function of \(Y\) is \(\g(y)\), where \[ \mathrm{g}(y)=\begin{cases} ky^{n}(1-y)^{n} & \mbox{ if }0\leqslant y\leqslant1\\ 0 & \mbox{ otherwise} \end{cases} \] use the result $$ \int_0^1 {y^{r}}{{(1-y)}^{s}}\,\d y = \frac{r!s!}{(r+s+1)!} $$ to show that \(k={(2n+1)!}/{{(n!)}^2}\), and evaluate \(\E(Y)\) and \({\rm Var}\,(Y)\). Hence show that, for any given positive number \(d\), the inequality $$ {\P\left({\vert {Y - 1/2} \vert} < {d/{\sqrt {n}}} \right)} < {\P\left({\vert {{\bar X} - 1/2} \vert} < {d/{\sqrt {n}}} \right)} $$ holds provided \(n\) is large enough, where \({\bar X}\) is the mean of \(X_1\), \(X_2\), \(\ldots\) , \(X_{2n+1}\). [You may assume that \(Y\) and \(\bar X\) are normally distributed for large \(n\).]
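
The quoted integral identity and the value of \(k\) can be checked numerically. A small Python sketch (the choices of \(r\), \(s\), \(n\) and the step count are arbitrary):

```python
import math

def integral(r, s, steps=100_000):
    # midpoint rule for the integral of y^r (1-y)^s over [0, 1]
    h = 1 / steps
    return h * sum(((i + 0.5) * h) ** r * (1 - (i + 0.5) * h) ** s
                   for i in range(steps))

for r, s in [(2, 3), (4, 4), (5, 7)]:
    exact = math.factorial(r) * math.factorial(s) / math.factorial(r + s + 1)
    print(integral(r, s), exact)

n = 4
k = math.factorial(2 * n + 1) / math.factorial(n) ** 2
print(k * integral(n, n))  # the density g should integrate to 1
```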

2000 Paper 3 Q13
D: 1700.0 B: 1516.0

A set of \(n\) dice is rolled repeatedly. For each die the probability of showing a six is \(p\). Show that the probability that the first of the dice to show a six does so on the \(r\)th roll is $$q^{n r } ( q^{-n} - 1 )$$ where \(q = 1 - p\). Determine, and simplify, an expression for the probability generating function for this distribution, in terms of \(q\) and \(n\). The first of the dice to show a six does so on the \(R\)th roll. Find the expected value of \(R\) and show that, in the case \(n = 2\), \(p=1/6\), this value is \(36/11\). Show that the probability that the last of the dice to show a six does so on the \(r\)th roll is \[ \big(1-q^r\big)^n-\big(1-q^{r-1}\big)^n. \] Find, for the case \(n = 2\), the probability generating function. The last of the dice to show a six does so on the \(S\)th roll. Find the expected value of \(S\) and evaluate this when \(p=1/6\).
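
The value \(\E(R) = 36/11\) can be sanity-checked by simulation. A minimal sketch (trial count and seed arbitrary), which also prints a Monte Carlo estimate of \(\E(S)\):

```python
import random

def first_and_last_six(n, p):
    """Roll n dice repeatedly; return the rolls on which the first
    and the last of the dice show their first six."""
    alive, roll, first = n, 0, 0
    while alive:
        roll += 1
        sixes = sum(random.random() < p for _ in range(alive))
        if sixes and first == 0:
            first = roll
        alive -= sixes
    return first, roll

random.seed(2)
n, p, trials = 2, 1 / 6, 100_000
results = [first_and_last_six(n, p) for _ in range(trials)]
print(sum(r for r, _ in results) / trials, 36 / 11)  # E(R) = 36/11
print(sum(s for _, s in results) / trials)           # Monte Carlo estimate of E(S)
```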

2000 Paper 3 Q14
D: 1700.0 B: 1500.0

The random variable \(X\) takes only the values \(x_1\) and \(x_2\) (where \( x_1 \not= x_2 \)), and the random variable \(Y\) takes only the values \(y_1\) and \(y_2\) (where \(y_1 \not= y_2\)). Their joint distribution is given by $$ \P ( X = x_1 , Y = y_1 ) = a \ ; \ \ \P ( X = x_1 , Y = y_2 ) = q - a \ ; \ \ \P ( X = x_2 , Y = y_1 ) = p - a \ . $$ Show that if \(\E(X Y) = \E(X)\E(Y)\) then $$ (a - p q ) ( x_1 - x_2 ) ( y_1 - y_2 ) = 0 . $$ Hence show that two random variables each taking only two distinct values are independent if \(\E(X Y) = \E(X) \E(Y)\). Give a joint distribution for two random variables \(A\) and \(B\), each taking the three values \(- 1\), \(0\) and \(1\) with probability \({1 \over 3}\), which have \(\E(A B) = \E( A)\E (B)\), but which are not independent.


Solution: \begin{align*} \mathbb{P}(X = x_1) &= a + q - a = q \\ \mathbb{P}(X = x_2) &= 1 - q \\ \mathbb{P}(Y = y_1) & = a + p - a = p \\ \mathbb{P}(Y = y_2) & = 1 - p \end{align*} \begin{align*} \mathbb{E}(X)\mathbb{E}(Y) &= \l qx_1 + (1-q)x_2 \r \l p y_1 + (1-p)y_2\r \\ &= qpx_1y_1 + q(1-p)x_1y_2 + (1-q)px_2y_1 + (1-q)(1-p)x_2y_2 \\ \mathbb{E}(XY) &= ax_1y_1 + (q-a)x_1y_2 + (p-a)x_2y_1 + (1 + a - p - q)x_2y_2 \end{align*} Therefore \(\mathbb{E}(XY) - \mathbb{E}(X)\mathbb{E}(Y)\) is a polynomial of degree two in the \(x_i, y_i\). If \(x_1 = x_2\) then \begin{align*} \mathbb{E}(X)\mathbb{E}(Y) &=x_1 \l p y_1 + (1-p)y_2\r \\ \mathbb{E}(XY) &= x_1(ay_1 + (q-a)y_2 + (p-a)y_1 + (1 + a - p - q)y_2) \\ &= x_1 (py_1 + (1-p)y_2) \end{align*} so the difference vanishes, which means \(x_1 - x_2\) is a factor, and by symmetry \(y_1 - y_2\) is also a factor. It remains to compare coefficients of \(x_1y_1\): the remaining factor is \(a - pq\), which completes the factorisation \(\mathbb{E}(XY) - \mathbb{E}(X)\mathbb{E}(Y) = (a - pq)(x_1-x_2)(y_1-y_2)\). For any two random variables each taking two distinct values we can find \(a\), \(p\), \(q\) satisfying the relations above, and \(X\) and \(Y\) are independent precisely when \(\mathbb{P}(X = x_i, Y = y_j) = \mathbb{P}(X = x_i)\mathbb{P}(Y = y_j)\) for all \(i, j\). Since \(x_1 \neq x_2\) and \(y_1 \neq y_2\), the condition \(\E(XY) = \E(X)\E(Y) \Rightarrow a = pq\). But if \(a = pq\), we have \(\mathbb{P}(X = x_1, Y = y_1) = \mathbb{P}(X = x_1)\mathbb{P}(Y = y_1)\) and all the other relations drop out similarly, so \(X\) and \(Y\) are independent. For the final part, consider \begin{align*} \mathbb{P}(A = -1, B = 1) &= \frac{1}{6} \\ \mathbb{P}(A = -1, B = -1) &= \frac{1}{6} \\ \mathbb{P}(A = 0, B = 0) &= \frac{1}{3} \\ \mathbb{P}(A = 1, B = -1) &= \frac{1}{6} \\ \mathbb{P}(A = 1, B = 1) &= \frac{1}{6} \end{align*} Then \(A\) and \(B\) each take the values \(-1\), \(0\), \(1\) with probability \(\frac13\), and \(\E(AB) = \E(A)\E(B) = 0\), but \(\mathbb{P}(A = 0, B = 0) = \frac13 \neq \frac19\), so \(A\) and \(B\) are not independent.
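
A short script confirming that the joint distribution above has the required properties:

```python
# the joint distribution proposed above
joint = {(-1, 1): 1/6, (-1, -1): 1/6, (0, 0): 1/3, (1, -1): 1/6, (1, 1): 1/6}

pA = {a: sum(p for (x, y), p in joint.items() if x == a) for a in (-1, 0, 1)}
pB = {b: sum(p for (x, y), p in joint.items() if y == b) for b in (-1, 0, 1)}
E_AB = sum(a * b * p for (a, b), p in joint.items())
E_A = sum(a * p for a, p in pA.items())
E_B = sum(b * p for b, p in pB.items())
print(pA, pB)                        # both marginals are uniform on {-1, 0, 1}
print(E_AB, E_A * E_B)               # both zero, so E(AB) = E(A)E(B)
print(joint[(0, 0)], pA[0] * pB[0])  # 1/3 vs 1/9: not independent
```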

1999 Paper 1 Q13
D: 1500.0 B: 1484.0

Bar magnets are placed randomly end-to-end in a straight line. If adjacent magnets have ends of opposite polarities facing each other, they join together to form a single unit. If they have ends of the same polarity facing each other, they stand apart. Find the expectation and variance of the number of separate units in terms of the total number \(N\) of magnets.


Solution: There are \(N-1\) junctions between adjacent magnets, and each junction is independently a break (ends of the same polarity facing) with probability \(\frac12\). Therefore the number of breaks is \(X \sim \mathrm{Binomial}(N-1, \frac12)\), and the number of separate units is \(X+1\), so \begin{align*} \mathbb{E}(X+1) &= \frac{N-1}{2} + 1 = \frac{N+1}{2} \\ \textrm{Var}(X+1) &= \textrm{Var}(X) = \frac{N-1}{4} \end{align*}
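
A Monte Carlo check of these values (the choice of \(N\), the trial count and the seed are arbitrary):

```python
import random

def separate_units(N):
    # 1 + number of junctions where like poles face (each with probability 1/2)
    return 1 + sum(random.random() < 0.5 for _ in range(N - 1))

random.seed(3)
N, trials = 10, 200_000
xs = [separate_units(N) for _ in range(trials)]
m = sum(xs) / trials
v = sum(x * x for x in xs) / trials - m * m
print(m, (N + 1) / 2)  # expectation (N+1)/2
print(v, (N - 1) / 4)  # variance (N-1)/4
```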

1999 Paper 3 Q12
D: 1700.0 B: 1500.0

In the game of endless cricket the scores \(X\) and \(Y\) of the two sides are such that \[ \P (X=j,\ Y=k)=\e^{-1}\frac{(j+k)\lambda^{j+k}}{j!k!},\] for some positive constant \(\lambda\), where \(j,k = 0\), \(1\), \(2\), \(\ldots\).

  1. Find \(\P(X+Y=n)\) for each \(n>0\).
  2. Show that \(2\lambda \e^{2\lambda-1}=1\).
  3. Show that \(2x \e^{2x-1}\) is an increasing function of \(x\) for \(x>0\) and deduce that the equation in (ii) has at most one solution and hence determine \(\lambda\).
  4. Calculate the expectation \(\E(2^{X+Y})\).


Solution:

  1. \begin{align*} && \mathbb{P}(X+Y = n) &= \sum_{i = 0}^n \mathbb{P}(X = i, Y = n-i) \\ &&&= \sum_{i = 0}^n e^{-1} \frac{n \lambda^n}{i! (n-i)!} \\ &&&=e^{-1} n \lambda^n \sum_{i = 0}^n\frac{1}{i! (n-i)!} \\ &&&=\frac{e^{-1} n}{n!} \lambda^n \sum_{i = 0}^n\frac{n!}{i! (n-i)!} \\ &&&= \frac{n\lambda^n}{e n!} 2^n \\ &&&= \frac{n (2 \lambda)^n}{e \cdot n!} \end{align*}
  2. \begin{align*} && 1 &= \sum_{n = 0}^{\infty} \mathbb{P}(X+Y =n ) \\ &&&= \sum_{n = 0}^{\infty}\frac{n (2 \lambda)^n}{e \cdot n!} \\ &&&= \sum_{n = 1}^\infty \frac{ (2 \lambda)^n}{e \cdot (n-1)!} \\ &&&= \frac{2 \lambda}{e}\sum_{n = 0}^\infty \frac{ (2 \lambda)^n}{n!} \\ &&&= \frac{2 \lambda}{e} e^{2\lambda} \\ &&&= 2 \lambda e^{2\lambda - 1} \end{align*}
  3. Consider \(f(x) = 2xe^{2x-1}\), then \begin{align*} && f'(x) &= 2e^{2x-1} + 2xe^{2x-1} \cdot 2 \\ &&&= e^{2x-1} (2 + 4x) > 0 \end{align*} Therefore \(f(x)\) is an increasing function of \(x\) for \(x > 0\), so the equation \(f(x) = 1\) has at most one solution. Since \(f \left( \frac12 \right) = 2 \cdot \frac12 \cdot e^{0} = 1\), we conclude that \(\lambda = \frac12\).
  4. \begin{align*} \mathbb{E}(2^{X+Y}) &= \sum_{n = 0}^\infty \mathbb{P}(X+Y = n) 2^n \\ &= \sum_{n = 1}^\infty \frac{1}{e(n-1)!} 2^{n} \\ &= \frac{2}{e} \sum_{n=0}^\infty \frac{2^n}{n!} \\ &= \frac{2}{e} e^2 \\ &= 2e \end{align*}
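
Parts (ii) and (iv) can be verified numerically with \(\lambda = \frac12\); a sketch, truncating the series where the terms are negligible:

```python
import math

lam = 0.5
# P(X+Y = n) = n (2*lam)^n / (e * n!); terms are negligible beyond n = 60
probs = [n * (2 * lam) ** n / (math.e * math.factorial(n)) for n in range(60)]
print(sum(probs))                                                # should be 1
print(sum(2 ** n * p for n, p in enumerate(probs)), 2 * math.e)  # E(2^(X+Y)) = 2e
```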

1998 Paper 2 Q14
D: 1600.0 B: 1500.0

The staff of Catastrophe College are paid a salary of \(A\) pounds per year. With a Teaching Assessment Exercise impending it is decided to try to lower the student failure rate by offering each lecturer an alternative salary of \(B/(1+X)\) pounds, where \(X\) is the number of his or her students who fail the end of year examination. Dr Doom has \(N\) students, each with independent probability \(p\) of failure. Show that she should accept the new salary scheme if $$A(N+1)p < B(1-(1-p)^{N+1}).$$ Under what circumstances could \(X\), for Dr Doom, be modelled by a Poisson random variable? What would Dr Doom's expected salary be under this model?


Solution: \begin{align*} && \E[\text{salary}] &= B\sum_{k=0}^N \frac{1}{1+k}\binom{N}{k}p^k(1-p)^{N-k} \\ \\ && (q+x)^N &= \sum_{k=0}^N \binom{N}{k}x^kq^{N-k} \\ \Rightarrow && \int_0^p(q+x)^N \d x &= \sum_{k=0}^N \binom{N}{k} \frac{p^{k+1}}{k+1}q^{N-k} \\ && \frac{(q+p)^{N+1}-q^{N+1}}{N+1} &= \frac{p}{B} \E[\text{salary}] \\ \Rightarrow && \E[\text{salary}] &= B\frac{1-q^{N+1}}{p(N+1)} \end{align*} Therefore if \(Ap(N+1) < B(1-(1-p)^{N+1})\) the expected value of the new salary is higher, so Dr Doom should accept the new scheme. (Whether or not the new salary is worth it in a risk-adjusted sense is for the birds.) We could model \(X\) by a Poisson random variable if \(N\) is large and \(p\) is small, with \(\lambda = Np\) of moderate size. Suppose \(X \sim \mathrm{Po}(\lambda)\); then \begin{align*} \E \left [\frac{B}{1+X} \right] &= B\sum_{k=0}^\infty \frac{1}{1+k}\frac{e^{-\lambda}\lambda^k}{k!} \\ &= \frac{B}{\lambda} \sum_{k=0}^\infty e^{-\lambda} \frac{\lambda^{k+1}}{(k+1)!} \\ &= \frac{B}{\lambda}e^{-\lambda}(e^{\lambda}-1) \\ &= \frac{B(1-e^{-\lambda})}{\lambda} = B \frac{1-e^{-Np}}{Np} \end{align*}
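
A numerical check of the closed form against the direct binomial sum, together with the Poisson approximation (the values of \(B\), \(N\), \(p\) are arbitrary):

```python
import math

def expected_new_salary(B, N, p):
    # direct binomial sum of B/(1+k) * P(X = k)
    q = 1 - p
    return B * sum(math.comb(N, k) * p ** k * q ** (N - k) / (1 + k)
                   for k in range(N + 1))

B, N, p = 1000, 30, 0.1
closed_form = B * (1 - (1 - p) ** (N + 1)) / (p * (N + 1))
print(expected_new_salary(B, N, p), closed_form)  # should agree
lam = N * p
print(B * (1 - math.exp(-lam)) / lam)             # Poisson approximation
```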

1997 Paper 1 Q12
D: 1500.0 B: 1500.0

An experiment produces a random number \(T\) uniformly distributed on \([0,1]\). Let \(X\) be the larger root of the equation \[x^{2}+2x+T=0.\] What is the probability that \(X>-1/3\)? Find \(\mathbb{E}(X)\) and show that \(\mathrm{Var}(X)=1/18\). The experiment is repeated independently 800 times generating the larger roots \(X_{1}, X_{2}, \dots, X_{800}\). If \[Y=X_{1}+X_{2}+\dots+X_{800},\] find an approximate value for \(K\) such that \[\mathrm{P}(Y\leqslant K)=0.08.\]


Solution: \((x+1)^2+T-1 = 0\) so the larger root is \(-1 + \sqrt{1-T}\). \begin{align*} && \mathbb{P}(X > -1/3) &= \mathbb{P}(-1 + \sqrt{1-T} > -1/3) \\ &&&= \mathbb{P}(\sqrt{1-T} > 2/3)\\ &&&= \mathbb{P}(1-T > 4/9)\\ &&&= \mathbb{P}\left (T < \frac59 \right) = \frac59 \end{align*} Similarly, for \(t \in [-1,0]\) \begin{align*} && \mathbb{P}(X \leq t) &= \mathbb{P}(-1 + \sqrt{1-T} \leq t) \\ &&&= \mathbb{P}(\sqrt{1-T} \leq t+1)\\ &&&= \mathbb{P}(1-T \leq (t+1)^2)\\ &&&= \mathbb{P}\left (T \geq 1-(t+1)^2\right) = (t+1)^2 \\ \Rightarrow && f_X(t) &= 2(t+1) \\ \Rightarrow && \E[X] &= \int_{-1}^0 x \cdot f_X(x) \d x \\ &&&= \int_{-1}^0 2x(x+1) \d x \\ &&&= \left [\frac23x^3+x^2 \right]_{-1}^0 \\ &&&= -\frac13 \\ && \E[X^2] &= \int_{-1}^0 x^2 \cdot f_X(x) \d x \\ &&&= \int_{-1}^0 2x^2(x+1) \d x \\ &&&= \left [ \frac12 x^4 + \frac23x^3\right]_{-1}^0 \\ &&&= \frac16 \\ \Rightarrow && \var[X] &= \E[X^2] - \left (\E[X] \right)^2 \\ &&&= \frac16 - \frac19 = \frac1{18} \end{align*} Notice that by the central limit theorem \(\frac{Y}{800} \approx N( -\tfrac13, \frac{1}{18 \cdot 800})\). Also notice that \(\Phi^{-1}(0.08) \approx -1.4 \approx -\sqrt{2}\). Therefore we are looking for roughly \(800 \left ( -\frac13 - \sqrt{2} \cdot \frac{1}{\sqrt{18 \cdot 800}} \right ) \approx -267-9 = -276\).
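
A simulation sketch checking \(\E(X) = -\frac13\) and the value of \(K\) (sample sizes and seed arbitrary):

```python
import math
import random

random.seed(4)
n, trials = 800, 10_000
ys = sorted(sum(-1 + math.sqrt(1 - random.random()) for _ in range(n))
            for _ in range(trials))
print(ys[int(0.08 * trials)])  # empirical 8% quantile of Y, roughly -276
print(sum(ys) / trials / n)    # sample mean of X, should be near -1/3
```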

1997 Paper 1 Q14
D: 1484.0 B: 1484.0

The maximum height \(X\) of flood water each year on a certain river is a random variable with density function \begin{equation*} {\mathrm f}(x)= \begin{cases} \exp(-x)&\text{if \(x\geqslant 0\),}\\ 0&\text{otherwise}. \end{cases} \end{equation*} It costs \(y\) megadollars each year to prepare for flood water of height \(y\) or less. If \(X\leqslant y\) no further costs are incurred but if \(X>y\) the cost of flood damage is \(r+s(X-y)\) megadollars where \(r,s>0\). The total cost \(T\) megadollars is thus given by \begin{equation*} T= \begin{cases} y&\text{if \(X\leqslant y\)},\\ y+r+s(X-y)&\text{if \(X>y\)}. \end{cases} \end{equation*} Show that we can minimise the expected total cost by taking \[y=\ln(r+s).\]
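
The stated minimiser can be sanity-checked by estimating \(\E(T)\) over a grid of \(y\) values. A Monte Carlo sketch, with arbitrary choices of \(r\), \(s\), sample size and grid:

```python
import math
import random

random.seed(5)
r, s = 2.0, 3.0
xs = [random.expovariate(1.0) for _ in range(20_000)]  # flood heights X

def avg_total_cost(y):
    # Monte Carlo estimate of E(T) for preparation height y
    return sum(y if x <= y else y + r + s * (x - y) for x in xs) / len(xs)

grid = [i / 50 for i in range(1, 200)]  # y from 0.02 to 3.98
best = min(grid, key=avg_total_cost)
print(best, math.log(r + s))            # argmin should be near ln(r+s)
```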

1997 Paper 3 Q12
D: 1700.0 B: 1500.0

  1. I toss a biased coin which has a probability \(p\) of landing heads and a probability \(q=1-p\) of landing tails. Let \(K\) be the number of tosses required to obtain the first head and let \[ \mathrm{G}(s)=\sum_{k=1}^{\infty}\mathrm{P}(K=k)s^{k}. \] Show that \[ \mathrm{G}(s)=\frac{ps}{1-qs} \] and hence find the expectation and variance of \(K\).
  2. I sample cards at random with replacement from a normal pack of \(52\). Let \(N\) be the total number of draws I make in order to sample every card at least once. By expressing \(N\) as a sum \(N=N_{1}+N_{2}+\cdots+N_{52}\) of random variables, or otherwise, find the expectation of \(N\). Estimate the numerical value of this expectation, using the approximations \(\mathrm{e}\approx2.7\) and \(1+\frac{1}{2}+\frac{1}{3}+\cdots+\frac{1}{n}\approx0.5+\ln n\) if \(n\) is large.


Solution:

  1. Since \(\mathrm{P}(K = k) = q^{k-1}p\), \begin{align*} \mathrm{G}(s) &= \sum_{k=1}^{\infty} pq^{k-1}s^{k} \\ &= ps \sum_{k=1}^{\infty} (qs)^{k-1} \\ &= \frac{ps}{1-qs} \end{align*} Then \(\E[K] = \mathrm{G}'(1) = \frac{p}{(1-q)^2} = \frac{1}{p}\) and \(\var[K] = \mathrm{G}''(1) + \mathrm{G}'(1) - \mathrm{G}'(1)^2 = \frac{2q}{p^2} + \frac{1}{p} - \frac{1}{p^2} = \frac{q}{p^2}\).
  2. Let \(N_i\) be the number of draws between the \((i-1)\)th new card and the \(i\)th new card (where \(N_1 = 1\); for \(i \geq 2\), \(N_i \sim K\) with \(p = \frac{53-i}{52}\)). Therefore \begin{align*} \E[N] &= \E[N_1 + \cdots + N_{52}] \\ &= \E[N_1] + \cdots + \E[N_i] + \cdots + \E[N_{52}] \\ &= 1 + \frac{52}{51} + \cdots + \frac{52}{53-i} + \cdots + \frac{52}{1} \\ &= 52 \left (1 + \frac{1}{2} + \cdots + \frac{1}{52} \right) \\ &\approx 52 \left ( \frac12 + \ln 52 \right) \end{align*} Notice that \(2.7 \times 2.7 = 7.29\) and \(7.3 \times 7.3 \approx 53.3\), so \(\e^4 \approx 53\) and hence \(\ln 52 \approx 4\); our estimate is therefore \(\approx 52 \cdot 4.5 = 234\). [The exact value is \(52 H_{52} \approx 235.98\).]
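
A coupon-collector simulation confirming the estimate (trial count and seed arbitrary):

```python
import random

def draws_to_collect_all(n=52):
    seen, draws = set(), 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        draws += 1
    return draws

random.seed(6)
trials = 20_000
print(sum(draws_to_collect_all() for _ in range(trials)) / trials)  # about 236
print(52 * sum(1 / k for k in range(1, 53)))  # exact 52 * H_52 = 235.978...
```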

1997 Paper 3 Q13
D: 1700.0 B: 1500.0

Let \(X\) and \(Y\) be independent standard normal random variables: the probability density function, \(\f\), of each is therefore given by \[ \f(x)=\left(2\pi\right)^{-\frac{1}{2}}\e^{-\frac{1}{2}x^{2}}. \]

  1. Find the moment generating function \(\mathrm{E}(\e^{\theta X})\) of \(X\).
  2. Find the moment generating function of \(aX+bY\) and hence obtain the condition on \(a\) and \(b\) which ensures that \(aX+bY\) has the same distribution as \(X\) and \(Y\).
  3. Let \(Z=\e^{\mu+\sigma X}\). Show that \[ \mathrm{E}(Z^{\theta})=\e^{\mu\theta+\frac{1}{2}\sigma^{2}\theta^{2}}, \] and hence find the expectation and variance of \(Z\).


Solution:

  1. \(\,\) \begin{align*} && \E[e^{\theta X}] &= \int_{-\infty}^{\infty} e^{\theta x} \frac{1}{\sqrt{2\pi}} e^{-\frac12 x^2 } \d x\\ &&&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac12 x^2+\theta x} \d x\\ &&&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac12 (x^2-2\theta x)} \d x\\ &&&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac12 (x-\theta )^2+\frac12\theta^2 } \d x\\ &&&= e^{\frac12\theta^2 }\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac12 (x-\theta )^2 } \d x\\ &&&=e^{\frac12\theta^2 } \end{align*}
  2. \begin{align*} && M_{aX+bY} (\theta) &= \mathbb{E}[e^{\theta (aX+bY)}] \\ &&&= \mathbb{E}[e^{a\theta X}] \cdot \mathbb{E}[e^{b\theta Y}] \quad \text{by independence} \\ &&&= e^{\frac12(a\theta)^2} \cdot e^{\frac12(b\theta)^2} \\ &&&= e^{\frac12(a^2+b^2)\theta^2} \end{align*} Therefore we need \(a^2+b^2 = 1\).
  3. \(\,\) \begin{align*} && \E[Z^\theta] &= \E[e^{\mu \theta + \sigma \theta X}] \\ &&&= e^{\mu \theta}e^{\frac12 \sigma^2 \theta^2} \\ &&&=e^{\mu \theta + \frac12 \sigma^2 \theta^2} \\ \end{align*} \begin{align*} \mathbb{E}(Z) &= \mathbb{E}[Z^1] \\ &= e^{\mu + \frac12 \sigma^2} \\ \var[Z] &= \E[Z^2] - \left ( \E[Z] \right)^2 \\ &= e^{2 \mu+ 2\sigma^2} - e^{2\mu + \sigma^2} \\ &= e^{2\mu+\sigma^2} \left (e^{\sigma^2}-1 \right) \end{align*} [NB: This is the lognormal distribution]
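
A Monte Carlo check of the lognormal mean and variance formulas (the values of \(\mu\), \(\sigma\), the sample size and the seed are arbitrary):

```python
import math
import random

random.seed(7)
mu, sigma, n = 0.3, 0.8, 400_000
zs = [math.exp(mu + sigma * random.gauss(0, 1)) for _ in range(n)]
m = sum(zs) / n
v = sum(z * z for z in zs) / n - m * m
print(m, math.exp(mu + sigma ** 2 / 2))                              # E(Z)
print(v, math.exp(2 * mu + sigma ** 2) * (math.exp(sigma ** 2) - 1)) # Var(Z)
```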

1997 Paper 3 Q14
D: 1700.0 B: 1516.0

An industrial process produces rectangular plates of mean length \(\mu_{1}\) and mean breadth \(\mu_{2}\). The length and breadth vary independently with non-zero standard deviations \(\sigma_{1}\) and \(\sigma_{2}\) respectively. Find the means and standard deviations of the perimeter and of the area of the plates. Show that the perimeter and area are not independent.


Solution: Let \(L\) and \(B\) denote the length and breadth, so that \(\E[L] = \mu_1\), \(\var[L] = \sigma_1^2\), \(\E[B] = \mu_2\), \(\var[B] = \sigma_2^2\), with \(L\) and \(B\) independent. Then \begin{align*} && \mathbb{E}(\text{perimeter}) &= \E(2(L+B)) \\ &&&= 2\E[L]+2\E[B] \\ &&&= 2(\mu_1+\mu_2) \\ &&\var[\text{perimeter}] &= \E\left [ (2(L+B))^2 \right] - \left ( \E[2(L+B)] \right)^2 \\ &&&= 4\E[L^2+2LB+B^2] - 4(\mu_1+\mu_2)^2 \\ &&&= 4(\sigma_1^2+\mu_1^2+2\mu_1\mu_2+\sigma_2^2+\mu_2^2) - 4(\mu_1+\mu_2)^2\\ &&&= 4(\sigma_1^2+\sigma_2^2) \\ &&\text{sd}[\text{perimeter}] &= 2\sqrt{\sigma_1^2+\sigma_2^2} \\ \\ && \E[\text{area}] &= \E[LB] \\ &&&= \E[L]\E[B] \\ &&&= \mu_1\mu_2 \\ && \var[\text{area}] &= \E[(LB)^2] - \left (\E[LB] \right)^2 \\ &&&= \E[L^2]\E[B^2]-\mu_1^2\mu_2^2 \\ &&&= (\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) -\mu_1^2\mu_2^2 \\ &&&= \sigma_1^2\mu_2^2 + \sigma_2^2\mu_1^2 + \sigma_1^2\sigma_2^2\\ && \text{sd}(\text{area}) &= \sqrt{\sigma_1^2\mu_2^2 + \sigma_2^2\mu_1^2 + \sigma_1^2\sigma_2^2} \\ \\ && \E[\text{perimeter} \cdot \text{area}] &= \E[2(L+B)LB] \\ &&&= 2\E[L^2]\E[B] + 2\E[L]\E[B^2] \\ &&&= 2(\sigma_1^2+\mu_1^2)\mu_2 + 2(\sigma_2^2+\mu_2^2)\mu_1 \\ && \E[\text{perimeter}] \E[\text{area}] &= 2(\mu_1+\mu_2) \cdot \mu_1\mu_2 \end{align*} The two expressions differ by \(2(\sigma_1^2\mu_2 + \sigma_2^2\mu_1)\), which is non-zero since the standard deviations are non-zero (and the mean dimensions are positive). So \(\E[\text{perimeter} \cdot \text{area}] \neq \E[\text{perimeter}]\,\E[\text{area}]\), whereas equality would have to hold if the perimeter and area were independent; hence they are not independent. [See also STEP 2006 Paper 3 Q14]
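
A simulation check of the standard deviations and of the dependence. Normal distributions are used purely for convenience, since only the means and variances matter; the parameter values are arbitrary:

```python
import math
import random

random.seed(8)
mu1, s1, mu2, s2, n = 10.0, 1.0, 5.0, 0.5, 200_000
per, area = [], []
for _ in range(n):
    length = random.gauss(mu1, s1)
    breadth = random.gauss(mu2, s2)
    per.append(2 * (length + breadth))
    area.append(length * breadth)

def sd(vals):
    m = sum(vals) / len(vals)
    return (sum(v * v for v in vals) / len(vals) - m * m) ** 0.5

print(sd(per), 2 * math.sqrt(s1 ** 2 + s2 ** 2))
print(sd(area), math.sqrt(s1**2 * mu2**2 + s2**2 * mu1**2 + s1**2 * s2**2))
cov = (sum(p * a for p, a in zip(per, area)) / n
       - sum(per) / n * (sum(area) / n))
print(cov)  # clearly non-zero, so perimeter and area are dependent
```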

1996 Paper 1 Q13
D: 1500.0 B: 1527.6

I have a Penny Black stamp which I want to sell to my friend Jim, but we cannot agree a price. So I put the stamp under one of two cups, jumble them up, and let Jim guess which one it is under. If he guesses correctly, I add a third cup, jumble them up, and let Jim guess again, adding another cup each time he guesses correctly. The price he pays for the stamp is \(\pounds N,\) where \(N\) is the number of cups present when Jim fails to guess correctly. Find \(\mathrm{P}(N=k)\). Show that \(\mathrm{E}(N)=\mathrm{e}\) and calculate \(\mathrm{Var}(N).\)


Solution: \begin{align*} && \mathbb{P}(N = k) &= \mathbb{P}(\text{correct with } 2, \ldots, k-1 \text{ cups, then wrong with } k)\\ &&&= \frac12 \cdot \frac{1}{3} \cdots \frac{1}{k-1} \cdot \frac{k-1}{k} \\ &&&= \frac{k-1}{k!} \\ &&\mathbb{E}(N) &= \sum_{k=2}^\infty k \cdot \mathbb{P}(N=k) \\ &&&= \sum_{k=2}^{\infty} \frac{k(k-1)}{k!} \\ &&&= \sum_{k=0}^{\infty} \frac{1}{k!} = e \\ && \textrm{Var}(N) &= \mathbb{E}(N^2) - \mathbb{E}(N)^2 \\ && \mathbb{E}(N^2) &= \sum_{k=2}^{\infty} k^2 \mathbb{P}(N=k) \\ &&&= \sum_{k=2}^{\infty} \frac{k^2(k-1)}{k!} \\ &&&= \sum_{k=0}^{\infty} \frac{k+2}{k!} \\ &&&= \sum_{k=0}^{\infty} \frac{k}{k!} + 2 \sum_{k=0}^{\infty} \frac{1}{k!} = e + 2e = 3e \\ \Rightarrow && \textrm{Var}(N) &= 3e-e^2 \end{align*}
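
A quick simulation of the game (seed and trial count arbitrary):

```python
import math
import random

def price():
    cups = 2
    while random.randrange(cups) == 0:  # correct guess with probability 1/cups
        cups += 1
    return cups

random.seed(9)
trials = 500_000
ns = [price() for _ in range(trials)]
m = sum(ns) / trials
v = sum(k * k for k in ns) / trials - m * m
print(m, math.e)                    # E(N) = e
print(v, 3 * math.e - math.e ** 2)  # Var(N) = 3e - e^2
```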

1996 Paper 2 Q14
D: 1600.0 B: 1500.0

The random variable \(X\) is uniformly distributed on \([0,1]\). A new random variable \(Y\) is defined by the rule \[ Y=\begin{cases} 1/4 & \mbox{ if }X\leqslant1/4,\\ X & \mbox{ if }1/4\leqslant X\leqslant3/4\\ 3/4 & \mbox{ if }X\geqslant3/4. \end{cases} \] Find \({\mathrm E}(Y^{n})\) for all integers \(n\geqslant 1\). Show that \({\mathrm E}(Y)={\mathrm E}(X)\) and that \[{\mathrm E}(X^{2})-{\mathrm E}(Y^{2})=\frac{1}{24}.\] By using the fact that \(4^{n}=(3+1)^{n}\), or otherwise, show that \({\mathrm E}(X^{n}) > {\mathrm E}(Y^{n})\) for \(n\geqslant 2\). Suppose that \(Y_{1}\), \(Y_{2}\), \dots are independent random variables each having the same distribution as \(Y\). Find, to a good approximation, \(K\) such that \[{\rm P}(Y_{1}+Y_{2}+\cdots+Y_{240000} < K)=3/4.\]


Solution: \begin{align*} && \E[Y^n] &= \frac14 \cdot \frac1{4^n} + \frac14 \cdot \frac{3^n}{4^n} + \int_{1/4}^{3/4} y^n \d y \\ &&&= \frac{3^n+1}{4^{n+1}} + \left [ \frac{y^{n+1}}{n+1} \right]_{1/4}^{3/4} \\ &&&= \frac{3^n+1}{4^{n+1}} + \frac{3^{n+1}-1}{(n+1)4^{n+1}} \end{align*} \begin{align*} && \E[Y] &= \frac{3+1}{16} + \frac{9-1}{2 \cdot 16} \\ &&&= \frac{1}{4} + \frac{1}{4} = \frac12 = \E[X] \end{align*} \begin{align*} && \E[X^2] &= \int_0^1 x^2 \d x = \frac13 \\ && \E[Y^2] &= \frac{9+1}{64} + \frac{27-1}{3 \cdot 64} = \frac{56}{3 \cdot 64} = \frac{7}{24} \\ \Rightarrow && \E[X^2] - \E[Y^2] &= \frac13 - \frac{7}{24} = \frac{1}{24} \end{align*} \begin{align*} && \E[X^n] &= \frac{1}{n+1} \\ && \E[Y^n] &= \frac{1}{n+1} \frac{1}{4^{n+1}}\left ( (n+1)(3^n+1)+3^{n+1}-1 \right) \\ &&&= \frac{1}{n+1} \frac{1}{4^{n+1}}\left ( 3^{n+1} + (n+1)3^n +n \right) \\ \\ && (3+1)^{n+1} &= 3^{n+1} + (n+1)3^n + \cdots + (n+1) \cdot 3 + 1 \\ &&&> 3^{n+1} + (n+1)3^n + n + 1 \end{align*} for \(n \geq 2\), since the binomial terms omitted on the right then sum to more than \(n+1\) (already \((n+1) \cdot 3 + 1 = 3n + 4 > n + 1\)). Hence \(\E[X^n] > \E[Y^n]\) for \(n \geq 2\). Notice that \(\var(Y) = \frac{7}{24} - \frac14 = \frac1{24}\), so by the central limit theorem: \begin{align*} &&\frac{1}{240\,000} \sum_{i=1}^{240\,000} Y_i &\sim N \left ( \frac12, \frac{1}{24 \cdot 240\,000}\right) \\ \Rightarrow && \mathbb{P}\left (\frac{\frac{1}{240\,000} \sum_{i=1}^{240\,000} Y_i - \frac12}{\frac1{24} \frac{1}{100}} \leq \frac23 \right) &\approx 0.75 \\ \Rightarrow && \mathbb{P} \left ( \sum_i Y_i \leq 240\,000 \cdot \left ( \frac2{3} \frac1{2400}+\frac12 \right) \right ) & \approx 0.75 \\ \Rightarrow && K &= 120\,000 + 66.7 \\ &&&\approx 120\,067 \end{align*} (using \(\Phi^{-1}(0.75) \approx 0.67 \approx \frac23\)).
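
A numerical check of the \(\E(Y^n)\) formula and of the value of \(K\) (step count arbitrary; `NormalDist` from the standard library supplies the normal quantile):

```python
from statistics import NormalDist

def E_Y_pow(n, steps=200_000):
    # numerical E(Y^n) with Y = clamp(X, 1/4, 3/4) and X uniform on [0, 1]
    h = 1 / steps
    return h * sum(min(max((i + 0.5) * h, 0.25), 0.75) ** n
                   for i in range(steps))

for n in range(1, 6):
    formula = (3**n + 1) / 4**(n + 1) + (3**(n + 1) - 1) / ((n + 1) * 4**(n + 1))
    print(E_Y_pow(n), formula, 1 / (n + 1))  # numeric, closed form, E(X^n)

var_Y = 7 / 24 - 1 / 4
K = 120_000 + NormalDist().inv_cdf(0.75) * (240_000 * var_Y) ** 0.5
print(K)  # about 120 067
```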