Problems


2023 Paper 2 Q11
D: 1500.0 B: 1500.0

  1. \(X_1\) and \(X_2\) are both random variables which take values \(x_1, x_2, \ldots, x_n\), with probabilities \(a_1, a_2, \ldots, a_n\) and \(b_1, b_2, \ldots, b_n\) respectively. The value of random variable \(Y\) is defined to be that of \(X_1\) with probability \(p\) and that of \(X_2\) with probability \(q = 1-p\). If \(X_1\) has mean \(\mu_1\) and variance \(\sigma_1^2\), and \(X_2\) has mean \(\mu_2\) and variance \(\sigma_2^2\), find the mean of \(Y\) and show that the variance of \(Y\) is \(p\sigma_1^2 + q\sigma_2^2 + pq(\mu_1 - \mu_2)^2\).
  2. To find the value of random variable \(B\), a fair coin is tossed and a fair six-sided die is rolled. If the coin shows heads, then \(B = 1\) if the die shows a six and \(B = 0\) otherwise; if the coin shows tails, then \(B = 1\) if the die does not show a six and \(B = 0\) if it does. The value of \(Z_1\) is the sum of \(n\) independent values of \(B\), where \(n\) is large. Show that \(Z_1\) is a Binomial random variable with probability of success \(\frac{1}{2}\). Using a Normal approximation, show that the probability that \(Z_1\) is within \(10\%\) of its mean tends to \(1\) as \(n \longrightarrow \infty\).
  3. To find the value of random variable \(Z_2\), a fair coin is tossed and \(n\) fair six-sided dice are rolled, where \(n\) is large. If the coin shows heads, then the value of \(Z_2\) is the number of dice showing a six; if the coin shows tails, then the value of \(Z_2\) is the number of dice not showing a six. Use part (i) to write down the mean and variance of \(Z_2\). Explain why a Normal distribution with this mean and variance will not be a good approximation to the distribution of \(Z_2\). Show that the probability that \(Z_2\) is within \(10\%\) of its mean tends to \(0\) as \(n \longrightarrow \infty\).

2021 Paper 3 Q12
D: 1500.0 B: 1500.0

  1. In a game, each member of a team of \(n\) players rolls a fair six-sided die. The total score of the team is the number of pairs of players rolling the same number. For example, if \(7\) players roll \(3, 3, 3, 3, 6, 6, 2\) the total score is \(7\), as six different pairs of players both score \(3\) and one pair of players both score \(6\). Let \(X_{ij}\), for \(1 \leqslant i < j \leqslant n\), be the random variable that takes the value \(1\) if players \(i\) and \(j\) roll the same number and the value \(0\) otherwise. Show that \(X_{12}\) is independent of \(X_{23}\). Hence find the mean and variance of the team's total score.
  2. Show that, if \(Y_i\), for \(1 \leqslant i \leqslant m\), are random variables with mean zero, then \[ \mathrm{Var}(Y_1 + Y_2 + \cdots + Y_m) = \sum_{i=1}^{m} \mathrm{E}(Y_i^2) + 2\sum_{i=1}^{m-1}\sum_{j=i+1}^{m} \mathrm{E}(Y_i Y_j). \]
  3. In a different game, each member of a team of \(n\) players rolls a fair six-sided die. The total score of the team is the number of pairs of players rolling the same even number minus the number of pairs of players rolling the same odd number. For example, if \(7\) players roll \(3, 3, 3, 3, 6, 6, 2\) the total score is \(-5\). Let \(Z_{ij}\), for \(1 \leqslant i < j \leqslant n\), be the random variable that takes the value \(1\) if players \(i\) and \(j\) roll the same even number, the value \(-1\) if players \(i\) and \(j\) roll the same odd number and the value \(0\) otherwise. Show that \(Z_{12}\) is not independent of \(Z_{23}\). Find the mean of the team's total score and show that the variance of the team's total score is \(\dfrac{1}{36}n(n^2 - 1)\).


Solution:

  1. First note that \(\mathbb{P}(X_{ij} = 1) = \frac16\): whatever player \(i\) rolls, player \(j\) matches it with probability \(\frac16\). \begin{align*} && \mathbb{P}(X_{12} = 1, X_{23} = 1) &= \mathbb{P}(1, 2\text{ and }3\text{ all roll the same})\\ &&&= \frac{6}{6^3}= \frac1{6^2} \\ &&&= \mathbb{P}(X_{12} = 1)\mathbb{P}(X_{23} = 1) \\ && \mathbb{P}(X_{12} = 1, X_{23} = 0) &= \mathbb{P}(1, 2\text{ roll the same and }3\text{ rolls different}) \\ &&&= \frac{6 \cdot 1 \cdot 5}{6^3} = \frac{5}{6^2} \\ &&&= \mathbb{P}(X_{12} = 1)\mathbb{P}(X_{23} = 0) \\ && \mathbb{P}(X_{12} = 0, X_{23} = 0) &= \mathbb{P}(1\text{ and }3\text{ both roll different from }2)\\ &&&= \frac{6 \cdot 5 \cdot 5}{6^3}= \frac{5^2}{6^2} \\ &&&= \mathbb{P}(X_{12} = 0)\mathbb{P}(X_{23} = 0) \end{align*} Therefore they are independent (the remaining case, \(\mathbb{P}(X_{12} = 0, X_{23} = 1)\), follows by symmetry from the second case). Note that the score is \(S = \sum_{i < j} X_{ij}\), and each \(X_{ij}\) is Bernoulli\((\tfrac16)\) with \(\var[X_{ij}] = \tfrac16 \cdot \tfrac56 = \tfrac5{36}\), so \begin{align*} && \E[S] &= \E \left [ \sum_{i < j} X_{ij} \right] \\ &&&= \sum_{i < j} \E \left [ X_{ij} \right] \\ &&&= \sum_{i < j} \frac16 \\ &&&= \binom{n}{2} \frac16 = \frac{n(n-1)}{12} \\ \\ && \var[S] &= \var \left [ \sum_{i < j} X_{ij} \right] \\ &&&= \sum_{i < j} \var \left [X_{ij} \right] \tag{pairwise ind.} \\ &&&= \binom{n}{2} \frac{5}{36} = \frac{5n(n-1)}{72} \end{align*}
  2. Note that \(\mathbb{P}(Z_{ij} = 1)=\mathbb{P}(Z_{ij} = -1) = \frac{3}{6^2} = \frac{1}{12}\), but \(\mathbb{P}(Z_{12} = 1, Z_{23} = -1) = 0 \neq \mathbb{P}(Z_{12} = 1)\mathbb{P}(Z_{23} = -1)\), so \(Z_{12}\) and \(Z_{23}\) are not independent. Notice that \(Z_{12}Z_{23}\) is either \(1\) or \(0\) (since player \(2\)'s roll can't be both odd and even), and \(\mathbb{P}(Z_{12}Z_{23} = 1) = \mathbb{P}(1, 2\text{ and }3\text{ all roll the same number}) = \frac{1}{36}\). Notice also that \(Z_{ij}\) and \(Z_{kl}\) are independent if \(i, j, k, l\) are all distinct, and so, writing \(T = \sum_{i<j} Z_{ij}\) for the total score, \begin{align*} && \E[T] &= \E \left [ \sum_{i < j} Z_{ij} \right] \\ &&&= \sum_{i < j}\E \left [ Z_{ij} \right] \\ &&&= 0 \\ \\ && \E[T^2] &= \E \left [ \left ( \sum_{i < j} Z_{ij} \right)^2 \right] \\ &&&= \sum_{i < j} \E \left [ Z_{ij}^2 \right] + 2\sum_{\text{overlapping pairs}} \E \left[ Z_{ij}Z_{jk}\right] + 2\sum_{\text{disjoint pairs}} \E \left[ Z_{ij}Z_{kl}\right] \\ &&&= \binom{n}{2} \frac{1}{6} + 2 \cdot \frac{n(n-1)(n-2)}{2} \cdot \frac{1}{36} + 0 \\ &&&= \frac{n(n-1)}{12} + \frac{n(n-1)(n-2)}{36} \\ &&&= \frac{n(n-1)[3 + (n-2)]}{36} \\ &&&= \frac{n(n^2-1)}{36} \end{align*} Here \(\E[Z_{ij}^2] = \mathbb{P}(Z_{ij} \neq 0) = \frac16\), and there are \(\frac{n(n-1)(n-2)}{2}\) pairs of pairs sharing exactly one player. Since \(\E[T] = 0\), \(\var[T] = \E[T^2] = \dfrac{n(n^2-1)}{36}\), as required.
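
A quick Monte Carlo sanity check on these formulae (not part of the original solution), assuming only standard-library Python; the team size \(n\) and the number of trials are arbitrary choices.
\begin{verbatim}
import random
import statistics

def scores(n):
    """Return (part (i) score, part (iii) score) for one team of n rolls."""
    rolls = [random.randint(1, 6) for _ in range(n)]
    s = t = 0
    for i in range(n):
        for j in range(i + 1, n):
            if rolls[i] == rolls[j]:
                s += 1                                 # pair rolling the same number
                t += 1 if rolls[i] % 2 == 0 else -1    # signed version for part (iii)
    return s, t

n, trials = 6, 200_000
samples = [scores(n) for _ in range(trials)]
s_vals = [s for s, _ in samples]
t_vals = [t for _, t in samples]
print(statistics.mean(s_vals), n * (n - 1) / 12)           # ~2.5
print(statistics.pvariance(s_vals), 5 * n * (n - 1) / 72)   # ~2.08
print(statistics.mean(t_vals), 0)                           # ~0
print(statistics.pvariance(t_vals), n * (n**2 - 1) / 36)    # ~5.83
\end{verbatim}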

2020 Paper 3 Q11
D: 1500.0 B: 1500.0

The continuous random variable \(X\) is uniformly distributed on \([a,b]\) where \(0 < a < b\).

  1. Let \(\mathrm{f}\) be a function defined for all \(x \in [a,b]\)
    • with \(\mathrm{f}(a) = b\) and \(\mathrm{f}(b) = a\),
    • which is strictly decreasing on \([a,b]\),
    • for which \(\mathrm{f}(x) = \mathrm{f}^{-1}(x)\) for all \(x \in [a,b]\).
    The random variable \(Y\) is defined by \(Y = \mathrm{f}(X)\). Show that \[ \mathrm{P}(Y \leqslant y) = \frac{b - \mathrm{f}(y)}{b - a} \quad \text{for } y \in [a,b]. \] Find the probability density function for \(Y\) and hence show that \[ \mathrm{E}(Y^2) = -ab + \int_a^b \frac{2x\,\mathrm{f}(x)}{b-a} \; \mathrm{d}x. \]
  2. The random variable \(Z\) is defined by \(\dfrac{1}{Z} + \dfrac{1}{X} = \dfrac{1}{c}\) where \(\dfrac{1}{c} = \dfrac{1}{a} + \dfrac{1}{b}\). By finding the variance of \(Z\), show that \[ \ln\left(\frac{b-c}{a-c}\right) < \frac{b-a}{c}. \]

2019 Paper 2 Q12
D: 1500.0 B: 1500.0

The random variable \(X\) has the probability density function on the interval \([0, 1]\): $$f(x) = \begin{cases} nx^{n-1} & 0 \leq x \leq 1, \\ 0 & \text{elsewhere}, \end{cases}$$ where \(n\) is an integer greater than 1.

  1. Let \(\mu = E(X)\). Find an expression for \(\mu\) in terms of \(n\), and show that the variance, \(\sigma^2\), of \(X\) is given by $$\sigma^2 = \frac{n}{(n + 1)^2(n + 2)}.$$
  2. In the case \(n = 2\), show without using decimal approximations that the interquartile range is less than \(2\sigma\).
  3. Write down the first three terms and the \((k + 1)\)th term (where \(0 \leq k \leq n\)) of the binomial expansion of \((1 + x)^n\) in ascending powers of \(x\). By setting \(x = \frac{1}{n}\), show that \(\mu\) is less than the median and greater than the lower quartile. Note: You may assume that $$1 + \frac{1}{1!} + \frac{1}{2!} + \frac{1}{3!} + \cdots < 4.$$


Solution:

  1. \(\,\) \begin{align*} && \mu &= \E[X] \\ &&&= \int_0^1 x f(x) \d x \\ &&&= \int_0^1 nx^n \d x \\ &&&= \frac{n}{n+1} \\ \\ && \var[X] &= \sigma^2 \\ &&&= \E[X^2] - \mu^2 \\ &&&= \int_0^1 x^2 f(x) \d x - \mu^2 \\ &&&= \int_0^1 nx^{n+1} \d x - \mu^2 \\ &&&= \frac{n}{n+2} - \frac{n^2}{(n+1)^2} \\ &&&= \frac{n(n+1)^2 - n^2(n+2)}{(n+1)^2(n+2)} \\ &&&= \frac{n}{(n+1)^2(n+2)} \end{align*}
  2. \(\,\) \begin{align*} && \frac14 &= \int_0^{Q_1} 2x \d x \\ &&&= Q_1^2 \\ \Rightarrow && Q_1 &= \frac12 \\ && \frac34 &= \int_0^{Q_3} 2x \d x \\ &&&= Q_3^2 \\ \Rightarrow && Q_3 &= \frac{\sqrt{3}}2 \\ \\ \Rightarrow && IQR &= Q_3 - Q_1 = \frac{\sqrt{3}-1}{2} \\ && 2 \sigma &= 2\sqrt{\frac{2}{3^2 \cdot 4}} \\ &&&= \frac{\sqrt{2}}{3} \\ \\ && 2\sigma - IQR &= \frac{\sqrt{2}}{3} - \frac{\sqrt{3}-1}{2} \\ &&&= \frac{2\sqrt{2}-3\sqrt{3}+3}{6} \\ && (3+2\sqrt{2})^2 &= 17+12\sqrt{2} > 17 + 12 = 29 \\ && (3\sqrt{3})^2 &= 27 \end{align*} Since \(29 > 27\), we have \(3 + 2\sqrt{2} > 3\sqrt{3}\), so \(2\sigma - IQR > 0\); that is, the interquartile range is less than \(2\sigma\).
  3. \[ (1+x)^n = 1 + nx + \frac{n(n-1)}2 x^2 + \cdots + \binom{n}{k} x^k+ \cdots \] \begin{align*} && Q_1^{-n} &= 4 \\ && Q_2^{-n} &= 2\\ && \mu &=\frac{n}{n+1} \\ \Rightarrow && \mu^{-n} &= \left (1 + \frac1n \right)^n\\ &&&= 1 + n \frac1n + \cdots > 2 \\ \Rightarrow && \mu &< Q_2 \\ \\ && \mu^{-n} &= \left (1 + \frac1n \right)^n\\ &&&= 1 + n \frac1n + \frac{n(n-1)}{2!} \frac{1}{n^2} + \cdots + \frac{n(n-1) \cdots (n-k+1)}{k!} \frac{1}{n^k} + \cdots \\ &&&= 1 + 1 + \left (1 - \frac1n \right ) \frac1{2!} + \cdots + \left (1 - \frac1n \right)\cdot\left (1 - \frac2n \right) \cdots \left (1 - \frac{k-1}n \right) \frac{1}{k!} + \cdots \\ &&&< 1 + 1 + \frac1{2!} + \cdots + \frac1{n!} \\ &&&< 4 \\ \Rightarrow && \mu &> Q_1 \end{align*} (In the first chain the omitted terms are positive because \(n \ge 2\), giving the strict inequality \(\mu^{-n} > 2\); the final \(< 4\) uses the given series bound. Since \(t \mapsto t^{-n}\) is decreasing for \(t > 0\), \(\mu^{-n} > 2 = Q_2^{-n}\) gives \(\mu < Q_2\) and \(\mu^{-n} < 4 = Q_1^{-n}\) gives \(\mu > Q_1\).)
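
A quick numerical check of parts (ii) and (iii), not part of the original solution; it assumes standard-library Python, and the values of \(n\) tried are arbitrary. The quantiles come from inverting the cdf \(F(x) = x^n\).
\begin{verbatim}
import math

# part (ii): n = 2
n = 2
Q1, median, Q3 = 0.25 ** (1 / n), 0.5 ** (1 / n), 0.75 ** (1 / n)
sigma = math.sqrt(n / ((n + 1) ** 2 * (n + 2)))
print(Q3 - Q1, 2 * sigma)          # IQR ~ 0.366 < 2*sigma ~ 0.471

# part (iii): lower quartile < mu < median, tried for several n
for n in (2, 3, 10, 100):
    mu = n / (n + 1)
    print(n, 0.25 ** (1 / n) < mu < 0.5 ** (1 / n))   # expect True each time
\end{verbatim}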

2018 Paper 1 Q11
D: 1500.0 B: 1513.7

A bag contains three coins. The probabilities of their showing heads when tossed are \(p_1\), \(p_2\) and \(p_3\).

  1. A coin is taken at random from the bag and tossed. What is the probability that it shows a head?
  2. A coin is taken at random from the bag (containing three coins) and tossed; the coin is returned to the bag and again a coin is taken at random from the bag and tossed. Let \(N_1\) be the random variable whose value is the number of heads shown on the two tosses. Find the expectation of \(N_1\) in terms of \(p\), where \(p = \frac{1}{3}(p_1+p_2+p_3)\,\), and show that \(\var(N_1) =2p(1-p)\,\).
  3. Two of the coins are taken at random from the bag (containing three coins) and tossed. Let \(N_2\) be the random variable whose value is the number of heads showing on the two coins. Find \(\E(N_2)\) and \(\var(N_2)\).
  4. Show that \(\var(N_2)\le \var(N_1)\), with equality if and only if \(p_1=p_2=p_3\,\).


Solution:

  1. \(\mathbb{P}(\text{head}) = \mathbb{P}(\text{head}|1)\mathbb{P}(\text{coin 1}) + \mathbb{P}(\text{head}|2)\mathbb{P}(\text{coin 2})+\mathbb{P}(\text{head}|3)\mathbb{P}(\text{coin 3}) = \frac13(p_1+p_2+p_3)\)
  2. \(N_1 = X_1 + X_2\) where \(X_i \sim Bernoulli(p)\), therefore \(\mathbb{E}(N_1) = 2p\) and \(\textrm{Var}(N_1) = \textrm{Var}(X_1)+ \textrm{Var}(X_2) = p(1-p)+p(1-p) = 2p(1-p)\)
  3. Let \(Y_i\) be the indicator of the event that the \(i\)th coin drawn shows heads. Then \(\mathbb{E}(Y_i) = p\) and so \(\mathbb{E}(N_2) = 2p\). Since \(N_2^2\) equals \(4\) when both coins show heads and \(1\) when exactly one does, \begin{align*} && \textrm{Var}(N_2) &= \mathbb{E}(N_2^2) - [\mathbb{E}(N_2)]^2\\ &&&= 2^2 \cdot \left (\frac13 \left (p_1p_2+p_2p_3+p_3p_1 \right) \right) + 1 \cdot \left (\frac13 \left (p_1 (1-p_2) + (1-p_1)p_2 + p_2(1-p_3) +(1-p_2)p_3 + p_3(1-p_1) + (1-p_3)p_1 \right) \right) - [\mathbb{E}(N_2)]^2 \\ &&&= \frac43\left (p_1p_2+p_2p_3+p_3p_1 \right) + \frac13 \left ( 2(p_1+p_2+p_3) - 2(p_1p_2+p_2p_3+p_3p_1)\right)-[\mathbb{E}(N_2)]^2 \\ &&&= \frac23\left (p_1p_2+p_2p_3+p_3p_1 \right) + \frac23 \left ( p_1+p_2+p_3 \right)-[\mathbb{E}(N_2)]^2\\ &&&= \frac23\left (p_1p_2+p_2p_3+p_3p_1 \right) + \frac23 \left ( p_1+p_2+p_3 \right)-\left[\frac23(p_1+p_2+p_3)\right]^2\\ &&&= \frac23\left (p_1p_2+p_2p_3+p_3p_1 \right) +2p(1-2p)\\ \end{align*}
  4. \(\,\) \begin{align*} && \textrm{Var}(N_1) - \textrm{Var}(N_2) &= 2p(1-p) - \left (\frac23\left (p_1p_2+p_2p_3+p_3p_1 \right) +2p(1-2p) \right) \\ &&&= 2p^2-\frac23\left (p_1p_2+p_2p_3+p_3p_1 \right) \\ &&&= \frac23 \left ( \frac13(p_1+p_2+p_3)^2 -\left (p_1p_2+p_2p_3+p_3p_1 \right)\right)\\ &&&= \frac29 \left (p_1^2+p_2^2+p_3^2 -(p_1p_2+p_2p_3+p_3p_1) \right)\\ &&&= \frac19 \left ((p_1-p_2)^2+(p_2-p_3)^2+(p_3-p_1)^2 \right) &\geq 0 \end{align*} with equality iff \(p_1 = p_2 = p_3\)
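
The algebra above can be checked by simulation (not part of the original solution). The sketch below assumes standard-library Python; the probabilities \(p_1, p_2, p_3\) and the number of trials are arbitrary.
\begin{verbatim}
import random
import statistics

p = [0.2, 0.5, 0.9]                       # arbitrary p1, p2, p3
pbar = sum(p) / 3
trials = 200_000

def n1():
    # part (ii): two draws with replacement, toss each drawn coin once
    return sum(random.random() < p[random.randrange(3)] for _ in range(2))

def n2():
    # part (iii): draw two distinct coins, toss each once
    i, j = random.sample(range(3), 2)
    return (random.random() < p[i]) + (random.random() < p[j])

s1 = [n1() for _ in range(trials)]
s2 = [n2() for _ in range(trials)]
sym = p[0] * p[1] + p[1] * p[2] + p[2] * p[0]
print(statistics.mean(s1), statistics.mean(s2), 2 * pbar)            # both ~ 2p
print(statistics.pvariance(s1), 2 * pbar * (1 - pbar))                # Var(N1)
print(statistics.pvariance(s2), 2 * sym / 3 + 2 * pbar * (1 - 2 * pbar))  # Var(N2)
\end{verbatim}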

2017 Paper 3 Q13
D: 1700.0 B: 1500.0

The random variable \(X\) has mean \(\mu\) and variance \(\sigma^2\), and the function \({\rm V}\) is defined, for \(-\infty < x < \infty\), by \[ {\rm V}(x) = \E \big( (X-x)^2\big) . \] Express \({\rm V}(x)\) in terms of \(x\), \( \mu\) and \(\sigma\). The random variable \(Y\) is defined by \(Y={\rm V}(X)\). Show that \[ \E(Y) = 2 \sigma^2 \, . \tag{\(*\)} \] Now suppose that \(X\) is uniformly distributed on the interval \(0\le x \le1\,\). Find \({\rm V}(x)\,\). Find also the probability density function of \(Y\!\) and use it to verify that \((*)\) holds in this case.


Solution: \begin{align*} {\rm V}(x) &= \E \big( (X-x)^2\big) \\ &= \E \l X^2 - 2xX + x^2\r \\ &= \E [ X^2 ]- 2x\E[X] + x^2 \\ &= \sigma^2+\mu^2 - 2x\mu + x^2 \\ &= \sigma^2 + (\mu - x)^2 \end{align*} \begin{align*} \E[Y] &= \E[\sigma^2 + (\mu - X)^2] \\ &= \sigma^2 + \E[(\mu - X)^2]\\ &= \sigma^2 + \sigma^2 \\ &= 2\sigma^2 \end{align*} If \(X \sim U(0,1)\) then \(V(x) = \frac{1}{12} + (\frac12 - x)^2\). \begin{align*} \P(Y \leq y) &= \P(\frac1{12} + (\frac12 - X)^2 \leq y) \\ &= \P((\frac12 -X)^2 \leq y - \frac1{12}) \\ &= \P(|\frac12 -X| \leq \sqrt{y - \frac1{12}}) \\ &= \begin{cases} 1 & \text{if } y - \frac1{12} > \frac14 \\ 2 \sqrt{y - \frac1{12}} & \text{if } \frac14 > y - \frac1{12} > 0 \\ \end{cases} \\ &= \begin{cases} 1 & \text{if } y> \frac13 \\ \sqrt{4y - \frac1{3}} & \text{if } \frac13 > y > \frac1{12} \\ \end{cases} \end{align*} Therefore $f_Y(y) = \begin{cases} \frac{2}{\sqrt{4y-\frac{1}{3}}} & \text{if } \frac1{12} < y < \frac13 \\ 0 & \text{otherwise} \end{cases}$ \begin{align*} \E[Y] &= \int_{1/12}^{1/3} \frac{2x}{\sqrt{4x-\frac13}} \, dx \\ &= 2\int_{u = 0}^{u=1} \frac{\frac{1}{4}u +\frac1{12}}{\sqrt{u}} \,\frac{1}{4} du \tag{\(u = 4x - \frac13, \frac{du}{dx} = 4\)}\\ &= \frac{1}{2 \cdot 12}\int_{u = 0}^{u=1} 3\sqrt{u} +\frac{1}{\sqrt{u}} \, du \\ &= \frac{1}{2 \cdot 12} \left [2 u^{3/2} + 2u^{1/2} \right ]_0^1 \\ &= \frac{1}{2 \cdot 12} \cdot 4 \\ &= \frac{1}{6} = 2\sigma^2 \end{align*} as required, since \(\sigma^2 = \frac1{12}\) for the uniform distribution on \([0,1]\).
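
A quick Monte Carlo confirmation of \((*)\) in the uniform case (not part of the original solution), assuming standard-library Python; the sample size is arbitrary.
\begin{verbatim}
import random
import statistics

# X ~ U(0,1), Y = V(X) = 1/12 + (1/2 - X)^2; (*) predicts E(Y) = 2*sigma^2 = 1/6
samples = [1 / 12 + (0.5 - random.random()) ** 2 for _ in range(500_000)]
print(statistics.mean(samples), 1 / 6)
\end{verbatim}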

2015 Paper 2 Q13
D: 1600.0 B: 1516.0

The maximum height \(X\) of flood water each year on a certain river is a random variable with probability density function \(\f\) given by \[ \f(x) = \begin{cases} \lambda \e^{-\lambda x} & \text{for \(x\ge0\)}\,, \\ 0 & \text{otherwise,} \end{cases} \] where \(\lambda\) is a positive constant. It costs \(ky\) pounds each year to prepare for flood water of height \(y\) or less, where \(k\) is a positive constant and \(y\ge0\). If \(X \le y\) no further costs are incurred but if \(X> y\) the additional cost of flood damage is \(a(X - y )\) pounds where \(a\) is a positive constant.

  1. Let \(C\) be the total cost of dealing with the floods in the year. Show that the expectation of \(C\) is given by \[\mathrm{E}(C)=ky+\frac{a}{\lambda}\mathrm{e}^{-\lambda y} \, . \] How should \(y\) be chosen in order to minimise \(\mathrm{E}(C)\), in the different cases that arise according to the value of \(a/k\)?
  2. Find the variance of \(C\), and show that the more that is spent on preparing for flood water in advance the smaller this variance.


Solution:

  1. \(\,\) \begin{align*} && \mathbb{E}(C) &= \int_0^\infty \text{cost}(x) f(x) \d x \\ &&&= ky + \int_y^{\infty} a(x-y) \lambda e^{-\lambda x} \d x\\ &&&= ky + \int_0^{\infty} a u \lambda e^{-\lambda u -\lambda y} \d u \tag{\(u = x - y\)} \\ &&&= ky + ae^{-\lambda y} \left( \left [ -ue^{-\lambda u} \right]_0^\infty +\int_0^\infty e^{-\lambda u} \d u\right) \\ &&&= ky + \frac{a}{\lambda}e^{-\lambda y} \\ \\ && \frac{\d \mathbb{E}(C)}{\d y} &= k - ae^{-\lambda y} \\ \Rightarrow && y &= \frac{1}{\lambda}\ln \left ( \frac{a}{k} \right) \end{align*} The second derivative, \(a\lambda e^{-\lambda y}\), is positive, so this stationary point is a minimum. Hence if \(\frac{a}{k} > 1\) the optimal choice is \(y = \frac{1}{\lambda}\ln \left ( \frac{a}{k} \right)\); if \(\frac{a}{k} \le 1\) then \(\mathbb{E}(C)\) is increasing for all \(y \ge 0\), so nothing should be spent preparing (take \(y = 0\)).
  2. \begin{align*} && \mathbb{E}(C^2) &= \int_0^{\infty} \text{cost}(x)^2 f(x) \d x \\ &&&= \int_0^{\infty}(ky + a(x-y)\mathbb{1}_{x > y})^2 f(x) \d x \\ &&&= k^2y^2 + \int_y^{\infty}2kya(x-y)f(x)\d x + \int_y^{\infty}a^2 (x-y)^2 f(x) \d x \\ &&&= k^2y^2 + \frac{2kya}{\lambda}e^{- \lambda y}+a^2e^{-\lambda y}\int_{u=0}^\infty u^2 \lambda e^{-\lambda u} \d u \\ &&&= k^2y^2 + \frac{2kya}{\lambda}e^{-\lambda y}+a^2e^{-\lambda y}\left(\textrm{Var}(Exp(\lambda)) + \mathbb{E}(Exp(\lambda))^2\right)\\ &&&= k^2y^2 + \frac{2kya}{\lambda}e^{-\lambda y} + a^2e^{-\lambda y} \frac{2}{\lambda^2} \\ && \textrm{Var}(C) &= k^2y^2 + \frac{2kya}{\lambda}e^{-\lambda y} + a^2e^{-\lambda y} \frac{2}{\lambda^2} - \left ( ky + \frac{a}{\lambda} e^{-\lambda y}\right)^2 \\ &&&= a^2e^{-\lambda y} \frac{2}{\lambda^2} - a^2 e^{-2\lambda y}\frac{1}{\lambda^2} \\ &&&= \frac{a^2}{\lambda^2} e^{-\lambda y}\left (2 - e^{-\lambda y} \right) \\ \\ && \frac{\d \textrm{Var}(C)}{\d y} &= \frac{a^2}{\lambda^2} \left (-2\lambda e^{-\lambda y} +2\lambda e^{-2\lambda y} \right) \\ &&&= \frac{2a^2}{\lambda} e^{-\lambda y}\left (e^{-\lambda y}-1 \right) \leq 0 \end{align*} so \(\textrm{Var}(C)\) is decreasing in \(y\): the more that is spent in advance on preparing for flood water (the larger \(y\)), the smaller the variance of the total cost.
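
These expressions can be sanity-checked by simulation (not part of the original solution). The sketch assumes standard-library Python; the values of \(\lambda\), \(a\), \(k\) and the preparation height \(y\) are arbitrary.
\begin{verbatim}
import math
import random
import statistics

lam, a, k = 0.5, 10.0, 2.0
y = math.log(a / k) / lam              # optimal y from part (i), since a/k > 1 here

def cost():
    x = random.expovariate(lam)        # flood height X
    return k * y + (a * (x - y) if x > y else 0.0)

samples = [cost() for _ in range(400_000)]
print(statistics.mean(samples), k * y + (a / lam) * math.exp(-lam * y))
print(statistics.pvariance(samples),
      (a / lam) ** 2 * math.exp(-lam * y) * (2 - math.exp(-lam * y)))
\end{verbatim}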

2013 Paper 2 Q12
D: 1600.0 B: 1484.0

The random variable \(U\) has a Poisson distribution with parameter \(\lambda\). The random variables \(X\) and \(Y\) are defined as follows. \begin{align*} X&= \begin{cases} U & \text{ if \(U\) is 1, 3, 5, 7, \(\ldots\,\)} \\ 0 & \text{ otherwise} \end{cases} \\ Y&= \begin{cases} U & \text{ if \(U\) is 2, 4, 6, 8, \(\ldots\,\) } \\ 0 & \text{ otherwise} \end{cases} \end{align*}

  1. Find \(\E(X)\) and \(\E(Y)\) in terms of \(\lambda\), \(\alpha\) and \(\beta\), where \[ \alpha = 1+\frac{\lambda^2}{2!}+\frac{\lambda^4}{4!} +\cdots\, \text{ \ \ and \ \ } \beta = \frac{\lambda}{1!} + \frac{\lambda^3}{3!} + \frac{\lambda^5}{5!} +\cdots\,. \]
  2. Show that \[ \var(X) = \frac{\lambda\alpha+\lambda^2\beta}{\alpha+\beta} - \frac{\lambda^2\alpha^2}{(\alpha+\beta)^2} \] and obtain the corresponding expression for \(\var(Y)\). Are there any non-zero values of \(\lambda\) for which \( \var(X) + \var(Y) = \var(X+Y)\,\)?


Solution:

  1. \begin{align*} \mathbb{E}(X) &= \sum_{r=1}^\infty r \mathbb{P}(X = r) \\ &= \sum_{j=1}^{\infty} (2j-1)\mathbb{P}(U=2j-1) \\ &= \sum_{j=1}^{\infty}(2j-1) \frac{e^{-\lambda} \lambda^{2j-1}}{(2j-1)!} \\ &= \sum_{j=1}^{\infty} e^{-\lambda} \frac{\lambda^{2j-1}}{(2j-2)!} \\ &= \lambda e^{-\lambda} \sum_{j=1}^{\infty} \frac{\lambda^{2j-2}}{(2j-2)!} \\ &= \lambda e^{-\lambda} \alpha \end{align*} Since \(X + Y = U\), \(\mathbb{E}(X+Y) = \lambda\), so \(\mathbb{E}(Y) = \lambda(1-e^{-\lambda}\alpha) = \lambda(e^{-\lambda}(\alpha+\beta) - e^{-\lambda}\alpha) = \lambda e^{-\lambda} \beta\), using \(\alpha + \beta = e^{\lambda}\). Equivalently, \(\mathbb{E}(X) = \frac{\lambda \alpha}{\alpha+\beta}\) and \(\mathbb{E}(Y) = \frac{\lambda \beta}{\alpha+\beta}\).
  2. \begin{align*} \textrm{Var}(X) &= \mathbb{E}(X^2) - [\mathbb{E}(X) ]^2 \\ &= \sum_{odd} r^2 \mathbb{P}(U = r) - \left [ \mathbb{E}(X) \right]^2 \\ &= \sum_{odd} (r(r-1)+r)\frac{e^{-\lambda}\lambda^r}{r!} - \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} \\ &= \sum_{odd} \frac{e^{-\lambda}\lambda^r}{(r-2)!}+\sum_{odd} \frac{e^{-\lambda}\lambda^r}{(r-1)!} - \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} \\ &= e^{-\lambda}\lambda^2 \beta + e^{-\lambda}\lambda \alpha - \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} \\ &= \frac{\lambda \alpha + \lambda^2 \beta}{\alpha+\beta}- \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} \end{align*} Similarly, \begin{align*} \textrm{Var}(Y) &= \mathbb{E}(Y^2) - [\mathbb{E}(Y) ]^2 \\ &= \sum_{even} r^2 \mathbb{P}(U = r) - \left [ \mathbb{E}(Y) \right]^2 \\ &= \sum_{even} (r(r-1)+r)\frac{e^{-\lambda}\lambda^r}{r!} - \frac{\lambda^2 \beta^2}{(\alpha+\beta)^2} \\ &= e^{-\lambda}\lambda^2\alpha + e^{-\lambda}\lambda \beta - \frac{\lambda^2 \beta^2}{(\alpha+\beta)^2} \\ &= \frac{\lambda \beta + \lambda^2 \alpha}{\alpha+\beta}- \frac{\lambda^2 \beta^2}{(\alpha+\beta)^2} \end{align*} Since \(\textrm{Var}(X+Y) = \textrm{Var}(U) = \lambda\), we are interested in solving: \begin{align*} \lambda &= \frac{\lambda \alpha + \lambda^2 \beta}{\alpha+\beta}- \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} + \frac{\lambda \beta + \lambda^2 \alpha}{\alpha+\beta}- \frac{\lambda^2 \beta^2}{(\alpha+\beta)^2} \\ &= \frac{\lambda(\alpha+\beta) + \lambda^2(\alpha+\beta)}{\alpha+\beta} - \frac{\lambda^2(\alpha^2+\beta^2)}{(\alpha+\beta)^2} \\ &= \lambda + \lambda^2 \frac{(\alpha+\beta)^2 - (\alpha^2+\beta^2)}{(\alpha+\beta)^2} \\ &= \lambda + \lambda^2 \frac{2\alpha\beta}{(\alpha+\beta)^2} \end{align*} which is impossible for \(\lambda \neq 0\), since then \(\alpha\) and \(\beta\) are both positive and the final term is strictly positive; so there are no non-zero values of \(\lambda\) for which \(\textrm{Var}(X) + \textrm{Var}(Y) = \textrm{Var}(X+Y)\).
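
As a check on the last step (not part of the original solution), note that \(\alpha\) and \(\beta\) are the even and odd parts of \(e^{\lambda}\), i.e. \(\alpha = \cosh\lambda\) and \(\beta = \sinh\lambda\). The snippet below, assuming standard-library Python and an arbitrary \(\lambda\), confirms numerically that \(\var(X)+\var(Y)\) exceeds \(\var(X+Y)=\lambda\) by exactly \(2\lambda^2\alpha\beta/(\alpha+\beta)^2\).
\begin{verbatim}
import math

lam = 1.7                                       # arbitrary non-zero lambda
alpha, beta = math.cosh(lam), math.sinh(lam)    # even / odd parts of e^lambda
meanX = lam * alpha / (alpha + beta)
meanY = lam * beta / (alpha + beta)
varX = (lam * alpha + lam**2 * beta) / (alpha + beta) - meanX**2
varY = (lam * beta + lam**2 * alpha) / (alpha + beta) - meanY**2
gap = 2 * lam**2 * alpha * beta / (alpha + beta) ** 2
print(varX + varY, lam + gap)                   # equal, and strictly bigger than lam
\end{verbatim}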

2013 Paper 3 Q12
D: 1700.0 B: 1500.0

A list consists only of letters \(A\) and \(B\) arranged in a row. In the list, there are \(a\) letter \(A\)s and \(b\) letter \(B\)s, where \(a\ge2\) and \(b\ge2\), and \(a+b=n\). Each possible ordering of the letters is equally probable. The random variable \(X_1\) is defined by \[ X_1 = \begin{cases} 1 & \text{if the first letter in the row is \(A\)};\\ 0 & \text{otherwise.} \end{cases} \] The random variables \(X_k\) (\(2 \le k \le n\)) are defined by \[ X_k = \begin{cases} 1 & \text{if the \((k-1)\)th letter is \(B\) and the \(k\)th is \(A\)};\\ 0 & \text{otherwise.} \end{cases} \] The random variable \(S\) is defined by \(S = \sum\limits_ {i=1}^n X_i\,\).

  1. Find expressions for \(\E(X_i)\), distinguishing between the cases \(i=1\) and \(i\ne1\), and show that \(\E(S)= \dfrac{a(b+1)}n\,\).
  2. Show that:
    1. for \(j\ge3\), \(\E(X_1X_j) = \dfrac{a(a-1)b}{n(n-1)(n-2)}\,\);
    2. \[ \sum\limits_{i=2}^{n-2} \bigg( \sum\limits_{j=i+2}^n \E(X_iX_j)\bigg) = \dfrac{a(a-1)b(b-1)}{2n(n-1)}\,\]
    3. \(\var(S) = \dfrac {a(a-1)b(b+1)}{n^2(n-1)}\,\).


Solution:

  1. Notice that \(\E[X_1] = \mathbb{P}(\text{first letter is }A) = \frac{a}{n}\). For \(i > 1\), \(\E[X_i] = \mathbb{P}(X_i = 1) = \frac{b}{n} \cdot \frac{a}{n-1}\), since the \((i-1)\)th letter is \(B\) with probability \(\frac{b}{n}\) and, given that, the \(i\)th letter is \(A\) with probability \(\frac{a}{n-1}\). So \begin{align*} && \E[S] &= \E[X_1] + \sum_{i=2}^n \E[X_i] \\ &&&= \frac{a}{n} + (n-1) \frac{ab}{n(n-1)} \\ &&&= \frac{a(b+1)}{n} \end{align*}
    1. The probability that \(X_1X_j = 1\) is \(\frac{a}{n} \cdot \frac{b}{n-1} \cdot \frac{a-1}{n-2} = \frac{a(a-1)b}{n(n-1)(n-2)}\): the first letter is an \(A\) with probability \(\frac{a}{n}\); given that, the \((j-1)\)th letter is a \(B\) with probability \(\frac{b}{n-1}\); and given both, the \(j\)th letter is an \(A\) with probability \(\frac{a-1}{n-2}\) (only the numbers of each letter remaining matter, not the positions). Therefore \(\E[X_1X_j] = \frac{a(a-1)b}{n(n-1)(n-2)}\,\).
    2. \(\E[X_iX_j]\) when the pairs don't overlap is \(\frac{a}{n} \frac{b}{n-1} \frac{a-1}{n-2} \frac{b-1}{n-3}\), and so \begin{align*} && \sum\limits_{i=2}^{n-2} \bigg( \sum\limits_{j=i+2}^n \E(X_iX_j)\bigg) &= \sum\limits_{i=2}^{n-2} \bigg( \sum\limits_{j=i+2}^n \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)}\bigg) \\ &&&= \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)}\sum\limits_{i=2}^{n-2} \bigg( \sum\limits_{j=i+2}^n 1\bigg) \\ &&&= \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)}\sum\limits_{i=2}^{n-2} (n-(i+1)) \\ &&&= \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)} \left ((n-1)(n-3)-\frac{(n-2)(n-1)}{2}+1 \right) \\ &&&= \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)} \left ( \frac{2n^2-8n+6-n^2+3n-2+2}{2}\right) \\ &&&= \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)} \left ( \frac{n^2-5n+6}{2}\right) \\ &&&= \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)} \cdot \frac{(n-2)(n-3)}{2} \\ &&&= \frac{a(a-1)b(b-1)}{2n(n-1)} \end{align*}
    3. We also need to consider the other cross terms. \(X_iX_{i+1}=0\). (Since \(X_i = 1\) means the \(i\)th letter is \(A\) and \(X_{i+1} = 1\) means the \(i\)th letter is \(B\)). It's the same story for \(X_1X_2\), and so all the cross terms are accounted for. Therefore \begin{align*} && \E[S^2] &= \E \left [\sum X_i^2 + 2\sum_{i < j} X_i X_j \right] \\ &&&= \frac{a(b+1)}{n} +2(n-2)\frac{a(a-1)b}{n(n-1)(n-2)}+ 2 \frac{a(a-1)b(b-1)}{2n(n-1)} \\ &&&= \frac{a(b+1)}{n} +\frac{2a(a-1)b}{n(n-1)} + \frac{a(a-1)b(b-1)}{n(n-1)} \\ &&&= \frac{a(b+1)}{n} +\frac{a(a-1)b(b+1)}{n(n-1)} \\ && \var[S] &= \E[S^2] - \left ( \E[S] \right)^2 \\ &&&= \frac{a(b+1)}{n} + \frac{a(a-1)b(b+1)}{n(n-1)} - \frac{a^2(b+1)^2}{n^2} \\ &&&= \frac{a(b+1) \left (n(n-1) + (a-1)b n -a(b+1)(n-1) \right)}{n^2(n-1)} \\ &&&= \frac{a(b+1) \left ( (n-a)(n-b-1) \right)}{n^2(n-1)} \\ &&&= \frac{a(b+1) \left ( b(a-1) \right)}{n^2(n-1)} \\ \end{align*} (The bracket \(n(n-1) + (a-1)bn - a(b+1)(n-1)\) factorises as \((n-a)(n-b-1)\), which is easiest to check by substituting \(n = a + b\).)
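
A simulation check of \(\E(S)\) and \(\var(S)\) (not part of the original solution), assuming standard-library Python; the values of \(a\) and \(b\) and the number of shuffles are arbitrary.
\begin{verbatim}
import random
import statistics

a, b = 4, 5
n = a + b

def S():
    row = list("A" * a + "B" * b)
    random.shuffle(row)                    # a uniformly random ordering
    s = 1 if row[0] == "A" else 0          # X_1
    s += sum(1 for k in range(1, n) if row[k - 1] == "B" and row[k] == "A")  # X_2..X_n
    return s

samples = [S() for _ in range(300_000)]
print(statistics.mean(samples), a * (b + 1) / n)
print(statistics.pvariance(samples), a * (a - 1) * b * (b + 1) / (n**2 * (n - 1)))
\end{verbatim}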

2012 Paper 2 Q13
D: 1600.0 B: 1516.0

In this question, you may assume that \(\displaystyle \int_0^\infty \!\!\! \e^{-x^2/2} \d x = \sqrt{\tfrac12 \pi}\,\). The number of supermarkets situated in any given region can be modelled by a Poisson random variable, where the mean is \(k\) times the area of the given region. Find the probability that there are no supermarkets within a circle of radius \(y\). The random variable \(Y\) denotes the distance between a randomly chosen point in the region and the nearest supermarket. Write down \(\P(Y < y)\) and hence show that the probability density function of \(Y\) is \(\displaystyle 2\pi y k \e^{-\pi k y^2}\) for \(y\ge0\). Find \(\E(Y)\) and show that \(\var(Y) = \dfrac{4-\pi}{4\pi k}\).


Solution: A circle radius \(y\) has a number of supermarkets \(X\) where \(X \sim Po(k \pi y^2)\). \[ \mathbb{P}(X = 0) = e^{-k\pi y^2} \frac{1}{0!} = e^{-k\pi y^2} \] The probability \(\mathbb{P}(Y < y) = 1-\mathbb{P}(Y \geq y) = 1-e^{-k\pi y^2}\), and in particular \(f_Y(y) = 2k\pi y e^{-k\pi y^2}\) (by differentiating). \begin{align*} && \mathbb{E}(Y) &= \int_0^\infty yf_Y(y) \d y \\ &&&= \int_0^\infty 2\pi y^2 k e^{-\pi k y^2} \d y \\ \sigma^2 = \frac{1}{2k\pi}:&&&= \pi k \sqrt{2 \pi}\sigma \int_{-\infty}^\infty \frac{1}{\sqrt{2 \pi} \sigma }y^2 e^{-\frac12 \cdot 2\pi k y^2} \d y \\ &&&=\pi k \sqrt{2 \pi}\sigma \mathbb{E}\left (N(0, \sigma^2)^2 \right) \\ &&&= \pi k \sqrt{2 \pi}\sigma\sigma^2 \\ &&&= \pi k \sqrt{2 \pi} \frac{1}{(2k\pi)^{3/2}} \\ &&&= \frac{1}{2\sqrt{k}} \end{align*} \begin{align*} && \mathbb{E}(Y^2) &= \int_0^\infty y^2f_Y(y) \d y \\ &&&= \int_0^\infty 2\pi y^3 k e^{-\pi k y^2} \d y \\ &&&= \int_0^{\infty}y^2 2y \pi k e^{-\pi k y^2} \d y \\ \\ &&&= \left [-y^2 e^{-\pi k y^2}\right]_0^{\infty}+\int_0^\infty 2ye^{-\pi k y^2} \d y \\ &&&= \left [-\frac{1}{\pi k}e^{-\pi k y^2} \right]_0^{\infty} \\ &&&= \frac{1}{\pi k} \\ \Rightarrow && \textrm{Var}(Y) &= \mathbb{E}(Y^2) - \left [ \mathbb{E}(Y)\right]^2 \\ &&&= \frac{1}{\pi k} - \frac{1}{4k} \\ &&&= \frac{4 - \pi}{4\pi k} \end{align*}
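
A crude numerical check of \(\E(Y)\) and \(\var(Y)\) (not part of the original solution), assuming standard-library Python; the density constant \(k\), the truncation point and the step size of the quadrature are arbitrary choices.
\begin{verbatim}
import math

k = 3.0
f = lambda y: 2 * math.pi * k * y * math.exp(-math.pi * k * y**2)

# left Riemann sum on [0, 5]; the tail beyond 5 is negligible for this k
N, upper = 200_000, 5.0
h = upper / N
mean = sum(y * f(y) for y in (i * h for i in range(N))) * h
second = sum(y * y * f(y) for y in (i * h for i in range(N))) * h
print(mean, 1 / (2 * math.sqrt(k)))                         # E(Y)
print(second - mean**2, (4 - math.pi) / (4 * math.pi * k))  # Var(Y)
\end{verbatim}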

2012 Paper 3 Q13
D: 1700.0 B: 1484.0

  1. The random variable \(Z\) has a Normal distribution with mean \(0\) and variance \(1\). Show that the expectation of \(Z\) given that \(a < Z < b\) is \[ \frac{\exp(- \frac12 a^2) - \exp(- \frac12 b^2) } {\sqrt{2\pi\,} \,\big(\Phi(b) - \Phi(a)\big)}, \] where \(\Phi\) denotes the cumulative distribution function for \(Z\).
  2. The random variable \(X\) has a Normal distribution with mean \(\mu\) and variance \(\sigma^2\). Show that \[ \E(X \,\vert\, X>0) = \mu + \sigma \E(Z \,\vert\,Z > -\mu/\sigma). \] Hence, or otherwise, show that the expectation, \(m\), of \(\vert X\vert \) is given by \[ m= \mu \big(1 - 2 \Phi(- \mu / \sigma)\big) + \sigma \sqrt{2 / \pi}\; \exp(- \tfrac12 \mu^2 / \sigma^2) \,. \] Obtain an expression for the variance of \(\vert X \vert\) in terms of \(\mu \), \(\sigma \) and \(m\).


Solution:

  1. \(\,\) \begin{align*} && \mathbb{E}(Z| a < Z < b) &= \mathbb{E}(Z\mathbb{1}_{(a,b)}) /\mathbb{E}(\mathbb{1}_{(a,b)}) \\ &&&= \int_a^b z \phi(z) \d z \Big / (\Phi(b) - \Phi(a)) \\ &&&= \frac{\int_a^b \frac{1}{\sqrt{2 \pi}}z e^{-\frac12 z^2} \d z}{\Phi(b) - \Phi(a)} \\ &&&= \frac{\frac1{\sqrt{2\pi}} \left [-e^{-\frac12 z^2} \right]_a^b}{\Phi(b) - \Phi(a)} \\ &&&= \frac{\frac1{\sqrt{2\pi}} \left (e^{-\frac12 a^2}-e^{-\frac12 b^2} \right)}{\Phi(b) - \Phi(a)} \\ \end{align*}
  2. \(\,\) \begin{align*} && \mathbb{E}(X |X > 0) &= \mathbb{E}(\mu + \sigma Z | \mu + \sigma Z > 0) \\ &&&= \mathbb{E}(\mu + \sigma Z | Z > -\tfrac{\mu}{\sigma}) \\ &&&= \mathbb{E}(\mu| Z > -\tfrac{\mu}{\sigma})+ \sigma \mathbb{E}(Z | Z > -\tfrac{\mu}{\sigma})\\ &&&= \mu+ \sigma \mathbb{E}(Z | Z > -\tfrac{\mu}{\sigma})\\ \end{align*} Hence \begin{align*} &&\mathbb{E}(|X|) &= \mathbb{E}(X | X > 0)\mathbb{P}(X > 0) - \mathbb{E}(X | X < 0)\mathbb{P}(X < 0) \\ &&&=\left ( \mu+ \sigma \mathbb{E}(Z | Z > -\mu /\sigma)\right)(1-\Phi(-\mu/\sigma)) - \left ( \mu+ \sigma \mathbb{E}(Z | Z < -\mu /\sigma)\right)\Phi(-\mu/\sigma) \\ &&&= \mu(1 - 2\Phi(-\mu/\sigma)) + \sigma \frac{e^{-\frac12\mu^2/\sigma^2}}{\sqrt{2\pi}(1-\Phi(-\mu/\sigma))}(1-\Phi(-\mu/\sigma)) + \sigma \frac{e^{-\frac12\mu^2/\sigma^2}}{\sqrt{2 \pi} \Phi(-\mu/\sigma)} \Phi(-\mu/\sigma) \\ &&&= \mu(1 - 2\Phi(-\mu/\sigma)) + \sigma \sqrt{\frac{2}{\pi}} \exp(-\tfrac12 \mu^2/\sigma^2) \end{align*} Finally, \begin{align*} && \textrm{Var}(|X|) &= \mathbb{E}(|X|^2) - [\mathbb{E}(|X|)]^2 \\ &&&= \mu^2 + \sigma^2 - m^2 \end{align*}
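
A Monte Carlo check of the formula for \(m\) and of the variance expression (not part of the original solution). It assumes Python 3.8+ for statistics.NormalDist; the values of \(\mu\) and \(\sigma\) are arbitrary.
\begin{verbatim}
import math
import random
import statistics
from statistics import NormalDist

mu, sigma = 1.3, 2.0
m = (mu * (1 - 2 * NormalDist().cdf(-mu / sigma))
     + sigma * math.sqrt(2 / math.pi) * math.exp(-0.5 * mu**2 / sigma**2))

samples = [abs(random.gauss(mu, sigma)) for _ in range(400_000)]
print(statistics.mean(samples), m)                             # E|X|
print(statistics.pvariance(samples), mu**2 + sigma**2 - m**2)  # Var|X|
\end{verbatim}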

2011 Paper 3 Q12
D: 1700.0 B: 1516.0

The random variable \(N\) takes positive integer values and has pgf (probability generating function) \(\G(t)\). The random variables \(X_i\), where \(i=1\), \(2\), \(3\), \(\ldots,\) are independently and identically distributed, each with pgf \({H}(t)\). The random variables \(X_i\) are also independent of \(N\). The random variable \(Y\) is defined by \[ Y= \sum_{i=1}^N X_i \;. \] Given that the pgf of \(Y\) is \(\G(H(t))\), show that \[ \E(Y) = \E(N)\E(X_i) \text{ and } \var(Y) = \var(N)\big(\E(X_i)\big)^2 + \E(N) \var(X_i) \,.\] A fair coin is tossed until a head occurs. The total number of tosses is \(N\). The coin is then tossed a further \(N\) times and the total number of heads in these \(N\) tosses is \(Y\). Find in this particular case the pgf of \(Y\), \(\E(Y)\), \(\var(Y)\) and \(\P(Y=r)\).


Solution: Recall that for a random variable \(Z\) with pgf \(F(t)\) we have \(F(1) = 1\), \(\E[Z] = F'(1)\) and \(\E[Z^2] = F''(1) +F'(1)\) so \begin{align*} && \E[Y] &= G'(H(1))H'(1) \\ &&&= G'(1)H'(1) \\ &&&= \E[N]\E[X_i] \\ \\ && \E[Y^2] &= G''(H(1))(H'(1))^2+G'(H(1))H''(1) + G'(H(1))H'(1) \\ &&&= G''(1)(H'(1))^2+G'(1)H''(1) + G'(1)H'(1) \\ &&&= (\E[N^2]-\E[N])(\E[X_i])^2 + \E[N](\E[X_i^2]-\E[X_i]) + \E[N]\E[X_i] \\ &&&= (\E[N^2]-\E[N])(\E[X_i])^2 + \E[N]\E[X_i^2] \\ && \var[Y] &= (\E[N^2]-\E[N])(\E[X_i])^2 + \E[N]\E[X_i^2] - (\E[N])^2(\E[X_i])^2\\ &&&= (\var[N]+(\E[N])^2-\E[N])(\E[X_i])^2 + \E[N](\var[X_i]+\E[X_i]^2) - (\E[N])^2(\E[X_i])^2\\ &&&= \var[N](\E[X_i])^2 + \E[N]\var[X_i] \end{align*} Notice that \(N \sim Geo(\tfrac12)\), taking values \(1, 2, 3, \ldots\), and \(Y = \sum_{i=1}^N X_i\) where the \(X_i\) are Bernoulli\((\tfrac12)\) indicators of a head on each further toss. We have \(G(t) = \frac{\frac12 t}{1-\frac12t}\) and \(H(t) = \frac12+\frac12t\), so the pgf of \(Y\) is \[ G(H(t)) = \frac{\frac14(1+t)}{1 - \frac14(1+t)} = \frac{1+t}{3-t}. \] \begin{align*} && \E[X_i] &= \frac12\\ && \var[X_i] &= \frac14 \\ && \E[N] &= 2 \\ && \var[N] &= 2 \\ \\ && \E[Y] &= 2 \cdot \frac12 = 1 \\ && \var[Y] &= 2 \cdot \frac14 + 2 \cdot \frac14 = 1 \end{align*} Expanding the pgf as \(\dfrac{1+t}{3-t} = \tfrac13(1+t)\sum_{k \ge 0} \left(\tfrac{t}{3}\right)^k\) gives \(\mathbb{P}(Y=0) = \tfrac13\) and \(\mathbb{P}(Y=r) = \tfrac{1}{3^{r+1}} + \tfrac{1}{3^{r}} = \dfrac{4}{3^{r+1}}\) for \(r \ge 1\).
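
A simulation of the coin game (not part of the original solution) that checks \(\E(Y)\), \(\var(Y)\) and the probabilities read off from the pgf; it assumes standard-library Python, and the number of trials is arbitrary.
\begin{verbatim}
import random
from collections import Counter

trials = 300_000
counts = Counter()
for _ in range(trials):
    n = 1
    while random.random() < 0.5:           # keep tossing until the first head
        n += 1
    y = sum(random.random() < 0.5 for _ in range(n))   # heads in n further tosses
    counts[y] += 1

mean = sum(r * c for r, c in counts.items()) / trials
second = sum(r * r * c for r, c in counts.items()) / trials
print(mean, second - mean**2)              # both should be close to 1
for r in range(4):
    predicted = 1 / 3 if r == 0 else 4 / 3 ** (r + 1)
    print(r, counts[r] / trials, predicted)
\end{verbatim}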

2010 Paper 3 Q13
D: 1700.0 B: 1516.0

In this question, \({\rm Corr}(U,V)\) denotes the product moment correlation coefficient between the random variables \(U\) and \(V\), defined by \[ \mathrm{Corr}(U,V) \equiv \frac{\mathrm{Cov}(U,V)}{\sqrt{\var(U)\var(V)}}\,. \] The independent random variables \(Z_1\), \(Z_2\) and \(Z_3\) each have expectation 0 and variance 1. What is the value of \(\mathrm{Corr} (Z_1,Z_2)\)? Let \(Y_1 = Z_1\) and let \[ Y_2 = \rho_{12} Z_1 + (1 - {\rho_{12}^2})^{ \frac12} Z_2\,, \] where \(\rho_{12}\) is a given constant with \(-1<\rho_{12}<1\). Find \(\E(Y_2)\), \(\var(Y_2)\) and \(\mathrm{Corr}(Y_1, Y_2)\). Now let \(Y_3 = aZ_1 + bZ_2 + cZ_3\), where \(a\), \(b\) and \(c\) are real constants and \(c\ge0\). Given that \(\E(Y_3) = 0\), \(\var(Y_3) = 1\), \( \mathrm{Corr}(Y_1, Y_3) =\rho_{13} \) and \( \mathrm{Corr}(Y_2, Y_3)= \rho_{23}\), express \(a\), \(b\) and \(c\) in terms of \(\rho_{23}\), \(\rho_{13}\) and \(\rho_{12}\). Given constants \(\mu_i\) and \(\sigma_i\), for \(i=1\), \(2\) and \(3\), give expressions in terms of the \(Y_i\) for random variables \(X_i\) such that \(\E(X_i) = \mu_i\), \(\var(X_i) = \sigma_i^2\) and \(\mathrm{Corr}(X_i,X_j) = \rho_{ij}\).


Solution: \begin{align*} \mathrm{Corr} (Z_1,Z_2) &= \frac{\mathrm{Cov}(Z_1,Z_2)}{\sqrt{\var(Z_1)\var(Z_2)}} \\ &= \frac{\mathbb{E}(Z_1 Z_2)}{\sqrt{1 \cdot 1}} \\ &= \frac{\mathbb{E}(Z_1)\mathbb{E}(Z_2)}{\sqrt{1 \cdot 1}} \\ &= \frac{0}{1} \\ &= 0 \end{align*} \begin{align*} && \mathbb{E}(Y_2) &= \mathbb{E}(\rho_{12} Z_1 + (1 - {\rho_{12}^2})^{ \frac12} Z_2) \\ &&&= \mathbb{E}(\rho_{12} Z_1) + \mathbb{E}( (1 - {\rho_{12}^2})^{ \frac12} Z_2) \\ &&&= \rho_{12}\mathbb{E}( Z_1) + (1 - {\rho_{12}^2})^{ \frac12}\mathbb{E}( Z_2) \\ &&&= 0\\ \\ && \textrm{Var}(Y_2) &= \textrm{Var}(\rho_{12} Z_1 + (1 - {\rho_{12}^2})^{ \frac12} Z_2) \\ &&&= \textrm{Var}(\rho_{12} Z_1)+2\,\textrm{Cov}(\rho_{12} Z_1,(1 - {\rho_{12}^2})^{ \frac12} Z_2 ) + \textrm{Var}((1 - {\rho_{12}^2})^{ \frac12} Z_2) \\ &&&= \rho_{12}^2\textrm{Var}( Z_1)+2\rho_{12} (1 - {\rho_{12}^2})^{ \frac12} \textrm{Cov}(Z_1, Z_2 ) + (1 - {\rho_{12}^2})\textrm{Var}(Z_2) \\ &&&= \rho_{12}^2 + (1-\rho_{12}^2) = 1 \\ \\ && \textrm{Cov}(Y_1, Y_2) &= \mathbb{E}((Y_1-0)(Y_2-0)) \\ &&&= \mathbb{E}(Z_1 \cdot (\rho_{12} Z_1 + (1 - {\rho_{12}^2})^{ \frac12} Z_2)) \\ &&&= \rho_{12} \mathbb{E}(Z_1^2) + (1-\rho_{12}^2)^{\frac12}\mathbb{E}(Z_1 Z_2) \\ &&&= \rho_{12} \\ \Rightarrow && \textrm{Corr}(Y_1, Y_2) &= \frac{\textrm{Cov}(Y_1, Y_2)}{\sqrt{\textrm{Var}(Y_1)\textrm{Var}(Y_2)}} \\ &&&= \frac{\rho_{12}}{1 \cdot 1} = \rho_{12} \end{align*} Suppose \(Y_3 =aZ_1 +bZ_2+cZ_3\) with \(\mathbb{E}(Y_3) = 0\) (automatic, since each \(Z_i\) has mean \(0\)), \(\textrm{Var}(Y_3) = 1 = a^2+b^2+c^2\) and \(\textrm{Corr}(Y_1, Y_3) = \rho_{13}, \textrm{Corr}(Y_2, Y_3) = \rho_{23}\). Since each \(Y_i\) has variance \(1\), these correlations equal the corresponding covariances. \begin{align*} && \textrm{Corr}(Y_1,Y_3) &= \textrm{Cov}(Y_1, Y_3) \\ &&&= \textrm{Cov}(Z_1, aZ_1 +bZ_2+cZ_3) \\ &&&= a \\ \Rightarrow && a &= \rho_{13} \\ \\ && \textrm{Corr}(Y_2,Y_3) &= \textrm{Cov}(Y_2, Y_3) \\ &&&= \textrm{Cov}(\rho_{12}Z_1+(1-\rho_{12}^2)^\frac12Z_2, \rho_{13}Z_1 +bZ_2+cZ_3) \\ &&&= \rho_{12}\rho_{13}+(1-\rho_{12}^2)^\frac12b \\ \Rightarrow && \rho_{23} &= \rho_{12}\rho_{13}+(1-\rho_{12}^2)^\frac12b \\ \Rightarrow && b &= \frac{\rho_{23}-\rho_{12}\rho_{13}}{(1-\rho_{12}^2)^\frac12} \\ && c &= \sqrt{1-\rho_{13}^2-\frac{(\rho_{23}-\rho_{12}\rho_{13})^2}{(1-\rho_{12}^2)}} \end{align*} Finally, let \(X_i = \mu_i + \sigma_i Y_i\); then \(\E(X_i) = \mu_i\), \(\var(X_i) = \sigma_i^2\) and \(\mathrm{Corr}(X_i, X_j) = \mathrm{Corr}(Y_i, Y_j) = \rho_{ij}\).
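
A numerical check of the construction (not part of the original solution). It assumes Python 3.10+ for statistics.correlation; the target correlations \(\rho_{12}\), \(\rho_{13}\), \(\rho_{23}\) are arbitrary values for which \(c^2 \ge 0\).
\begin{verbatim}
import math
import random
import statistics

rho12, rho13, rho23 = 0.5, 0.3, 0.6
a = rho13
b = (rho23 - rho12 * rho13) / math.sqrt(1 - rho12**2)
c = math.sqrt(1 - a**2 - b**2)

y1, y2, y3 = [], [], []
for _ in range(300_000):
    z1, z2, z3 = (random.gauss(0, 1) for _ in range(3))
    y1.append(z1)
    y2.append(rho12 * z1 + math.sqrt(1 - rho12**2) * z2)
    y3.append(a * z1 + b * z2 + c * z3)

print(statistics.correlation(y1, y2), rho12)
print(statistics.correlation(y1, y3), rho13)
print(statistics.correlation(y2, y3), rho23)
\end{verbatim}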

2009 Paper 2 Q12
D: 1600.0 B: 1496.6

A continuous random variable \(X\) has probability density function given by \[ \f(x) = \begin{cases} 0 & \mbox{for } x<0 \\ k\e^{-2 x^2} & \mbox{for } 0\le x< \infty \;,\\ \end{cases} \] where \(k\) is a constant.

  1. Sketch the graph of \(\f(x)\).
  2. Find the value of \(k\).
  3. Determine \(\E(X)\) and \(\var(X)\).
  4. Use statistical tables to find, to three significant figures, the median value of \(X\).


Solution:

  1. [Sketch: the graph of \(\f\) is zero for \(x<0\) and, for \(x \ge 0\), is half of a Normal-shaped bell curve, starting at its maximum value \(\f(0)=k\) and decreasing towards \(0\) as \(x \to \infty\).]
  2. Let \(Y \sim N(0,\frac14)\); its density is proportional to \(e^{-2x^2}\), so \begin{align*} &&\int_0^\infty \frac{1}{\sqrt{2 \pi \cdot \frac14}} e^{-2x^2} \, dx &= \frac12\\ \Rightarrow && \int_0^\infty e^{-2x^2} \, dx &= \frac{\sqrt{\pi}}{2 \sqrt{2}} \\ \Rightarrow && k &= \boxed{\frac{2\sqrt{2}}{\sqrt{\pi}}} \end{align*}
  3. \begin{align*} \mathbb{E}[X] &= \int_0^\infty x f(x) \, dx \\ &= \frac{2\sqrt{2}}{\sqrt{\pi}}\int_0^\infty x e^{-2x^2}\, dx \\ &= \frac{2\sqrt{2}}{\sqrt{\pi}} \left [-\frac{1}{4}e^{-2x^2} \right]_0^\infty \\ &= \frac{1}{\sqrt{2\pi}} \\ \end{align*} In order to calculate \(\mathbb{E}(X^2)\) it is useful to consider the related quantity \(\mathbb{E}(Y^2)\): since \(f(x) = 2f_Y(x)\) for \(x \ge 0\) and \(x^2 f_Y(x)\) is an even function, the two are equal. Therefore \(\mathbb{E}(X^2) = \mathbb{E}(Y^2) = \mathrm{Var}(Y) = \frac{1}{4}\) (since \(\mathbb{E}(Y) = 0\)), and so \(\mathrm{Var}(X) = \mathbb{E}(X^2) - \mathbb{E}(X)^2 = \frac14 - \frac{1}{2\pi}\)
  4. \begin{align*} && \mathbb{P}(X < x) &= \frac12 \\ \Leftrightarrow && 2\mathbb{P}(0 \leq Y < x) &= \frac12 \\ \Leftrightarrow && 2\l \mathbb{P}(Y < x) - \frac12 \r &= \frac12 \\ \Leftrightarrow && \mathbb{P}(Y < x)&= \frac34 \\ \Leftrightarrow && \mathbb{P}(\frac{Y-0}{1/2} < \frac{x}{1/2})&= \frac34 \\ \Leftrightarrow && \mathbb{P}(Z < \frac{x}{1/2})&= \frac34 \\ \Leftrightarrow && \Phi(2x)&= \frac34 \\ \Leftrightarrow && 2x &= 0.6744895\cdots \\ \Leftrightarrow && x &= 0.3372\cdots = 0.337 \ (3 \text{ s.f.}) \\ \end{align*}
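
A numerical check of parts (ii)--(iv) (not part of the original solution), assuming Python 3.8+ for statistics.NormalDist; the truncation point and step size of the crude quadrature are arbitrary.
\begin{verbatim}
import math
from statistics import NormalDist

k = 2 * math.sqrt(2) / math.sqrt(math.pi)
N, upper = 100_000, 4.0                    # e^(-2x^2) is negligible beyond x = 4
h = upper / N
xs = [i * h for i in range(N)]
total = sum(k * math.exp(-2 * x * x) for x in xs) * h
mean = sum(x * k * math.exp(-2 * x * x) for x in xs) * h
second = sum(x * x * k * math.exp(-2 * x * x) for x in xs) * h
print(total)                               # ~1, so k normalises the density
print(mean, 1 / math.sqrt(2 * math.pi))    # E(X)
print(second - mean**2, 0.25 - 1 / (2 * math.pi))   # Var(X)
print(NormalDist(0, 0.5).inv_cdf(0.75))    # median: Phi(2x) = 3/4 gives x ~ 0.337
\end{verbatim}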

2009 Paper 3 Q12
D: 1700.0 B: 1516.0

  1. Albert tosses a fair coin \(k\) times, where \(k\) is a given positive integer. The number of heads he gets is \(X_1\). He then tosses the coin \(X_1\) times, getting \(X_2\) heads. He then tosses the coin \(X_2\) times, getting \(X_3\) heads. The random variables \(X_4\), \(X_5\), \(\ldots\) are defined similarly. Write down \(\E(X_1)\). By considering \(\E(X_2 \; \big\vert \; X_1 = x_1)\), or otherwise, show that \(\E(X_2) = \frac14 k\). Find \(\displaystyle \sum_{i=1}^\infty \E(X_i)\).
  2. Bertha has \(k\) fair coins. She tosses the first coin until she gets a tail. The number of heads she gets before the first tail is \(Y_1\). She then tosses the second coin until she gets a tail and the number of heads she gets with this coin before the first tail is \(Y_2\). The random variables \(Y_3, Y_4, \ldots\;\), \(Y_k\) are defined similarly, and \(Y= \sum\limits_{i=1}^k Y_i\,\). Obtain the probability generating function of \(Y\), and use it to find \(\E(Y)\), \(\var(Y)\) and \(\P(Y=r)\).


Solution:

  1. \(X_1 \sim B(k, \tfrac12)\), so \(\E[X_1] = \frac{k}{2}\). Note that \(X_2 \,|\, X_1 = x_1 \sim B(x_1, \tfrac12)\), so \(\E[X_2 \,|\, X_1 = x_1] = \frac{x_1}{2}\), i.e. \(\E[X_2 | X_1] = \frac12 X_1\). Therefore, by the tower law, \(\E[X_2] = \E[\E[X_2|X_1]] = \E[\frac12 X_1] = \frac14 k\). Repeating the argument, \(\E[X_n] = \frac1{2^n} k\), and so \begin{align*} && \sum_{i=1}^\infty \E[X_i] &= \sum_{i=1}^{\infty} \frac1{2^i} k \\ &&&= \frac{\frac12 k}{1-\frac12} = k \end{align*}
  2. Note that \(Y_1 \sim Geo(\tfrac12)-1\); that is, \(Y_1 = G - 1\) where \(G\) is the number of tosses up to and including the first tail. Its generating function is \(\E[t^{Y_1}] = \E[t^{G-1}] = \frac{\frac12 t}{1-(1-\frac12)t}\frac1{t} = \frac{\frac12}{1-\frac12t}\). Notice that \begin{align*} && \E \left [ t^Y \right] &= \E \left [ t^{\sum_{i=1}^kY_i} \right] \\ &&&= \prod_{i=1}^k \E[t^{Y_i}] \\ &&&= \frac{1}{(2-t)^k} \end{align*} Writing \(G_Y(t) = (2-t)^{-k}\), we have \(\E[Y] = G_Y'(1) = k(2-1)^{-(k+1)} = k\) and \(\E[Y^2] = (tG_Y'(t))'|_{t=1} = k(k+1)(2-1)^{-(k+2)}+k(2-1)^{-(k+1)} = k^2+2k\), so \(\var[Y] = k^2+2k - k^2 = 2k\). Finally, expanding \((2-t)^{-k} = 2^{-k}\left(1-\tfrac{t}{2}\right)^{-k}\) gives \(\mathbb{P}(Y=r) = \binom{k+r-1}{r} \frac{1}{2^{r+k}}\).
[Note: this second distribution is a negative binomial distribution]
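
A simulation check of part (ii) (not part of the original solution), assuming Python 3.8+ for math.comb; the number of coins \(k\), the probability checked and the number of trials are arbitrary.
\begin{verbatim}
import random
import statistics
from math import comb

k, trials = 5, 300_000

def Y():
    total = 0
    for _ in range(k):
        while random.random() < 0.5:       # heads before the first tail on this coin
            total += 1
    return total

samples = [Y() for _ in range(trials)]
print(statistics.mean(samples), k)                   # E(Y) = k
print(statistics.pvariance(samples), 2 * k)          # Var(Y) = 2k
r = 3
print(sum(1 for s in samples if s == r) / trials, comb(k + r - 1, r) / 2 ** (k + r))
\end{verbatim}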