Problems


11 problems found

2023 Paper 3 Q11

Show that \[\sum_{k=1}^{\infty} \frac{k+1}{k!}\, x^k = (x+1)\mathrm{e}^x - 1\,.\] In the remainder of this question, \(n\) is a fixed positive integer.

  1. Random variable \(Y\) has a Poisson distribution with mean \(n\). One observation of \(Y\) is taken. Random variable \(D\) is defined as follows. If the observed value of \(Y\) is zero then \(D = 0\). If the observed value of \(Y\) is \(k\), where \(k \geqslant 1\), then a fair \(k\)-sided die (with sides numbered \(1\) to \(k\)) is rolled once and \(D\) is the number shown on the die.
    1. Write down \(\mathrm{P}(D = 0)\).
    2. Show, from the definition of the expectation of a random variable, that \[\mathrm{E}(D) = \sum_{d=1}^{\infty} \left[ d \sum_{k=d}^{\infty} \left( \frac{1}{k} \cdot \frac{n^k}{k!}\, \mathrm{e}^{-n} \right) \right].\] Show further that \[\mathrm{E}(D) = \sum_{k=1}^{\infty} \left( \frac{1}{k} \cdot \frac{n^k}{k!}\, \mathrm{e}^{-n} \sum_{d=1}^{k} d \right).\]
    3. Show that \(\mathrm{E}(D) = \frac{1}{2}(n + 1 - \mathrm{e}^{-n})\).
  2. Random variables \(X_1, X_2, \ldots, X_n\) all have Poisson distributions. For each \(k \in \{1, 2, \ldots, n\}\), the mean of \(X_k\) is \(k\). A fair \(n\)-sided die, with sides numbered \(1\) to \(n\), is rolled. When \(k\) is the number shown, one observation of \(X_k\) is recorded. Let \(Z\) be the number recorded.
    1. Find \(\mathrm{P}(Z = 0)\).
    2. Show that \(\mathrm{E}(Z) > \mathrm{E}(D)\).

2021 Paper 3 Q12

  1. In a game, each member of a team of \(n\) players rolls a fair six-sided die. The total score of the team is the number of pairs of players rolling the same number. For example, if \(7\) players roll \(3, 3, 3, 3, 6, 6, 2\) the total score is \(7\), as six different pairs of players both score \(3\) and one pair of players both score \(6\). Let \(X_{ij}\), for \(1 \leqslant i < j \leqslant n\), be the random variable that takes the value \(1\) if players \(i\) and \(j\) roll the same number and the value \(0\) otherwise. Show that \(X_{12}\) is independent of \(X_{23}\). Hence find the mean and variance of the team's total score.
  2. Show that, if \(Y_i\), for \(1 \leqslant i \leqslant m\), are random variables with mean zero, then \[ \mathrm{Var}(Y_1 + Y_2 + \cdots + Y_m) = \sum_{i=1}^{m} \mathrm{E}(Y_i^2) + 2\sum_{i=1}^{m-1}\sum_{j=i+1}^{m} \mathrm{E}(Y_i Y_j). \]
  3. In a different game, each member of a team of \(n\) players rolls a fair six-sided die. The total score of the team is the number of pairs of players rolling the same even number minus the number of pairs of players rolling the same odd number. For example, if \(7\) players roll \(3, 3, 3, 3, 6, 6, 2\) the total score is \(-5\). Let \(Z_{ij}\), for \(1 \leqslant i < j \leqslant n\), be the random variable that takes the value \(1\) if players \(i\) and \(j\) roll the same even number, the value \(-1\) if players \(i\) and \(j\) roll the same odd number and the value \(0\) otherwise. Show that \(Z_{12}\) is not independent of \(Z_{23}\). Find the mean of the team's total score and show that the variance of the team's total score is \(\dfrac{1}{36}n(n^2 - 1)\).


Solution:

  1. First note that \(\mathbb{P}(X_{ij} = 1) = \frac16\): whatever player \(i\) rolls, player \(j\) matches it with probability \(\frac16\). \begin{align*} && \mathbb{P}(X_{12} = 1, X_{23} = 1) &= \mathbb{P}(1, 2\text{ and }3\text{ all roll the same})\\ &&&= \frac{6}{6^3}= \frac1{6^2} \\ &&&= \mathbb{P}(X_{12} = 1)\mathbb{P}(X_{23} = 1) \\ && \mathbb{P}(X_{12} = 1, X_{23} = 0) &= \mathbb{P}(1, 2\text{ roll the same and }3\text{ rolls different}) \\ &&&= \frac{6 \cdot 1 \cdot 5}{6^3} = \frac{5}{6^2} \\ &&&= \mathbb{P}(X_{12} = 1)\mathbb{P}(X_{23} = 0) \\ && \mathbb{P}(X_{12} = 0, X_{23} = 0) &= \mathbb{P}(2\text{ rolls different to }1\text{ and }3\text{ rolls different to }2)\\ &&&= \frac{6 \cdot 5 \cdot 5}{6^3}= \frac{5^2}{6^2} \\ &&&= \mathbb{P}(X_{12} = 0)\mathbb{P}(X_{23} = 0) \end{align*} Therefore they are independent (the remaining case \(\mathbb{P}(X_{12} = 0, X_{23} = 1)\) follows from the second by symmetry). Note that the score is \(S = \sum_{i < j} X_{ij}\), a sum of \(\binom{n}{2}\) terms, so \begin{align*} && \E[S] &= \E \left [ \sum_{i < j} X_{ij} \right] \\ &&&= \sum_{i < j} \E \left [ X_{ij} \right] \\ &&&= \sum_{i < j} \frac16 \\ &&&= \binom{n}{2} \frac16 = \frac{n(n-1)}{12} \\ \\ && \var[S] &= \var \left [ \sum_{i < j} X_{ij} \right] \\ &&&= \sum_{i < j} \var \left [X_{ij} \right] \tag{pairwise ind.} \\ &&&= \binom{n}{2} \frac{5}{36} = \frac{5n(n-1)}{72} \end{align*}
  2. Since each \(Y_i\) has mean zero, \(\var[Y_1 + \cdots + Y_m] = \E\left[(Y_1 + \cdots + Y_m)^2\right]\), and expanding the square gives \begin{align*} \E\left[\left(\sum_{i=1}^m Y_i\right)^2\right] &= \E\left[\sum_{i=1}^m Y_i^2 + 2\sum_{i=1}^{m-1}\sum_{j=i+1}^m Y_iY_j\right] \\ &= \sum_{i=1}^{m} \E(Y_i^2) + 2\sum_{i=1}^{m-1}\sum_{j=i+1}^{m} \E(Y_i Y_j) \end{align*} by linearity of expectation.
  3. Note that \(\mathbb{P}(Z_{ij} = 1)=\mathbb{P}(Z_{ij} = -1) = \frac{3}{6^2} = \frac{1}{12}\), but \(\mathbb{P}(Z_{12} = 1, Z_{23} = -1) = 0 \neq \mathbb{P}(Z_{12} = 1)\mathbb{P}(Z_{23} = -1)\), so \(Z_{12}\) is not independent of \(Z_{23}\). Notice that \(Z_{12}Z_{23}\) is either \(1\) or \(0\) (since player \(2\) can't roll both an odd and an even number), and \(\mathbb{P}(Z_{12}Z_{23} = 1) = \mathbb{P}(1, 2\text{ and }3\text{ all roll the same}) = \frac{1}{36}\). Also \(Z_{ij}\) and \(Z_{kl}\) are independent when \(i, j, k, l\) are all distinct, so \(\E[Z_{ij}Z_{kl}] = 0\) in that case. With \(T = \sum_{i<j} Z_{ij}\): \begin{align*} && \E[T] &= \sum_{i < j}\E \left [ Z_{ij} \right] = 0 \\ \\ && \E[T^2] &= \E \left [ \left ( \sum_{i < j} Z_{ij} \right)^2 \right] \\ &&&= \sum_{i < j} \E \left [ Z_{ij}^2 \right] + 2 \sum_{\substack{\text{pairs sharing}\\ \text{one index}}} \E \left [ Z_{ij}Z_{jk} \right] + 2 \sum_{\substack{\text{disjoint}\\ \text{pairs}}} \E \left [ Z_{ij}Z_{kl} \right] \\ &&&= \binom{n}{2} \frac{1}{6} + 2 \cdot \frac{n(n-1)(n-2)}{2} \cdot \frac{1}{36} + 0 \\ &&&= \frac{n(n-1)}{12} + \frac{n(n-1)(n-2)}{36} \\ &&&= \frac{n(n-1)[3 + (n-2)]}{36} \\ &&&= \frac{n(n^2-1)}{36} \end{align*} Since \(\E[T] = 0\), \(\var[T] = \E[T^2] = \frac{1}{36}n(n^2-1)\).
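As a sanity check (an addition, not part of the required solution), the moments of both games' scores can be verified by brute-force enumeration; Python and the team size \(n = 4\) are assumptions made purely to keep the search small.

```python
# Enumerate all 6^n dice outcomes for a small team and compute the exact
# mean and variance of both total scores: S (pairs matching) and T
# (pairs matching on an even number minus pairs matching on an odd number).
from itertools import combinations, product
from fractions import Fraction

n = 4                      # small team size, so 6^n stays tiny
count = 6 ** n
s_tot = s_sq = t_tot = t_sq = Fraction(0)
for rolls in product(range(1, 7), repeat=n):
    pairs = [(i, j) for i, j in combinations(range(n), 2)
             if rolls[i] == rolls[j]]
    s = len(pairs)
    t = sum(1 if rolls[i] % 2 == 0 else -1 for i, j in pairs)
    s_tot += s; s_sq += s * s
    t_tot += t; t_sq += t * t
mean_s, var_s = s_tot / count, s_sq / count - (s_tot / count) ** 2
mean_t, var_t = t_tot / count, t_sq / count - (t_tot / count) ** 2
print(mean_s, var_s, mean_t, var_t)
```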

2019 Paper 2 Q11

  1. The three integers \(n_1\), \(n_2\) and \(n_3\) satisfy \(0 < n_1 < n_2 < n_3\) and \(n_1 + n_2 > n_3\). Find the number of ways of choosing the pair of numbers \(n_1\) and \(n_2\) in the cases \(n_3 = 9\) and \(n_3 = 10\). Given that \(n_3 = 2n + 1\), where \(n\) is a positive integer, write down an expression (which you need not prove is correct) for the number of ways of choosing the pair of numbers \(n_1\) and \(n_2\). Simplify your expression. Write down and simplify the corresponding expression when \(n_3 = 2n\), where \(n\) is a positive integer.
  2. You have \(N\) rods, of lengths \(1, 2, 3, \ldots, N\) (one rod of each length). You take the rod of length \(N\), and choose two more rods at random from the remainder, each choice of two being equally likely. Show that, in the case \(N = 2n + 1\) where \(n\) is a positive integer, the probability that these three rods can form a triangle (of non-zero area) is $$\frac{n - 1}{2n - 1}.$$ Find the corresponding probability in the case \(N = 2n\), where \(n\) is a positive integer.
  3. You have \(2M + 1\) rods, of lengths \(1, 2, 3, \ldots, 2M + 1\) (one rod of each length), where \(M\) is a positive integer. You choose three at random, each choice of three being equally likely. Show that the probability that the rods can form a triangle (of non-zero area) is $$\frac{(4M + 1)(M - 1)}{2(2M + 1)(2M - 1)}.$$ Note: \(\sum_{k=1}^{K} k^2 = \frac{1}{6}K(K + 1)(2K + 1)\).


Solution:

  1. If \(n_3 = 9\) and we require \(0 < n_1 < n_2 < n_3\) with \(n_1 + n_2 > n_3\), we can count the choices of \(n_1\) for each \(n_2\). \begin{array}{cc|c} n_2 & n_1 \text{ range} & \text{count} \\ \hline 6 & 4-5 & 2 \\ 7 & 3-6 & 4 \\ 8 & 2-7 & 6 \\ \hline & & 12 \end{array} When \(n_3 = 10\): \begin{array}{cc|c} n_2 & n_1 \text{ range} & \text{count} \\ \hline 6 & 5 & 1 \\ 7 & 4-6 & 3 \\ 8 & 3-7 & 5 \\ 9 & 2-8 & 7 \\ \hline & & 16 \end{array} When \(n_3 = 2n+1\) we can have \(2 + 4 + \cdots + (2n-2) = n(n-1)\) pairs. When \(n_3 = 2n\) we can have \(1 + 3 + \cdots + (2n-3) = (n-1)^2\) pairs.
  2. For the 3 rods to form a triangle, it is necessary and sufficient that the lengths of the two shorter rods sum to more than \(N\) (the rod of length \(N\) is always the longest). When \(N = 2n+1\) there are \(n(n-1)\) ways this can happen, out of \(\binom{2n}{2}\) ways to choose the two other rods, ie \begin{align*} && P &= \frac{n(n-1)}{\frac{2n(2n-1)}{2}} \\ &&&= \frac{n-1}{2n-1} \end{align*} When \(N = 2n\) there are \((n-1)^2\) ways this can happen, out of \(\binom{2n-1}{2}\) ways, ie \begin{align*} && P &= \frac{(n-1)^2}{\frac{(2n-1)(2n-2)}{2}} \\ &&&= \frac{n-1}{2n-1} \end{align*}
  3. The number of ways this can happen is: \begin{align*} C &= \sum_{k=3}^{2M+1} \# \{ \text{triangles where }k\text{ is largest} \} \\ &= \sum_{k=1}^{M} \# \{ \text{triangles where }2k+1\text{ is largest} \} +\sum_{k=1}^{M} \# \{ \text{triangles where }2k\text{ is largest} \}\\ &= \sum_{k=1}^{M} k(k-1)+\sum_{k=1}^{M} (k-1)^2 \tag{by part 1} \\ &= \sum_{k=1}^{M} (2k^2-3k+1)\\ &= \frac13M(M+1)(2M+1) - \frac32M(M+1) + M \\ &= \frac16 M(4M+1)(M-1) \end{align*} Therefore the probability is \begin{align*} && P &= \frac{M(4M+1)(M-1)}{6 \binom{2M+1}{3}} \\ &&&= \frac{M(4M+1)(M-1)}{(2M+1)2M(2M-1)} \\ &&&= \frac{(4M+1)(M-1)}{2(2M+1)(2M-1)} \end{align*}
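A quick exhaustive check of the final probability (illustrative only; Python and the value \(M = 6\) are arbitrary assumptions):

```python
# Count, over all C(2M+1, 3) equally likely triples of rods, those that
# form a non-degenerate triangle, and compare with the closed form.
from itertools import combinations
from fractions import Fraction

M = 6
rods = range(1, 2 * M + 2)                     # lengths 1 .. 2M+1
triples = list(combinations(rods, 3))          # a < b < c in each triple
good = sum(1 for a, b, c in triples if a + b > c)   # triangle iff a + b > c
prob = Fraction(good, len(triples))
print(prob)
```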

2017 Paper 3 Q12

The discrete random variables \(X\) and \(Y\) can each take the values \(1\), \(\ldots\,\), \(n\) (where \(n\ge2\)). Their joint probability distribution is given by \[ \P(X=x, \ Y=y) = k(x+y) \,, \] where \(k\) is a constant.

  1. Show that \[ \P(X=x) = \dfrac{n+1+2x}{2n(n+1)}\,. \] Hence determine whether \(X\) and \(Y\) are independent.
  2. Show that the covariance of \(X\) and \(Y\) is negative.


Solution:

  1. \(\,\) \begin{align*} && \mathbb{P}(X = x) &= \sum_{y=1}^n \mathbb{P}(X=x,Y=y) \\ &&&= \sum_{y=1}^n k(x+y) \\ &&&= nkx + k\frac{n(n+1)}2 \\ \\ && 1 &= \sum_{x=1}^n \mathbb{P}(X=x) \\ &&&= nk\frac{n(n+1)}{2} + kn\frac{n(n+1)}2 \\ &&&= kn^2(n+1) \\ \Rightarrow && k &= \frac{1}{n^2(n+1)} \\ \Rightarrow && \mathbb{P}(X = x) &= \frac{nx}{n^2(n+1)} + \frac{n(n+1)}{2n^2(n+1)} \\ &&&= \frac{n+1+2x}{2n(n+1)} \\ \\ && \mathbb{P}(X=x)\mathbb{P}(Y=y) &= \frac{(n+1)^2+2(n+1)(x+y)+4xy}{4n^2(n+1)^2} \\ &&&\neq \frac{x+y}{n^2(n+1)} \end{align*} (for example, at \(x = y = 1\) the left-hand side is \(\frac{(n+3)^2}{4n^2(n+1)^2}\) and the right-hand side is \(\frac{2}{n^2(n+1)}\), and \((n+3)^2 \neq 8(n+1)\) for \(n \geqslant 2\)). Therefore \(X\) and \(Y\) are not independent.
  2. \(\,\) \begin{align*} && \E[X] &= \sum_{x=1}^n x \mathbb{P}(X=x) \\ &&&= \sum_{x=1}^n x \frac{n+1+2x}{2n(n+1)} \\ &&&= \frac{1}{2n(n+1)} \left ( (n+1) \sum x + 2\sum x^2\right)\\ &&&= \frac{1}{2n(n+1)} \left ( \frac{n(n+1)^2}{2} + \frac{n(n+1)(2n+1)}{3} \right) \\ &&&= \frac{1}{2} \left ( \frac{n+1}{2} + \frac{2n+1}{3} \right)\\ &&&= \frac{7n+5}{12} \\ \\ && \textrm{Cov}(X,Y) &= \mathbb{E}\left[XY\right] - \E[X] \E[Y] \\ &&&= \sum_{x=1}^n \sum_{y=1}^n xy \frac{x+y}{n^2(n+1)} - \E[X]^2 \tag{\(\E[Y]=\E[X]\) by symmetry} \\ &&&= \frac{1}{n^2(n+1)} \sum_x \sum_y (x^2 y+xy^2) - \E[X]^2 \\ &&&= \frac{2}{n^2(n+1)} \left (\sum x \right )\left (\sum x^2\right ) - \E[X]^2 \\ &&&= \frac{2}{n^2(n+1)} \cdot \frac{n(n+1)}{2} \cdot \frac{n(n+1)(2n+1)}{6} - \left ( \frac{7n+5}{12}\right)^2 \\ &&&= \frac{(n+1)(2n+1)}{6} - \frac{(7n+5)^2}{144} \\ &&&= \frac1{144} \left (24(2n^2+3n+1) - (49n^2+70n+25) \right)\\ &&&= -\frac{(n-1)^2}{144} \\ &&& < 0 \end{align*} since \(n \geqslant 2\).
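The covariance can be confirmed by direct enumeration of the joint distribution (a sketch; Python and \(n = 5\) are illustrative assumptions):

```python
# Build the joint pmf P(X=x, Y=y) = k(x+y), check it sums to 1, and
# compute E[X] and Cov(X, Y) exactly with rational arithmetic.
from fractions import Fraction

n = 5
k = Fraction(1, n**2 * (n + 1))
pmf = {(x, y): k * (x + y)
       for x in range(1, n + 1) for y in range(1, n + 1)}
ex = sum(x * p for (x, _), p in pmf.items())
exy = sum(x * y * p for (x, y), p in pmf.items())
cov = exy - ex * ex        # E[Y] = E[X] by symmetry of the pmf
print(ex, cov)
```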

2008 Paper 1 Q12

In this question, you may use without proof the results: \[ \sum_{r=1}^n r = \tfrac12 n(n+1) \qquad\text{and}\qquad \sum_{r=1}^n r^2 = \tfrac1 6 n(n+1)(2n+1)\,. \] The independent random variables \(X_1\) and \(X_2\) each take values \(1\), \(2\), \(\ldots\), \(N\), each value being equally likely. The random variable \(X\) is defined by \[ X= \begin{cases} X_1 & \text { if } X_1\ge X_2\\ X_2 & \text { if } X_2\ge X_1\;. \end{cases} \]

  1. Show that \(\P(X=r) = \dfrac{2r-1}{N^2}\,\) for \(r=1\), \(2\), \(\ldots\), \(N\).
  2. Find an expression for the expectation, \(\mu\), of \(X\) and show that \(\mu=67.165\) in the case \(N=100\).
  3. The median, \(m\), of \(X\) is defined to be the integer such that \(\P(X\ge m) \ge \frac 12\) and \(\P(X\le m)\ge \frac12\). Find an expression for \(m\) in terms of \(N\) and give an explicit value for \(m\) in the case \(N=100\).
  4. Show that when \(N\) is very large, \[ \frac \mu m \approx \frac {2\sqrt2}3\,. \]


Solution: \begin{align*} \P(X = r) &= \P(X_1 = r, X_2 \leq r) + \P(X_2 = r, X_1 < r) \\ &= \P(X_1 = r) \P(X_2 \leq r) + \P(X_2 = r)\P( X_1 < r) \\ &= \frac{1}{N} \frac{r}{N} + \frac{1}{N} \frac{r-1}{N} \\ &= \frac{2r-1}{N^2} \end{align*} \begin{align*} \E(X) &= \sum_{r=1}^N r \P(X = r) \\ &= \sum_{r=1}^N \frac{2r^2 - r}{N^2} \\ &= \frac{1}{N^2} \l \frac{N(N+1)(2N+1)}{3} - \frac{N(N+1)}{2} \r \\ &= \frac{N+1}{N} \l \frac{4N-1}{6} \r \end{align*} When \(N = 100\), this is equal to \(\frac{101 \cdot 399}{6 \cdot 100} = \frac{101 \cdot 133}{200} = 67.165\). For the median, \[ \P(X \leq m) = \sum_{r=1}^m \frac{2r-1}{N^2} = \frac{1}{N^2} \l m(m+1) - m \r = \frac{m^2}{N^2}, \] so \(\P(X \leq m) \geq \frac12\) gives \(m^2 \geq \frac{N^2}{2}\), ie \(m \geq \frac{N}{\sqrt2}\), while \(\P(X \geq m) = 1 - \frac{(m-1)^2}{N^2} \geq \frac12\) gives \(m \leq \frac{N}{\sqrt2} + 1\). The only integer in this range is \[ m = \left \lceil \frac{N}{\sqrt{2}} \right \rceil. \] When \(N = 100\), \(\frac{100}{\sqrt{2}} = 50\sqrt{2}\), and \(1.4 < \sqrt{2} < 1.42\) gives \(70 < 50\sqrt{2} < 71\), therefore \(m = \left \lceil \frac{100}{\sqrt{2}} \right \rceil = 71\). Finally, \begin{align*} \frac{\mu}{m} = \frac{\frac{(N+1)(4N-1)}{6N}}{ \left \lceil\frac{N}{\sqrt{2}} \right \rceil} &\approx \frac{\sqrt{2}}{3}\l \frac{4N^2 +3N - 1}{2N^2} \r \tag{the ceiling is irrelevant for large \(N\)}\\ &= \frac{\sqrt{2}}{3}\l 2 + \frac{3}{2N} - \frac{1}{2N^2} \r \\ &\to \frac{2\sqrt{2}}{3} \quad \text{as } N \to \infty \end{align*}
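Both the mean and the median for \(N = 100\) can be checked directly (an illustrative Python sketch, not part of the solution):

```python
# Exact pmf P(X = r) = (2r-1)/N^2; recover mu = 67.165 and the median 71.
import math
from fractions import Fraction

N = 100
pmf = {r: Fraction(2 * r - 1, N * N) for r in range(1, N + 1)}
mu = sum(r * p for r, p in pmf.items())

cum = Fraction(0)
median = None
for r in range(1, N + 1):
    cum += pmf[r]
    if cum >= Fraction(1, 2):      # first r with P(X <= r) >= 1/2
        median = r                 # here P(X >= r) >= 1/2 holds too
        break
print(float(mu), median)
```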

2002 Paper 3 Q14

Prove that, for any two discrete random variables \(X\) and \(Y\), \[ \mathrm{Var} \left(X + Y \right) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2 \, \mathrm{Cov}(X,Y), \] where \(\mathrm{Var}(X)\) is the variance of \(X\) and \(\mathrm{Cov}(X,Y)\) is the covariance of \(X\) and \(Y\). When a Grandmaster plays a sequence of \(m\) games of chess, she is, independently, equally likely to win, lose or draw each game. If the values of the random variables \(W\), \(L\) and \(D\) are the numbers of her wins, losses and draws respectively, justify briefly the following claims:

  1. \(W + L + D\) has variance \(0\,\);
  2. \(W + L\) has a binomial distribution.
Find the value of \(\displaystyle {\mathrm{Cov}(W,L) \over \sqrt{\mathrm{Var}(W) \mathrm{Var}(L)}}\;\).


Solution: \begin{align*} && \var[X+Y] &= \E\left [(X+Y-\E[X+Y])^2 \right] \\ &&&= \E \left [ (X - \E[X] + Y - \E[Y])^2 \right] \\ &&&= \E \left [(X - \E[X])^2 + (Y-\E[Y])^2 + 2(X-\E[X])(Y-\E[Y]) \right] \\ &&&= \E \left [(X - \E[X])^2 \right]+\E \left [(Y-\E[Y])^2 \right]+\E \left [2(X-\E[X])(Y-\E[Y]) \right] \\ &&&= \var[X] + \var[Y] + 2 \mathrm{Cov}(X,Y) \end{align*}

  1. \(W+L+D = m\) is constant (every game is a win, loss or draw), and a constant random variable has variance \(0\).
  2. The probability of a decisive game is \(\frac23\) and \(W+L\) is the number of decisive games. Each game is independent so this meets the criteria for a binomial distribution.
Notice \(W+L \sim B(m, \tfrac23)\) and \(W, L, D \sim B(m, \tfrac13)\); in particular \(\var[W+L] = m \cdot \tfrac23 \cdot \tfrac13 = \tfrac29m\) and \(\var[W] = \var[L] = \var[D] = m \cdot \tfrac13 \cdot \tfrac23 = \tfrac29m\) \begin{align*} && \var[W+L] &= \var[W] + \var[L] + 2\mathrm{Cov}(W,L) \\ \Rightarrow && \mathrm{Cov}(W,L) &= -\tfrac19m \\ \Rightarrow && \frac{\mathrm{Cov}(W,L) }{\sqrt{\var[W]\var[L]}} &= -\frac12 \end{align*}
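An exact enumeration over all \(3^m\) equally likely game sequences reproduces the correlation \(-\frac12\) (a sketch; Python and \(m = 5\) are arbitrary assumptions):

```python
# Enumerate all win/lose/draw sequences of m games and compute
# Cov(W, L) and the variances exactly with rational arithmetic.
from itertools import product
from fractions import Fraction

m = 5
p = Fraction(1, 3 ** m)            # every sequence is equally likely
ew = eww = el = ell = ewl = Fraction(0)
for seq in product("WLD", repeat=m):
    w, l = seq.count("W"), seq.count("L")
    ew += w * p;  eww += w * w * p
    el += l * p;  ell += l * l * p
    ewl += w * l * p
cov = ewl - ew * el
var_w, var_l = eww - ew ** 2, ell - el ** 2
print(cov, var_w, var_l)
```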

2000 Paper 3 Q14

The random variable \(X\) takes only the values \(x_1\) and \(x_2\) (where \( x_1 \not= x_2 \)), and the random variable \(Y\) takes only the values \(y_1\) and \(y_2\) (where \(y_1 \not= y_2\)). Their joint distribution is given by $$ \P ( X = x_1 , Y = y_1 ) = a \ ; \ \ \P ( X = x_1 , Y = y_2 ) = q - a \ ; \ \ \P ( X = x_2 , Y = y_1 ) = p - a \ . $$ Show that if \(\E(X Y) = \E(X)\E(Y)\) then $$ (a - p q ) ( x_1 - x_2 ) ( y_1 - y_2 ) = 0 . $$ Hence show that two random variables each taking only two distinct values are independent if \(\E(X Y) = \E(X) \E(Y)\). Give a joint distribution for two random variables \(A\) and \(B\), each taking the three values \(- 1\), \(0\) and \(1\) with probability \({1 \over 3}\), which have \(\E(A B) = \E( A)\E (B)\), but which are not independent.


Solution: \begin{align*} \mathbb{P}(X = x_1) &= a + q - a = q \\ \mathbb{P}(X = x_2) &= 1 - q \\ \mathbb{P}(Y = y_1) & = a + p - a = p \\ \mathbb{P}(Y = y_2) & = 1 - p \end{align*} (and \(\mathbb{P}(X = x_2, Y = y_2) = 1 + a - p - q\), since the four joint probabilities sum to \(1\)). \begin{align*} \mathbb{E}(X)\mathbb{E}(Y) &= \l qx_1 + (1-q)x_2 \r \l p y_1 + (1-p)y_2\r \\ &= qpx_1y_1 + q(1-p)x_1y_2 + (1-q)px_2y_1 + (1-q)(1-p)x_2y_2 \\ \mathbb{E}(XY) &= ax_1y_1 + (q-a)x_1y_2 + (p-a)x_2y_1 + (1 + a - p - q)x_2y_2 \end{align*} Subtracting and collecting coefficients: the coefficient of \(x_1y_1\) is \(a - pq\); of \(x_1y_2\) it is \((q-a) - q(1-p) = -(a-pq)\); of \(x_2y_1\) it is \((p-a) - (1-q)p = -(a-pq)\); and of \(x_2y_2\) it is \((1+a-p-q) - (1-q)(1-p) = a - pq\). Therefore \[ \mathbb{E}(XY) - \mathbb{E}(X)\mathbb{E}(Y) = (a-pq)(x_1y_1 - x_1y_2 - x_2y_1 + x_2y_2) = (a-pq)(x_1-x_2)(y_1-y_2), \] so \(\E(XY) = \E(X)\E(Y)\) implies \((a - pq)(x_1 - x_2)(y_1 - y_2) = 0\). For any two random variables each taking only two distinct values, we can define \(a, q, p\) as above. Since \(x_1 \neq x_2\) and \(y_1 \neq y_2\), \(\E(XY) = \E(X)\E(Y) \Rightarrow a = pq\). But if \(a = pq\), then \(\mathbb{P}(X = x_1, Y = y_1) = \mathbb{P}(X = x_1)\mathbb{P}(Y = y_1)\), and the other three relations follow similarly (e.g. \(\mathbb{P}(X = x_1, Y = y_2) = q - pq = q(1-p)\)), so \(X\) and \(Y\) are independent. For the final part, consider \begin{align*} \mathbb{P}(A = -1, B = 1) &= \frac{1}{6} \\ \mathbb{P}(A = -1, B = -1) &= \frac{1}{6} \\ \mathbb{P}(A = 0, B = 0) &= \frac{1}{3} \\ \mathbb{P}(A = 1, B = 1) &= \frac{1}{6} \\ \mathbb{P}(A = 1, B = -1) &= \frac{1}{6} \end{align*} Each of \(A\) and \(B\) takes the values \(-1, 0, 1\) with probability \(\frac13\), and \(\E(AB) = -\frac16 + \frac16 + \frac16 - \frac16 = 0 = \E(A)\E(B)\), but \(\mathbb{P}(A = 0, B = 0) = \frac13 \neq \frac19 = \mathbb{P}(A = 0)\mathbb{P}(B = 0)\), so \(A\) and \(B\) are not independent.
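A distribution of the required type (each variable uniform on \(\{-1, 0, 1\}\), uncorrelated but dependent) can be verified mechanically; the joint table in this Python sketch is one such example:

```python
# Verify: A and B each uniform on {-1, 0, 1}, E(AB) = E(A)E(B) = 0, yet
# the pair is not independent since P(A=0, B=0) != P(A=0)P(B=0).
from fractions import Fraction

sixth, third = Fraction(1, 6), Fraction(1, 3)
joint = {(-1, 1): sixth, (-1, -1): sixth, (0, 0): third,
         (1, 1): sixth, (1, -1): sixth}          # zero elsewhere

pa = {v: sum(p for (x, _), p in joint.items() if x == v) for v in (-1, 0, 1)}
pb = {v: sum(p for (_, y), p in joint.items() if y == v) for v in (-1, 0, 1)}
ea = sum(a * p for (a, _), p in joint.items())
eb = sum(b * p for (_, b), p in joint.items())
eab = sum(a * b * p for (a, b), p in joint.items())
print(pa, pb, ea, eb, eab)
```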

1996 Paper 1 Q13

I have a Penny Black stamp which I want to sell to my friend Jim, but we cannot agree a price. So I put the stamp under one of two cups, jumble them up, and let Jim guess which one it is under. If he guesses correctly, I add a third cup, jumble them up, and let Jim guess again, adding another cup each time he guesses correctly. The price he pays for the stamp is \(\pounds N,\) where \(N\) is the number of cups present when Jim fails to guess correctly. Find \(\mathrm{P}(N=k)\). Show that \(\mathrm{E}(N)=\mathrm{e}\) and calculate \(\mathrm{Var}(N).\)


Solution: For \(k \geq 2\), \begin{align*} && \mathbb{P}(N = k) &= \mathbb{P}(\text{guesses }k-1\text{ times correctly then once wrong})\\ &&&= \frac12 \cdot \frac{1}{3} \cdots \frac{1}{k-1} \cdot \frac{k-1}{k} \\ &&&= \frac{k-1}{k!} \\ &&\mathbb{E}(N) &= \sum_{k=2}^\infty k \cdot \mathbb{P}(N=k) \\ &&&= \sum_{k=2}^{\infty} \frac{k(k-1)}{k!} \\ &&&= \sum_{k=2}^{\infty} \frac{1}{(k-2)!} \\ &&&= \sum_{k=0}^{\infty} \frac{1}{k!} = e \\ && \textrm{Var}(N) &= \mathbb{E}(N^2) - \mathbb{E}(N)^2 \\ && \mathbb{E}(N^2) &= \sum_{k=2}^{\infty} k^2 \mathbb{P}(N=k) \\ &&&= \sum_{k=2}^{\infty} \frac{k^2(k-1)}{k!} \\ &&&= \sum_{k=2}^{\infty} \frac{k}{(k-2)!} \\ &&&= \sum_{k=0}^{\infty} \frac{k+2}{k!} \\ &&&= \sum_{k=1}^{\infty} \frac{1}{(k-1)!} + 2 \sum_{k=0}^{\infty} \frac{1}{k!} = e + 2e = 3e \\ \Rightarrow && \textrm{Var}(N) &= 3e-e^2 \end{align*}
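The series converge very quickly, so partial sums confirm the answers numerically (illustrative Python sketch; 30 terms is an arbitrary but ample cutoff):

```python
# P(N = k) = (k-1)/k! for k >= 2; partial sums approximate the total
# probability, E(N) and Var(N) to well within double precision.
import math

def p(k):
    return (k - 1) / math.factorial(k)

ks = range(2, 31)
total = sum(p(k) for k in ks)
mean = sum(k * p(k) for k in ks)
second = sum(k * k * p(k) for k in ks)
var = second - mean ** 2
print(total, mean, var)
```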

1991 Paper 3 Q16

The random variables \(X\) and \(Y\) take integer values \(x\) and \(y\) respectively which are restricted by \(x\geqslant1,\) \(y\geqslant1\) and \(2x+y\leqslant2a\) where \(a\) is an integer greater than 1. The joint probability is given by \[ \mathrm{P}(X=x,Y=y)=c(2x+y), \] where \(c\) is a positive constant, within this region and zero elsewhere. Obtain, in terms of \(x,c\) and \(a,\) the marginal probability \(\mathrm{P}(X=x)\) and show that \[ c=\frac{6}{a(a-1)(8a+5)}. \] Show that when \(y\) is an even number the marginal probability \(\mathrm{P}(Y=y)\) is \[ \frac{3(2a-y)(2a+2+y)}{2a(a-1)(8a+5)} \] and find the corresponding expression when \(y\) is odd. Evaluate \(\mathrm{E}(Y)\) in terms of \(a\).
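No worked solution is given here, but the stated constant and even-\(y\) marginal can be checked mechanically (illustrative Python; the value \(a = 5\) is an arbitrary assumption):

```python
# Enumerate the region x >= 1, y >= 1, 2x + y <= 2a, verify that
# c = 6 / (a(a-1)(8a+5)) normalises the pmf, and check the even-y marginal.
from fractions import Fraction

a = 5
c = Fraction(6, a * (a - 1) * (8 * a + 5))
region = [(x, y) for x in range(1, a + 1) for y in range(1, 2 * a)
          if 2 * x + y <= 2 * a]
total = sum(c * (2 * x + y) for x, y in region)

def marginal_y(y0):
    return sum(c * (2 * x + y) for x, y in region if y == y0)

evens_ok = all(
    marginal_y(y) == Fraction(3 * (2 * a - y) * (2 * a + 2 + y),
                              2 * a * (a - 1) * (8 * a + 5))
    for y in range(2, 2 * a - 1, 2))
print(total, evens_ok)
```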

1990 Paper 3 Q15

An unbiased twelve-sided die has its faces marked \(A,A,A,B,B,B,B,B,B,B,B,B.\) In a series of throws of the die the first \(M\) throws show \(A,\) the next \(N\) throws show \(B\) and the \((M+N+1)\)th throw shows \(A\). Write down the probability that \(M=m\) and \(N=n\), where \(m\geqslant0\) and \(n\geqslant1.\) Find

  1. the marginal distributions of \(M\) and \(N\),
  2. the mean values of \(M\) and \(N\).
Investigate whether \(M\) and \(N\) are independent. Find the probability that \(N\) is greater than a given integer \(k\), where \(k\geqslant1,\) and find \(\mathrm{P}(N > M).\) Find also \(\mathrm{P}(N=M)\) and show that \(\mathrm{P}(N < M)=\frac{1}{52}.\)


Solution: \begin{align*} \mathbb{P}(M = m, N = n) &= \left ( \frac{3}{12} \right)^m \left ( \frac{9}{12} \right)^n \frac{3}{12} \\ &= \frac{3^n}{4^{m+n+1}} \end{align*}

  1. \begin{align*} \mathbb{P}(M = m) &= \sum_{n = 1}^{\infty} \mathbb{P}(M=m,N=n) \\ &= \sum_{n = 1}^{\infty} \frac{3^n}{4^{m+n+1}} \\ &= \frac{1}{4^{m+1}} \sum_{n = 1}^{\infty} \left ( \frac34\right)^n \\ &= \frac{1}{4^{m+1}} \frac{3/4}{1/4} \\ &= \frac{3}{4^{m+1}} \\ \\ \mathbb{P}(N = n) &= \sum_{m = 0}^{\infty} \mathbb{P}(M=m,N=n) \\ &= \sum_{m = 0}^{\infty} \frac{3^n}{4^{m+n+1}} \\ &= \frac{3^n}{4^{n+1}} \sum_{m = 0}^{\infty} \left ( \frac14\right)^m \\ &= \frac{3^n}{4^{n+1}} \frac{1}{3/4} \\ &= \frac{3^{n-1}}{4^{n}} \end{align*}
  2. \(M+1 \sim Geo(\frac34) \Rightarrow \mathbb{E}(M) = \frac43 -1 = \frac13\) \(N \sim Geo(\frac14) \Rightarrow \mathbb{E}(N) = 4\)
\(M,N\) are independent since \(\mathbb{P}(M = m, N =n ) = \mathbb{P}(M=m)\mathbb{P}(N=n)\) for all \(m, n\). \begin{align*} \mathbb{P}(N > k) &= \sum_{n=k+1}^{\infty} \mathbb{P}(N = n) \\ &= \sum_{n=k+1}^{\infty} \frac{3^{n-1}}{4^{n}} \\ &= \frac{3^k}{4^{k+1}} \sum_{n = 0}^{\infty} \left ( \frac34\right)^n \\ &= \frac{3^k}{4^{k+1}} \frac{1}{1/4} \\ &= \frac{3^k}{4^k} \end{align*} \begin{align*} \mathbb{P}(N > M) &= \sum_{m=0}^{\infty} \mathbb{P}(N > m) \mathbb{P}(M = m) \\ &= \sum_{m=0}^{\infty} \left (\frac34 \right)^m \frac{3}{4^{m+1}}\\ &=\sum_{m=0}^{\infty} \frac{3^{m+1}}{4^{2m+1}}\\ &= \frac{3}{4} \frac{1}{13/16} \\ &= \frac{12}{13} \\ \\ \mathbb{P}(N=M) &= \sum_{m=1}^{\infty} \mathbb{P}(N=m, M=m) \\ &= \sum_{m=1}^{\infty} \frac{3^m}{4^{2m+1}} \\ &= \frac{3}{64} \sum_{m=0}^{\infty} \left ( \frac{3}{16} \right)^m \\ &= \frac{3}{64} \frac{1}{13/16} \\ &= \frac{3}{52}\\ \\ \mathbb{P}(N < M) &= 1 - \mathbb{P}(N > M) - \mathbb{P}(N = M) \\ &= 1 - \frac{12}{13} - \frac{3}{52} \\ &= 1 - \frac{48}{52} - \frac{3}{52} \\ &= \frac{1}{52} \end{align*}
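The tail probabilities can be checked by truncating the joint pmf (a Python sketch; the cutoff 150 is an arbitrary assumption, large enough that the geometric tails are negligible):

```python
# Sum P(M = m, N = n) = 3^n / 4^(m+n+1) over a large truncated grid to
# confirm P(N < M) = 1/52, P(N = M) = 3/52 and P(N > M) = 12/13.
K = 150

def joint(m, n):
    return 3.0 ** n / 4.0 ** (m + n + 1)

p_less = sum(joint(m, n) for m in range(K) for n in range(1, m))
p_equal = sum(joint(m, m) for m in range(1, K))
p_greater = sum(joint(m, n) for m in range(K) for n in range(m + 1, K + 1))
print(p_less, p_equal, p_greater)
```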

1988 Paper 2 Q15

An examination consists of several papers, which are marked independently. The mark given for each paper can be an integer from \(0\) to \(m\) inclusive, and the total mark for the examination is the sum of the marks on the individual papers. In order to make the examination completely fair, the examiners decide to allocate the mark for each paper at random, so that the probability that any given candidate will be allocated \(k\) marks \((0\leqslant k\leqslant m)\) for a given paper is \((m+1)^{-1}\). If there are just two papers, show that the probability that a given candidate will receive a total of \(n\) marks is \[ \frac{2m-n+1}{\left(m+1\right)^{2}} \] for \(m< n\leqslant2m\), and find the corresponding result for \(0\leqslant n\leqslant m\). If the examination consists of three papers, show that the probability that a given candidate will receive a total of \(n\) marks is \[ \frac{6mn-2n^{2}-3m^{2}+3m+2}{2\left(m+1\right)^{3}} \] in the case \(m< n\leqslant2m\). Find the corresponding result for \(0\leqslant n\leqslant m\), and deduce the result for \(2m< n\leqslant3m\).


Solution: In order to receive \(n\) marks over the two papers, where \(m < n \leq 2m\), the student must receive \(k\) and \(n-k\) marks on the two papers. Since \(n > m\), \(n-k\) is a valid mark when \(n-k \leq m\), ie when \(n-m\leq k\), therefore the probability is: \begin{align*} \sum_{k = n-m}^m \mathbb{P}(\text{scores }k\text{ and }n-k) &= \sum_{k=n-m}^m \frac{1}{(m+1)^2} \\ &= \frac{m-(n-m-1)}{(m+1)^2} \\ &= \frac{2m-n+1}{(m+1)^2} \end{align*} If \(0 \leq n \leq m\) then we need the mark \(n-k\) on the second paper to be non-negative, ie \(n-k \geq 0 \Rightarrow k \leq n\), so \begin{align*} \sum_{k = 0}^n \mathbb{P}(\text{scores }k\text{ and }n-k) &= \sum_{k = 0}^n \frac{1}{(m+1)^2} \\ &= \frac{n+1}{(m+1)^2} \end{align*} For three papers with \(m < n \leq 2m\), condition on the mark \(k\) on the first paper and apply the two-paper results to the remaining total \(n-k\) (which exceeds \(m\) exactly when \(k < n-m\)): \begin{align*} \mathbb{P}(\text{total }n) &= \frac{1}{m+1} \sum_{k=0}^m \mathbb{P}(\text{scores }n-k\text{ on the other two papers}) \\ &= \frac{1}{m+1}\l \sum_{k=0}^{n-m-1} \frac{2m-(n-k)+1}{(m+1)^2} +\sum_{k=n-m}^m \frac{n-k+1}{(m+1)^2}\r \\ &= \frac{1}{(m+1)^3}\l \frac{(n-m)(3m-n+1)}{2} + \frac{(2m-n+1)(n+2)}{2} \r \\ &= \frac{6mn-2n^2-3m^2+3m+2}{2(m+1)^3} \end{align*} since the first sum is the sum of the integers from \(2m-n+1\) to \(m\) and the second is the sum of the integers from \(n-m+1\) to \(m+1\).
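An exhaustive check of the two- and three-paper distributions (illustrative Python; \(m = 4\) is an arbitrary choice):

```python
# Enumerate all (m+1)^papers equally likely mark vectors, build the exact
# distribution of the total, and compare with the closed forms above.
from itertools import product
from fractions import Fraction

m = 4

def total_pmf(papers):
    counts = {}
    for marks in product(range(m + 1), repeat=papers):
        t = sum(marks)
        counts[t] = counts.get(t, 0) + 1
    return {t: Fraction(c, (m + 1) ** papers) for t, c in counts.items()}

two, three = total_pmf(2), total_pmf(3)
print(two, three)
```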