Problems


2024 Paper 2 Q12

In this question, you may use without proof the results \[ \sum_{i=1}^{n} i^2 = \tfrac{1}{6}n(n+1)(2n+1) \quad \text{and} \quad \sum_{i=1}^{n} i^3 = \tfrac{1}{4}n^2(n+1)^2. \] Throughout the question, \(n\) and \(k\) are integers with \(n \geqslant 3\) and \(k \geqslant 2\).

  1. In a game, \(k\) players, including Ada, are each given a random whole number from \(1\) to \(n\) (that is, for each player, each of these numbers is equally likely and assigned independently of all the others). A player wins the game if they are given a smaller number than all the other players, so there may be no winner in this game. Find an expression, in terms of \(n\), \(k\) and \(a\), for the probability that Ada is given number \(a\), where \(1 \leqslant a \leqslant n-1\), and all the other players are given larger numbers. Hence show that, if \(k = 4\), the probability that there is a winner in this game is \[ \frac{(n-1)^2}{n^2}\,. \]
  2. In a second game, \(k\) players, including Ada and Bob, are each given a random whole number from \(1\) to \(n\). A player wins the game if they are given a smaller number than all the other players or if they are given a larger number than all the other players, so it is possible for there to be zero, one or two winners in this game. Find an expression, in terms of \(n\), \(k\) and \(d\), for the probability that Ada is given number \(a\) and Bob is given number \(a + d + 1\), where \(1 \leqslant d \leqslant n-2\) and \(1 \leqslant a \leqslant n - d - 1\), and all the other players are given numbers greater than \(a\) and less than \(a + d + 1\). Hence show that, if \(k = 4\), the probability that there are two winners in this game is \[ \frac{(n-2)(n-1)^2}{n^3}\,. \] If \(k = 4\), what is the minimum value of \(n\) for which there are more likely to be exactly two winners than exactly one winner in this game?


Solution:

  1. If Ada is given \(a\), she wins precisely when the other \(k-1\) players all get a number between \(a+1\) and \(n\). Since the players' numbers are independent, \begin{align*} && \mathbb{P}(\text{Ada wins with }a) &= \left ( \frac{n-a}{n} \right)^{k-1} \end{align*} so the probability that Ada is given \(a\) and all the other players are given larger numbers is \(\frac1n \left( \frac{n-a}{n} \right)^{k-1}\). With \(k = 4\), \begin{align*} && \mathbb{P}(\text{Ada wins}) &= \sum_{a=1}^{n-1} \mathbb{P}(\text{Ada wins with }a) \mathbb{P}(\text{Ada has }a) \\ &&&= \sum_{a=1}^{n-1}\frac{1}{n}\left ( \frac{n-a}{n} \right)^{3}\\ &&&= \frac{1}{n^4} \sum_{a=1}^{n-1} (n-a)^3 \\ &&&= \frac{1}{n^4} \sum_{a=1}^{n-1} a^3 \\ &&&= \frac{1}{n^4} \tfrac14(n-1)^2n^2 \\ &&&= \frac{(n-1)^2}{4n^2} \end{align*} Since the game is symmetric, each player is equally likely to win, and at most one player can win, so the probability that there is a winner is \(4 \cdot \frac{(n-1)^2}{4n^2} = \frac{(n-1)^2}{n^2}\).
  2. The probability that Ada gets \(a\), Bob gets \(a+d+1\) and the other players are in between is \begin{align*} && \mathbb{P}(\text{event}) &= \mathbb{P}(\text{Ada gets }a) \mathbb{P}(\text{Bob gets }a+d+1) \mathbb{P}(\text{everyone else between}) \\ &&&= \frac1{n^2} \cdot \left ( \frac{d}{n} \right) ^{k-2} \end{align*} Therefore, with \(k = 4\), the probability that Ada and Bob jointly win with Ada lowest and Bob highest is \begin{align*} && \mathbb{P}(\text{Ada and Bob win}) &= \sum_{d=1}^{n-2} \sum_{a=1}^{n-d-1} \frac{1}{n^4} d^2 \\ &&&= \frac{1}{n^4} \sum_{d=1}^{n-2} (n-1-d) d^2 \\ &&&= \frac{n-1}{n^4} \frac{(n-2)(n-1)(2n-3)}{6} - \frac{1}{n^4} \frac{(n-2)^2(n-1)^2}{4} \\ &&&= \frac{(n-1)^2(n-2)}{12n^4} \left ( 2(2n-3)-3(n-2) \right) \\ &&&= \frac{(n-1)^2(n-2)}{12n^3} \\ \end{align*} There are \(4\) ways to choose the lowest player and \(3\) remaining ways to choose the highest, so the probability that there are two winners is \(12 \cdot \frac{(n-1)^2(n-2)}{12n^3} = \frac{(n-2)(n-1)^2}{n^3}\). Let \(P_L\) and \(P_H\) be the probabilities that there is a lowest winner and a highest winner respectively; by part 1 and symmetry, \(P_L = P_H = \frac{(n-1)^2}{n^2}\). Writing \(P_2\) for the probability of exactly two winners, and noting that exactly one winner means a lowest winner without a highest winner or vice versa, \begin{align*} && P_1 &= P_L-P_2+P_H-P_2 \\ \end{align*} This is less than \(P_2\) iff \begin{align*} && P_L+P_H-2P_2 &< P_2 \\ \Leftrightarrow && 2P_L & < 3P_2 \\ \Leftrightarrow && \frac{2(n-1)^2}{n^2} &< \frac{3(n-1)^2(n-2)}{n^3} \\ \Leftrightarrow && 2n&<3(n-2) \\ \Leftrightarrow && 6 &< n \end{align*} So the minimum value is \(n = 7\) (at \(n = 6\) the two probabilities are exactly equal).
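As a sanity check (an illustrative sketch of ours, not part of the original solution; all helper names are our own), a short Monte Carlo simulation of the second game with \(k = 4\) reproduces these probabilities and the crossover after \(n = 6\):

```python
import random

def winners(n, k=4):
    """Count the winners (0, 1 or 2) in one play of the second game."""
    rolls = [random.randint(1, n) for _ in range(k)]
    lo, hi = min(rolls), max(rolls)
    w = 0
    if rolls.count(lo) == 1:                 # unique smallest number
        w += 1
    if hi != lo and rolls.count(hi) == 1:    # unique largest number
        w += 1
    return w

random.seed(0)
for n in (6, 7, 8):
    trials = 200_000
    counts = [0, 0, 0]
    for _ in range(trials):
        counts[winners(n)] += 1
    exact_two = (n - 2) * (n - 1) ** 2 / n ** 3
    print(f"n={n}: P(one) ~ {counts[1] / trials:.4f}, "
          f"P(two) ~ {counts[2] / trials:.4f} (exact {exact_two:.4f})")
```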

2021 Paper 3 Q11

The continuous random variable \(X\) has probability density function \[ f(x) = \begin{cases} \lambda e^{-\lambda x} & \text{for } x \geqslant 0, \\ 0 & \text{otherwise,} \end{cases} \] where \(\lambda\) is a positive constant. The random variable \(Y\) is the greatest integer less than or equal to \(X\), and \(Z = X - Y\).

  1. Show that, for any non-negative integer \(n\), \[ \mathrm{P}(Y = n) = (1 - e^{-\lambda})\,e^{-n\lambda}. \]
  2. Show that \[ \mathrm{P}(Z < z) = \frac{1 - e^{-\lambda z}}{1 - e^{-\lambda}} \qquad \text{for } 0 \leqslant z \leqslant 1. \]
  3. Evaluate \(\mathrm{E}(Z)\).
  4. Obtain an expression for \[ \mathrm{P}(Y = n \text{ and } z_1 < Z < z_2), \] where \(0 \leqslant z_1 < z_2 \leqslant 1\) and \(n\) is a non-negative integer. Determine whether \(Y\) and \(Z\) are independent.


Solution:

  1. \(\,\) \begin{align*} && \mathbb{P}(Y = n) &= \mathbb{P}(X \in [n, n+1)) \\ &&&= \int_n^{n+1} \lambda e^{-\lambda x} \d x \\ &&&= \left [-e^{-\lambda x} \right]_n^{n+1} \\ &&&= e^{-\lambda n} - e^{-\lambda(n+1)} \\ &&&= e^{-\lambda n}(1- e^{-\lambda}) \end{align*}
  2. \(\,\) \begin{align*} && \mathbb{P}(Z < z) &= \sum_{n=0}^{\infty} \mathbb{P}(X \in [n, n+z)) \\ &&&= \sum_{n=0}^{\infty} \int_{n}^{n+z} \lambda e^{-\lambda x} \d x \\ &&&= \sum_{n=0}^{\infty} [-e^{-\lambda x}]_{n}^{n+z} \\ &&&= \sum_{n=0}^{\infty} (1-e^{-\lambda z})e^{-\lambda n} \\ &&&= \frac{1-e^{-\lambda z}}{1-e^{-\lambda}} \end{align*}
  3. Given the cdf of \(Z\), we see that \(f_Z(z) = \frac{\lambda e^{-\lambda z}}{1-e^{-\lambda}}\) so \begin{align*} && \E[Z] &= \int_0^1 z \frac{\lambda e^{-\lambda z}}{1-e^{-\lambda}} \d z \\ &&&= \frac{\lambda}{1-e^{-\lambda}} \int_0^1 ze^{-\lambda z} \d z \\ &&&= \frac{\lambda}{1-e^{-\lambda}} \left ( \left [-\frac{1}{\lambda} ze^{-\lambda z} \right]_0^1+\int_0^1 \frac{1}{\lambda} e^{-\lambda z} \d z \right) \\ &&&= \frac{\lambda}{1-e^{-\lambda}} \left ( -\frac{e^{-\lambda}}{\lambda} + \frac{1-e^{-\lambda}}{\lambda^2} \right) \\ &&&= \frac{1-e^{-\lambda}(1+\lambda)}{\lambda (1-e^{-\lambda})} \end{align*}
  4. \(\,\) \begin{align*} && \mathbb{P}(Y = n \text{ and }z_1 < Z < z_2)&= \mathbb{P}(X \in (n+z_1, n+z_2) ) \\ &&&= \int_{n+z_1}^{n+z_2} \lambda e^{-\lambda x} \d x \\ &&&= e^{-n\lambda}(e^{-\lambda z_1} - e^{-\lambda z_2}) \end{align*} Note that \(\mathbb{P}(z_1 < Z < z_2) = \mathbb{P}( Z < z_2) -\mathbb{P}(Z< z_1) =\frac{e^{-\lambda z_1} - e^{-\lambda z_2}}{1-e^{-\lambda}}\) Therefore \begin{align*} && \mathbb{P}(Y = n \text{ and }z_1 < Z < z_2) &= e^{-n\lambda}(e^{-\lambda z_1} - e^{-\lambda z_2}) \\ &&&= e^{-\lambda n}(1-e^{-\lambda}) \frac{e^{-\lambda z_1} - e^{-\lambda z_2}}{1-e^{-\lambda}} \\ &&&= \mathbb{P}(Y=n) \mathbb{P}(z_1 < Z < z_2) \end{align*} So they are independent, which is to be expected from the memorylessness property of the exponential distribution.
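The formulas for \(\mathrm{P}(Y = n)\) and \(\E(Z)\) are easy to sanity-check by simulation. The sketch below (our own, not part of the solution, using an arbitrary \(\lambda\)) relies only on the standard library:

```python
import math
import random

random.seed(1)
lam = 0.7
samples = [random.expovariate(lam) for _ in range(200_000)]
ys = [math.floor(x) for x in samples]          # Y = greatest integer <= X
zs = [x - math.floor(x) for x in samples]      # Z = fractional part of X

# P(Y = n) should equal (1 - e^{-lam}) e^{-n lam}
for n in range(3):
    est = ys.count(n) / len(ys)
    exact = (1 - math.exp(-lam)) * math.exp(-n * lam)
    print(f"P(Y={n}): {est:.4f} vs {exact:.4f}")

# E(Z) should equal (1 - e^{-lam}(1 + lam)) / (lam (1 - e^{-lam}))
mean_z = sum(zs) / len(zs)
exact_ez = (1 - math.exp(-lam) * (1 + lam)) / (lam * (1 - math.exp(-lam)))
print(f"E(Z): {mean_z:.4f} vs {exact_ez:.4f}")
```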

2021 Paper 3 Q12

  1. In a game, each member of a team of \(n\) players rolls a fair six-sided die. The total score of the team is the number of pairs of players rolling the same number. For example, if \(7\) players roll \(3, 3, 3, 3, 6, 6, 2\) the total score is \(7\), as six different pairs of players both score \(3\) and one pair of players both score \(6\). Let \(X_{ij}\), for \(1 \leqslant i < j \leqslant n\), be the random variable that takes the value \(1\) if players \(i\) and \(j\) roll the same number and the value \(0\) otherwise. Show that \(X_{12}\) is independent of \(X_{23}\). Hence find the mean and variance of the team's total score.
  2. Show that, if \(Y_i\), for \(1 \leqslant i \leqslant m\), are random variables with mean zero, then \[ \mathrm{Var}(Y_1 + Y_2 + \cdots + Y_m) = \sum_{i=1}^{m} \mathrm{E}(Y_i^2) + 2\sum_{i=1}^{m-1}\sum_{j=i+1}^{m} \mathrm{E}(Y_i Y_j). \]
  3. In a different game, each member of a team of \(n\) players rolls a fair six-sided die. The total score of the team is the number of pairs of players rolling the same even number minus the number of pairs of players rolling the same odd number. For example, if \(7\) players roll \(3, 3, 3, 3, 6, 6, 2\) the total score is \(-5\). Let \(Z_{ij}\), for \(1 \leqslant i < j \leqslant n\), be the random variable that takes the value \(1\) if players \(i\) and \(j\) roll the same even number, the value \(-1\) if players \(i\) and \(j\) roll the same odd number and the value \(0\) otherwise. Show that \(Z_{12}\) is not independent of \(Z_{23}\). Find the mean of the team's total score and show that the variance of the team's total score is \(\dfrac{1}{36}n(n^2 - 1)\).


Solution:

  1. First note that \(\mathbb{P}(X_{ij} = 1) = \frac16\): whatever player \(i\) rolls, player \(j\) matches it with probability \(\frac16\). \begin{align*} && \mathbb{P}(X_{12} = 1, X_{23} = 1) &= \mathbb{P}(1, 2\text{ and }3\text{ all roll the same})\\ &&&= \frac{6}{6^3}= \frac1{6^2} \\ &&&= \mathbb{P}(X_{12} = 1)\mathbb{P}(X_{23} = 1) \\ && \mathbb{P}(X_{12} = 1, X_{23} = 0) &= \mathbb{P}(1, 2\text{ roll the same and }3\text{ rolls differently}) \\ &&&= \frac{6 \cdot 1 \cdot 5}{6^3} = \frac{5}{6^2} \\ &&&= \mathbb{P}(X_{12} = 1)\mathbb{P}(X_{23} = 0) \\ && \mathbb{P}(X_{12} = 0, X_{23} = 0) &= \mathbb{P}(2\text{ rolls differently from }1\text{ and }3\text{ rolls differently from }2)\\ &&&= \frac{6 \cdot 5 \cdot 5}{6^3}= \frac{5^2}{6^2} \\ &&&= \mathbb{P}(X_{12} = 0)\mathbb{P}(X_{23} = 0) \end{align*} Therefore they are independent (the remaining case, \(\mathbb{P}(X_{12} = 0, X_{23} = 1)\), follows by symmetry from the second). Note that the score is \(S = \sum_{i < j} X_{ij}\), so \begin{align*} && \E[S] &= \E \left [ \sum_{i < j} X_{ij} \right] \\ &&&= \sum_{i < j} \E \left [ X_{ij} \right] \\ &&&= \binom{n}{2} \frac16 = \frac{n(n-1)}{12} \\ \\ && \var[S] &= \var \left [ \sum_{i < j} X_{ij} \right] \\ &&&= \sum_{i < j} \var \left [X_{ij} \right] \tag{pairwise ind.} \\ &&&= \binom{n}{2} \frac{5}{36} = \frac{5n(n-1)}{72} \end{align*} using \(\var[X_{ij}] = \frac16 - \frac1{36} = \frac{5}{36}\); pairwise independence makes all the covariances vanish.
  2. Since each \(Y_i\) has mean zero, \(\var(Y_1+\cdots+Y_m) = \E\left[(Y_1+\cdots+Y_m)^2\right]\), and expanding the square gives \(\left(\sum_i Y_i\right)^2 = \sum_{i=1}^m Y_i^2 + 2\sum_{i=1}^{m-1}\sum_{j=i+1}^m Y_iY_j\); taking expectations term by term gives the result.
  3. Note that \(\mathbb{P}(Z_{ij} = 1)=\mathbb{P}(Z_{ij} = -1) = \frac{3}{6^2} = \frac{1}{12}\), but \(\mathbb{P}(Z_{12} = 1, Z_{23} = -1) = 0 \neq \frac{1}{12} \cdot \frac{1}{12}\), so \(Z_{12}\) is not independent of \(Z_{23}\). Notice that \(Z_{12}Z_{23}\) is either \(1\) or \(0\) (since player \(2\)'s roll can't be both odd and even), and \(\mathbb{P}(Z_{12}Z_{23} = 1) = \mathbb{P}(1, 2\text{ and }3\text{ all roll the same}) = \frac{1}{36}\); the same holds for any two \(Z\)'s sharing a player. Also \(Z_{ij}\) and \(Z_{kl}\) are independent when \(i, j, k, l\) are all distinct. Writing \(T = \sum_{i<j} Z_{ij}\), each \(Z_{ij}\) has mean \(\frac1{12} - \frac1{12} = 0\), so \begin{align*} && \E[T] &= \sum_{i < j}\E \left [ Z_{ij} \right] = 0 \\ \\ && \E[T^2] &= \E \left [ \left ( \sum_{i < j} Z_{ij} \right)^2 \right] \\ &&&= \sum_{i < j} \E \left[ Z_{ij}^2 \right] + 2 \sum_{\text{sharing a player}} \E \left[ Z_{ij}Z_{jk} \right] + \sum_{\text{disjoint}} \E \left[ Z_{ij} \right] \E \left[ Z_{kl} \right] \\ &&&= \binom{n}{2} \frac{1}{6} + 2 \cdot \frac{n(n-1)(n-2)}{2} \cdot \frac{1}{36} + 0 \\ &&&= \frac{n(n-1)}{12} + \frac{n(n-1)(n-2)}{36} \\ &&&= \frac{n(n-1)[3 + (n-2)]}{36} \\ &&&= \frac{n(n^2-1)}{36} \end{align*} Here \(\E[Z_{ij}^2] = \mathbb{P}(|Z_{ij}| = 1) = \frac16\), and there are \(3\binom{n}{3} = \frac{n(n-1)(n-2)}{2}\) unordered pairs of \(Z\)'s sharing exactly one player. Since \(\E[T] = 0\), the mean of the total score is \(0\) and \(\var(T) = \E[T^2] = \frac{1}{36}n(n^2-1)\), as required.
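A quick simulation (an illustrative sketch of ours, not part of the solution) agrees with the mean \(0\) and variance \(\frac{1}{36}n(n^2-1)\) of the signed score:

```python
import random
from statistics import mean, pvariance

def team_score(n):
    """Signed score: pairs agreeing on an even number minus pairs on an odd one."""
    rolls = [random.randint(1, 6) for _ in range(n)]
    score = 0
    for i in range(n):
        for j in range(i + 1, n):
            if rolls[i] == rolls[j]:
                score += 1 if rolls[i] % 2 == 0 else -1
    return score

random.seed(2)
n = 7
scores = [team_score(n) for _ in range(100_000)]
print(f"mean ~ {mean(scores):.3f} (exact 0)")
print(f"var  ~ {pvariance(scores):.3f} (exact {n * (n * n - 1) / 36:.3f})")
```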

2020 Paper 3 Q12

\(A\) and \(B\) both toss the same biased coin. The probability that the coin shows heads is \(p\), where \(0 < p < 1\), and the probability that it shows tails is \(q = 1 - p\). Let \(X\) be the number of times \(A\) tosses the coin until it shows heads. Let \(Y\) be the number of times \(B\) tosses the coin until it shows heads.

  1. The random variable \(S\) is defined by \(S = X + Y\) and the random variable \(T\) is the maximum of \(X\) and \(Y\). Find an expression for \(\mathrm{P}(S = s)\) and show that \[ \mathrm{P}(T = t) = pq^{t-1}(2 - q^{t-1} - q^t). \]
  2. The random variable \(U\) is defined by \(U = |X - Y|\), and the random variable \(W\) is the minimum of \(X\) and \(Y\). Find expressions for \(\mathrm{P}(U = u)\) and \(\mathrm{P}(W = w)\).
  3. Show that \(\mathrm{P}(S = 2 \text{ and } T = 3) \neq \mathrm{P}(S = 2) \times \mathrm{P}(T = 3)\).
  4. Show that \(U\) and \(W\) are independent, and show that no other pair of the four variables \(S\), \(T\), \(U\) and \(W\) are independent.
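No solution is recorded for this problem, but the stated formula for \(\mathrm{P}(T = t)\) in part 1 can be sanity-checked numerically. The sketch below (our own; all helper names are illustrative) simulates the two geometric waiting times:

```python
import random

def tosses_until_heads(p):
    """Number of tosses of a biased coin until the first head."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

random.seed(3)
p, trials = 0.3, 200_000
q = 1 - p
counts = {}
for _ in range(trials):
    t = max(tosses_until_heads(p), tosses_until_heads(p))   # T = max(X, Y)
    counts[t] = counts.get(t, 0) + 1
for t in (1, 2, 3):
    exact = p * q ** (t - 1) * (2 - q ** (t - 1) - q ** t)
    print(f"P(T={t}): {counts.get(t, 0) / trials:.4f} vs {exact:.4f}")
```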

2017 Paper 3 Q12

The discrete random variables \(X\) and \(Y\) can each take the values \(1\), \(\ldots\,\), \(n\) (where \(n\ge2\)). Their joint probability distribution is given by \[ \P(X=x, \ Y=y) = k(x+y) \,, \] where \(k\) is a constant.

  1. Show that \[ \P(X=x) = \dfrac{n+1+2x}{2n(n+1)}\,. \] Hence determine whether \(X\) and \(Y\) are independent.
  2. Show that the covariance of \(X\) and \(Y\) is negative.


Solution:

  1. \(\,\) \begin{align*} && \mathbb{P}(X = x) &= \sum_{y=1}^n \mathbb{P}(X=x,Y=y) \\ &&&= \sum_{y=1}^n k(x+y) \\ &&&= nkx + k\frac{n(n+1)}2 \\ \\ && 1 &= \sum_{x=1}^n \mathbb{P}(X=x) \\ &&&= nk\frac{n(n+1)}{2} + kn\frac{n(n+1)}2 \\ &&&= kn^2(n+1) \\ \Rightarrow && k &= \frac{1}{n^2(n+1)} \\ \Rightarrow && \mathbb{P}(X = x) &= \frac{nx}{n^2(n+1)} + \frac{n(n+1)}{2n^2(n+1)} \\ &&&= \frac{n+1+2x}{2n(n+1)} \\ \\ && \mathbb{P}(X=x)\mathbb{P}(Y=y) &= \frac{(n+1)^2+2(n+1)(x+y)+4xy}{4n^2(n+1)^2} \\ &&&\neq \frac{x+y}{n^2(n+1)} \end{align*} (for instance, the two sides differ at \(x = y = 1\) for every \(n \geq 2\)), so \(X\) and \(Y\) are not independent.
  2. \(\,\) \begin{align*} && \E[X] &= \sum_{x=1}^n x \mathbb{P}(X=x) \\ &&&= \sum_{x=1}^n x \frac{n+1+2x}{2n(n+1)} \\ &&&= \frac{1}{2n(n+1)} \left ( (n+1) \sum x + 2\sum x^2\right)\\ &&&= \frac{1}{2n(n+1)} \left ( \frac{n(n+1)^2}{2} + \frac{n(n+1)(2n+1)}{3} \right) \\ &&&= \frac{1}{2} \left ( \frac{n+1}{2} + \frac{2n+1}{3} \right)\\ &&&= \frac{7n+5}{12} \\ \\ && \textrm{Cov}(X,Y) &= \mathbb{E}\left[XY\right] - \E[X] \E[Y] \\ &&&= \sum_{x=1}^n \sum_{y=1}^n xy \frac{x+y}{n^2(n+1)} - \E[X]^2 \\ &&&= \frac{1}{n^2(n+1)} \sum_x \sum_y (x^2 y+xy^2) - \E[X]^2 \\ &&&= \frac{2}{n^2(n+1)} \left (\sum x^2 \right )\left (\sum x\right ) - \E[X]^2 \\ &&&=\frac{(n+1)(2n+1)}{6} - \left ( \frac{7n+5}{12}\right)^2 \\ &&&= \frac1{144} \left (24(2n^2+3n+1) - (49n^2+70n+25) \right)\\ &&&= -\frac{(n-1)^2}{144} \\ &&& < 0 \end{align*} since \(n \geq 2\). (By symmetry \(\E[Y] = \E[X]\), and \(\sum_x\sum_y x^2y = \sum_x\sum_y xy^2 = \left(\sum x^2\right)\left(\sum x\right)\), which gives the factor of \(2\).)
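Since the covariance reduces to the closed form \(-\frac{(n-1)^2}{144}\), it can be checked directly from the joint pmf. The following sketch of ours (not part of the solution) does exactly that:

```python
def covariance(n):
    """Cov(X, Y) computed directly from P(X=x, Y=y) = k(x+y)."""
    k = 1 / (n * n * (n + 1))          # normalising constant found in part 1
    pairs = [(x, y) for x in range(1, n + 1) for y in range(1, n + 1)]
    ex = sum(x * k * (x + y) for x, y in pairs)       # E(X); E(Y) equal by symmetry
    exy = sum(x * y * k * (x + y) for x, y in pairs)  # E(XY)
    return exy - ex * ex

for n in (2, 3, 10, 50):
    print(f"n={n}: Cov = {covariance(n):.6f}, "
          f"formula -(n-1)^2/144 = {-(n - 1) ** 2 / 144:.6f}")
```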

2016 Paper 1 Q13

An internet tester sends \(n\) e-mails simultaneously at time \(t=0\). Their arrival times at their destinations are independent random variables each having probability density function \(\lambda \e^{-\lambda t}\) (\(0\le t<\infty\), \( \lambda >0\)).

  1. The random variable \(T\) is the time of arrival of the e-mail that arrives first at its destination. Show that the probability density function of \(T\) is \[ n \lambda \e^{-n\lambda t}\,,\] and find the expected value of \(T\).
  2. Write down the probability that the second e-mail to arrive at its destination arrives later than time \(t\) and hence derive the density function for the time of arrival of the second e-mail. Show that the expected time of arrival of the second e-mail is \[ \frac{1}{\lambda} \left( \frac1{n-1} + \frac 1 n \right) \]


Solution:

  1. \(\,\) \begin{align*} && \mathbb{P}(T > t) &= \mathbb{P}(\text{all emails slower than }t) \\ &&&= \left ( \int_t^{\infty} \lambda e^{-\lambda x} \d x \right)^n \\ &&&= \left ( [- e^{-\lambda x}]_t^\infty\right)^n\\ &&&= e^{-n\lambda t} \\ \Rightarrow && f_T(t) &= n \lambda e^{-n\lambda t} \\ \end{align*} Therefore \(T \sim \text{Exp}(n \lambda)\) and \(\E[T] = \frac{1}{n \lambda}\)
  2. Let \(T_2\) be the time until the second email arrives. Then \begin{align*} && \P(T_2 > t) &= \P(\text{all emails} > t) + \P(\text{exactly one email} \le t) \\ &&&= e^{-n\lambda t} + n \cdot e^{-(n-1)\lambda t}(1-e^{-\lambda t}) \\ &&&= (1-n)e^{-n\lambda t} + n \cdot e^{-(n-1)\lambda t} \\ \Rightarrow && f_{T_2}(t) &= - \left ( n(n-1) \lambda e^{-n \lambda t} -n(n-1)\lambda e^{-(n-1)\lambda t} \right) \\ &&&= n(n-1) \lambda \left (e^{-(n-1)\lambda t} - e^{-n\lambda t} \right) \\ \Rightarrow && \E[T_2] &= \int_0^{\infty} t \cdot n(n-1) \lambda \left (e^{-(n-1)\lambda t} - e^{-n\lambda t} \right) \d t \\ &&&= \int_0^{\infty} \left (n \cdot t (n-1) \lambda e^{-(n-1)\lambda t} -(n-1)\cdot tn \lambda e^{-n\lambda t} \right) \d t \\ &&&= \frac{n}{\lambda(n-1)} - \frac{n-1}{\lambda n} \\ &&&= \frac{1}{\lambda} \left (1+\frac{1}{n-1}- \left (1 - \frac{1}{n} \right) \right) \\ &&&= \frac{1}{\lambda} \left ( \frac{1}{n-1} + \frac{1}{n} \right) \end{align*} (We can also see this directly: \(\E[T_2]\) is the expected time of the first arrival plus the expected further wait for the first of the remaining \(n-1\) e-mails, and by the memorylessness property of the exponential distribution this is \(\frac{1}{n\lambda} + \frac{1}{(n-1)\lambda}\).)
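As an aside (not part of the solution), the expectation \(\frac{1}{\lambda}\left(\frac{1}{n-1}+\frac1n\right)\) can be checked by simulating the order statistics directly; the sketch below uses illustrative values of \(n\) and \(\lambda\):

```python
import random

random.seed(4)
n, lam, trials = 5, 2.0, 200_000
total = 0.0
for _ in range(trials):
    times = sorted(random.expovariate(lam) for _ in range(n))
    total += times[1]                      # arrival time of the second e-mail
print(f"E(T2) ~ {total / trials:.4f} "
      f"(exact {(1 / (n - 1) + 1 / n) / lam:.4f})")
```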

2013 Paper 2 Q12

The random variable \(U\) has a Poisson distribution with parameter \(\lambda\). The random variables \(X\) and \(Y\) are defined as follows. \begin{align*} X&= \begin{cases} U & \text{ if \(U\) is 1, 3, 5, 7, \(\ldots\,\)} \\ 0 & \text{ otherwise} \end{cases} \\ Y&= \begin{cases} U & \text{ if \(U\) is 2, 4, 6, 8, \(\ldots\,\) } \\ 0 & \text{ otherwise} \end{cases} \end{align*}

  1. Find \(\E(X)\) and \(\E(Y)\) in terms of \(\lambda\), \(\alpha\) and \(\beta\), where \[ \alpha = 1+\frac{\lambda^2}{2!}+\frac{\lambda^4}{4!} +\cdots\, \text{ \ \ and \ \ } \beta = \frac{\lambda}{1!} + \frac{\lambda^3}{3!} + \frac{\lambda^5}{5!} +\cdots\,. \]
  2. Show that \[ \var(X) = \frac{\lambda\alpha+\lambda^2\beta}{\alpha+\beta} - \frac{\lambda^2\alpha^2}{(\alpha+\beta)^2} \] and obtain the corresponding expression for \(\var(Y)\). Are there any non-zero values of \(\lambda\) for which \( \var(X) + \var(Y) = \var(X+Y)\,\)?


Solution:

  1. \begin{align*} \mathbb{E}(X) &= \sum_{r=1}^\infty r \mathbb{P}(X = r) \\ &= \sum_{j=1}^{\infty} (2j-1)\mathbb{P}(U=2j-1) \\ &= \sum_{j=1}^{\infty}(2j-1) \frac{e^{-\lambda} \lambda^{2j-1}}{(2j-1)!} \\ &= \sum_{j=1}^{\infty} e^{-\lambda} \frac{\lambda^{2j-1}}{(2j-2)!} \\ &= \lambda e^{-\lambda} \sum_{j=1}^{\infty} \frac{\lambda^{2j-2}}{(2j-2)!} \\ &= \lambda e^{-\lambda} \alpha \end{align*} Since \(X + Y = U\), \(\mathbb{E}(X) + \mathbb{E}(Y) = \lambda\), so \(\mathbb{E}(Y) = \lambda(1-e^{-\lambda}\alpha) = \lambda(e^{-\lambda}(\alpha+\beta) - e^{-\lambda}\alpha) = \lambda e^{-\lambda} \beta\). Equivalently, since \(\alpha + \beta = e^{\lambda}\), \(\mathbb{E}(X) = \frac{\lambda \alpha}{\alpha+\beta}\) and \(\mathbb{E}(Y) = \frac{\lambda \beta}{\alpha+\beta}\).
  2. \begin{align*} \textrm{Var}(X) &= \mathbb{E}(X^2) - [\mathbb{E}(X) ]^2 \\ &= \sum_{odd} r^2 \mathbb{P}(U = r) - \left [ \mathbb{E}(X) \right]^2 \\ &= \sum_{odd} (r(r-1)+r)\frac{e^{-\lambda}\lambda^r}{r!} - \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} \\ &= \sum_{odd} \frac{e^{-\lambda}\lambda^r}{(r-2)!}+\sum_{odd} \frac{e^{-\lambda}\lambda^r}{(r-1)!} - \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} \\ &= e^{-\lambda}\lambda^2 \beta + e^{-\lambda}\lambda \alpha - \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} \\ &= \frac{\lambda \alpha + \lambda^2 \beta}{\alpha+\beta}- \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} \end{align*} Similarly, \begin{align*} \textrm{Var}(Y) &= \mathbb{E}(Y^2) - [\mathbb{E}(Y) ]^2 \\ &= \sum_{even} r^2 \mathbb{P}(U = r) - \left [ \mathbb{E}(Y) \right]^2 \\ &= \sum_{even} (r(r-1)+r)\frac{e^{-\lambda}\lambda^r}{r!} - \frac{\lambda^2 \beta^2}{(\alpha+\beta)^2} \\ &= e^{-\lambda}\lambda^2\alpha + e^{-\lambda}\lambda \beta - \frac{\lambda^2 \beta^2}{(\alpha+\beta)^2} \\ &= \frac{\lambda \beta + \lambda^2 \alpha}{\alpha+\beta}- \frac{\lambda^2 \beta^2}{(\alpha+\beta)^2} \end{align*} Since \(\textrm{Var}(X+Y) = \textrm{Var}(U) = \lambda\), we are interested in solving: \begin{align*} \lambda &= \frac{\lambda \alpha + \lambda^2 \beta}{\alpha+\beta}- \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} + \frac{\lambda \beta + \lambda^2 \alpha}{\alpha+\beta}- \frac{\lambda^2 \beta^2}{(\alpha+\beta)^2} \\ &= \frac{\lambda(\alpha+\beta) + \lambda^2(\alpha+\beta)}{\alpha+\beta} - \frac{\lambda^2(\alpha^2+\beta^2)}{(\alpha+\beta)^2} \\ &= \lambda + \lambda^2 \frac{2\alpha\beta}{(\alpha+\beta)^2} \end{align*} which would require \(\frac{2\lambda^2\alpha\beta}{(\alpha+\beta)^2} = 0\). Since \(\alpha \geq 1\) and \(\beta > 0\) when \(\lambda > 0\), this is impossible: there are no such non-zero values of \(\lambda\).
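The series definitions give \(\alpha = \cosh\lambda\) and \(\beta = \sinh\lambda\), so the conclusion that \(\var(X)+\var(Y)\) exceeds \(\var(X+Y) = \lambda\) by \(\frac{2\lambda^2\alpha\beta}{(\alpha+\beta)^2}\) is easy to confirm numerically; a small check of ours (illustrative, not part of the solution):

```python
import math

lam = 1.3
alpha, beta = math.cosh(lam), math.sinh(lam)   # the even and odd series above
s = alpha + beta                               # equals e^lam
var_x = (lam * alpha + lam**2 * beta) / s - (lam * alpha / s) ** 2
var_y = (lam * beta + lam**2 * alpha) / s - (lam * beta / s) ** 2
gap = var_x + var_y - lam
print(f"Var(X) + Var(Y) - lambda             = {gap:.6f}")
print(f"2 lam^2 alpha beta / (alpha+beta)^2  = "
      f"{2 * lam**2 * alpha * beta / s**2:.6f}")
```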

2009 Paper 3 Q13

  1. The point \(P\) lies on the circumference of a circle of unit radius and centre \(O\). The angle, \(\theta\), between \(OP\) and the positive \(x\)-axis is a random variable, uniformly distributed on the interval \(0\le\theta<2\pi\). The cartesian coordinates of \(P\) with respect to \(O\) are \((X,Y)\). Find the probability density function for \(X\), and calculate \(\var (X)\). Show that \(X\) and \(Y\) are uncorrelated and discuss briefly whether they are independent.
  2. The points \(P_i\) (\(i=1\), \(2\), \(\ldots\) , \(n\)) are chosen independently on the circumference of the circle, as in part (i), and have cartesian coordinates \((X_i, Y_i)\). The point \(\overline P\) has coordinates \((\overline X, \overline Y)\), where \(\overline X =\dfrac1n \sum\limits _{i=1}^n X_i\) and \(\overline Y =\dfrac1n \sum\limits _{i=1}^n Y_i\). Show that \(\overline X\) and \(\overline Y\) are uncorrelated. Show that, for large \(n\), \(\displaystyle \P\left(\vert \overline X \vert \le \sqrt{\frac2n}\right)\approx 0.95\,\).


Solution:

  1. \(X = \cos \theta\), where \(\theta \sim U(0, 2\pi)\). Noting that \(\mathbb{P}(X \geq t ) = \frac{2}{2\pi}\cos^{-1} t\), we get \(f_X(t) = \frac{1}{\pi} \frac{1}{\sqrt{1-t^2}}\) for \(-1 \le t \le 1\). \begin{align*} && \E[X] &= 0 \tag{by symmetry} \\ && \E[X^2] &= \int_0^{2\pi} \cos^2 \theta \frac{1}{2 \pi} \d \theta \\ &&&= \frac{1}{2} \cdot 2\pi \cdot \frac{1}{2\pi} \\ &&&= \frac12 \\ \Rightarrow & &\var[X] &= \frac12 \\ \\ && \E[XY] &= \int_0^{2\pi} \cos \theta \sin \theta \frac{1}{2 \pi} \d \theta \\ &&&= \frac{1}{4\pi} \int_0^{2\pi} \sin 2\theta \d \theta \\ &&& =0 = \E[X]\E[Y] \end{align*} so \(X\) and \(Y\) are uncorrelated. But \(X\) and \(Y\) are clearly not independent, since given \(X\) the only possible values of \(Y\) are \(\pm\sqrt{1-X^2}\).
  2. \(\,\) \begin{align*} && \E \left [ \overline{X}\,\overline{Y} \right] &= \E \left [ \left ( \frac1n \sum_{i=1}^n X_i \right)\left ( \frac1n \sum_{j=1}^n Y_j\right) \right] \\ &&&= \frac{1}{n^2} \sum_{i=1}^n \sum_{j=1}^n \E [X_i Y_j] \\ &&&= 0 = \E[\overline{X}\,] \E[\overline{Y}\,] \end{align*} (For \(i = j\), \(\E[X_iY_i] = 0\) by part 1; for \(i \neq j\), \(X_i\) and \(Y_j\) are independent, so \(\E[X_iY_j] = \E[X_i]\E[Y_j] = 0\).) Therefore \(\overline{X}\) and \(\overline{Y}\) are uncorrelated. Note that \(\E[X_i] = 0\) and \(\var[X_i] = \frac12\), so by the central limit theorem \(\overline{X} \approx N(0, \frac{1}{2n})\) for large \(n\); in particular, with \(Z \sim N(0,1)\), \begin{align*} && 0.95 &\approx \mathbb{P}(|Z| < 2) \\ &&&= \mathbb{P} \left ( \Big |\frac{\overline{X}}{\sqrt{\frac{1}{2n}}} \Big | < 2 \right ) \\ &&&= \mathbb{P}\left (|\overline{X}| < \sqrt{\frac{2}{n}} \right) \end{align*}
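A direct simulation (an illustrative sketch of ours, not part of the solution) confirms the \(95\%\) claim for a moderately large \(n\):

```python
import math
import random

random.seed(5)
n, trials = 400, 20_000
hits = 0
for _ in range(trials):
    # X-bar for n points chosen uniformly on the unit circle
    xbar = sum(math.cos(random.uniform(0, 2 * math.pi)) for _ in range(n)) / n
    if abs(xbar) <= math.sqrt(2 / n):
        hits += 1
print(f"P(|X-bar| <= sqrt(2/n)) ~ {hits / trials:.3f} (should be about 0.95)")
```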

2006 Paper 1 Q12

Oxtown and Camville are connected by three roads, which are at risk of being blocked by flooding. On two of the three roads there are two sections which may be blocked. On the third road there is only one section which may be blocked. The probability that each section is blocked is \(p\). Each section is blocked independently of the other four sections. Show that the probability that Oxtown is cut off from Camville is \(p^3 \l 2-p \r^2\). I want to travel from Oxtown to Camville. I choose one of the three roads at random and find that my road is not blocked. Find the probability that I would not have reached Camville if I had chosen either of the other two roads. You should factorise your answer as fully as possible. Comment briefly on the value of this probability in the limit \(p\to1\).
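No solution is recorded here, but the claimed probability \(p^3(2-p)^2\) of being cut off can be verified by exhaustive enumeration of the five sections; a short check of ours (the road groupings follow the problem statement):

```python
from fractions import Fraction
from itertools import product

p = Fraction(1, 3)                           # any 0 < p < 1 works; exact arithmetic
cut_off = Fraction(0)
for s in product((0, 1), repeat=5):          # 1 means the section is blocked
    prob = Fraction(1)
    for b in s:
        prob *= p if b else 1 - p
    road_a = s[0] or s[1]                    # two-section road
    road_b = s[2] or s[3]                    # two-section road
    road_c = s[4]                            # one-section road
    if road_a and road_b and road_c:         # all three roads blocked
        cut_off += prob
print(cut_off, "vs", p**3 * (2 - p) ** 2)    # the two values agree
```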

2006 Paper 3 Q14

For any random variables \(X_1\) and \(X_2\), state the relationship between \(\E(aX_1+bX_2)\) and \(\E(X_1)\) and \(\E(X_2)\), where \(a\) and \(b\) are constants. If \(X_1\) and \(X_2\) are independent, state the relationship between \(\E(X_1X_2)\) and \(\E(X_1)\) and \(\E(X_2)\). An industrial process produces rectangular plates. The length and the breadth of the plates are modelled by independent random variables \(X_1\) and \(X_2\) with non-zero means \(\mu_1\) and \(\mu_2\) and non-zero standard deviations \(\sigma_1\) and \(\sigma_2\), respectively. Using the results in the paragraph above, and without quoting a formula for \(\var(aX_1+bX_2)\), find the means and standard deviations of the perimeter \(P\) and area \(A\) of the plates. Show that \(P\) and \(A\) are not independent. The random variable \(Z\) is defined by \(Z=P-\alpha A\), where \(\alpha \) is a constant. Show that \(Z\) and \(A\) are not independent if \[ \alpha \ne \dfrac{2(\mu_1^{\vphantom2} \sigma_2^2 +\mu_2^{\vphantom2}\sigma_1^2)} { \mu_1^2 \sigma_2^2 +\mu_2^2\sigma_1^2 + \sigma_1^2\sigma_2^2 } \;. \] Given that \(X_1\) and \(X_2\) can each take values 1 and 3 only, and that they each take these values with probability \(\frac 12\), show that \(Z\) and \(A\) are not independent for any value of \(\alpha\).


Solution: \(\E(aX_1+bX_2) = a \E(X_1) + b\E(X_2)\) for any \(X_1, X_2\), and \(\E(X_1X_2)=\E(X_1)\E(X_2)\) if \(X_1, X_2\) are independent. \begin{align*} && \E(P) &= \E(2(X_1+X_2)) = 2(\E[X_1]+\E[X_2]) \\ &&&= 2(\mu_1 + \mu_2) \\ && \var(P) &= \E[\left ( 2(X_1+X_2) \right)^2] - \E[2(X_1+X_2)]^2 \\ &&&= 4\E[X_1^2+2X_1X_2+X_2^2] -4(\mu_1 + \mu_2)^2 \\ &&&= 4(\mu_1^2 + \sigma_1^2 + 2\mu_1\mu_2 + \mu_2^2 + \sigma_2^2) - 4(\mu_1 + \mu_2)^2 \\ &&&= 4(\sigma_1^2+\sigma_2^2) \\ && \textrm{SD}(P) &= 2 \sqrt{\sigma_1^2+\sigma_2^2}\\ \\ && \E(A) &= \E[X_1X_2] = \E[X_1]\E[X_2] \\ &&&= \mu_1\mu_2 \\ && \var(A) &= \E[(X_1X_2)^2] - (\mu_1\mu_2)^2 \\ &&&= (\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) - (\mu_1\mu_2)^2\\ &&&= \mu_1^2 \sigma_2^2 + \mu_2^2 \sigma_1^2 + \sigma_1^2 \sigma_2^2\\ && \textrm{SD}(A) &= \sqrt{\mu_1^2 \sigma_2^2 + \mu_2^2 \sigma_1^2 + \sigma_1^2 \sigma_2^2} \end{align*} \begin{align*} \E[PA] &= \E[2(X_1+X_2)X_1X_2] \\ &= 2\E[X_1^2X_2] + 2\E[X_1X_2^2]\\ &= 2(\mu_1^2 + \sigma_1^2)\mu_2 + 2\mu_1 (\mu_2^2+\sigma_2^2)\\ &= \E[P]\E[A] + 2(\sigma_1^2\mu_2 + \sigma_2^2\mu_1) \\ &\neq \E[P]\E[A] \end{align*} since the plate dimensions are positive, so \(\sigma_1^2\mu_2 + \sigma_2^2\mu_1 > 0\); therefore \(P\) and \(A\) are not independent. \begin{align*} && \E[Z] &= \E[P] - \alpha \E[A] \\ &&&= 2(\mu_1+\mu_2) - \alpha \mu_1 \mu_2 \\ \\ && \E[ZA] &= \E[PA - \alpha A^2] \\ &&&= 2(\mu_1^2 + \sigma_1^2)\mu_2 + 2\mu_1 (\mu_2^2+\sigma_2^2) - \alpha \E[A^2] \\ &&&= 2(\mu_1^2 + \sigma_1^2)\mu_2 + 2\mu_1 (\mu_2^2+\sigma_2^2) - \alpha \E[X_1^2]\E[X_2^2] \\ &&&= 2(\mu_1^2 + \sigma_1^2)\mu_2 + 2\mu_1 (\mu_2^2+\sigma_2^2) - \alpha (\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) \\ \text{if ind.} && \E[Z]\E[A] &= \E[ZA]\\ && (2(\mu_1+\mu_2) - \alpha \mu_1 \mu_2) \mu_1\mu_2 &= 2(\mu_1^2 + \sigma_1^2)\mu_2 + 2\mu_1 (\mu_2^2+\sigma_2^2) - \alpha (\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) \\ \Rightarrow && 2(\mu_1^2\mu_2+\mu_1\mu_2^2) - \alpha \mu_1^2\mu_2^2 &= 2(\mu_1^2\mu_2+\mu_1\mu_2^2) + 2\sigma_1^2\mu_2 + 2\sigma_2^2\mu_1 - \alpha (\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) \\ \Rightarrow && \alpha ((\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) - \mu_1^2\mu_2^2) &= 2(\sigma_1^2\mu_2 + \sigma_2^2\mu_1) \\ \Rightarrow && \alpha &= \frac{ 2(\sigma_1^2\mu_2 + \sigma_2^2\mu_1) }{\mu_1^2 \sigma_2^2 + \mu_2^2 \sigma_1^2 + \sigma_1^2 \sigma_2^2} \end{align*} So independence of \(Z\) and \(A\) forces \(\alpha\) to take this value; hence \(Z\) and \(A\) are not independent whenever \(\alpha\) differs from it. \begin{array}{c|c|c|c|c|c} \text{prob.} & X_1 & X_2 & A & P & Z \\ \hline 0.25 & 1 & 1 & 1 & 4 & 4-\alpha \\ 0.25 & 1 & 3 & 3 & 8 & 8-3\alpha \\ 0.25 & 3 & 1 & 3 & 8 & 8-3\alpha \\ 0.25 & 3 & 3 & 9 & 12 & 12-9\alpha \\ \end{array} If \(\mathbb{P}(A = 1, Z = 4-\alpha) = \mathbb{P}(A = 1)\mathbb{P}(Z = 4-\alpha)\), then since \(\mathbb{P}(A = 1, Z = 4-\alpha) = \mathbb{P}(A = 1) = \frac14\) we would need \(\mathbb{P}(Z = 4-\alpha) = 1\), i.e. \(4-\alpha = 8-3\alpha = 12-9\alpha\). This is inconsistent: the first equality gives \(\alpha = 2\) and the second gives \(\alpha = \frac23\). So \(Z\) and \(A\) are not independent for any value of \(\alpha\).
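The final argument can also be checked mechanically. The sketch below (our own; it simply tests the independence identity over the four equally likely outcomes) confirms that no \(\alpha\), including the two candidate values \(2\) and \(\frac23\), makes \(Z\) and \(A\) independent:

```python
from fractions import Fraction

def dependent(alpha):
    """Return True if Z = P - alpha*A fails the independence test with A."""
    outcomes = [(x1, x2) for x1 in (1, 3) for x2 in (1, 3)]   # each has prob 1/4
    for a_val in {x1 * x2 for x1, x2 in outcomes}:
        for x1, x2 in outcomes:
            z_val = 2 * (x1 + x2) - alpha * x1 * x2
            p_joint = Fraction(sum(1 for y1, y2 in outcomes
                                   if y1 * y2 == a_val
                                   and 2 * (y1 + y2) - alpha * y1 * y2 == z_val), 4)
            p_a = Fraction(sum(1 for y1, y2 in outcomes if y1 * y2 == a_val), 4)
            p_z = Fraction(sum(1 for y1, y2 in outcomes
                               if 2 * (y1 + y2) - alpha * y1 * y2 == z_val), 4)
            if p_joint != p_a * p_z:
                return True
    return False

for alpha in (Fraction(2), Fraction(2, 3), Fraction(0), Fraction(1)):
    print(f"alpha = {alpha}: Z and A dependent? {dependent(alpha)}")
```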

2005 Paper 2 Q12

The twins Anna and Bella share a computer and never sign their e-mails. When I e-mail them, only the twin currently online responds. The probability that it is Anna who is online is \(p\) and she answers each question I ask her truthfully with probability \(a\), independently of all her other answers, even if a question is repeated. The probability that it is Bella who is online is~\(q\), where \(q=1-p\), and she answers each question truthfully with probability \(b\), independently of all her other answers, even if a question is repeated.

  1. I send the twins the e-mail: `Toss a fair coin and answer the following question. Did the coin come down heads?'. I receive the answer `yes'. Show that the probability that the coin did come down heads is \(\frac{1}{2}\) if and only if \(2(ap+bq)=1\).
  2. I send the twins the e-mail: `Toss a fair coin and answer the following question. Did the coin come down heads?'. I receive the answer `yes'. I then send the e-mail: `Did the coin come down heads?' and I receive the answer `no'. Show that the probability (taking into account these answers) that the coin did come down heads is \(\frac{1}{2}\,\).
  3. I send the twins the e-mail: `Toss a fair coin and answer the following question. Did the coin come down heads?'. I receive the answer `yes'. I then send the e-mail: `Did the coin come down heads?' and I receive the answer `yes'. Show that, if \(2(ap+bq)=1\), the probability (taking into account these answers) that the coin did come down heads is \(\frac{1}{2}\,\).

2004 Paper 3 Q13

A men's endurance competition has an unlimited number of rounds. In each round, a competitor has, independently, a probability \(p\) of making it through the round; otherwise, he fails the round. Once a competitor fails a round, he drops out of the competition; before he drops out, he takes part in every round. The grand prize is awarded to any competitor who makes it through a round which all the other remaining competitors fail; if all the remaining competitors fail at the same round the grand prize is not awarded. If the competition begins with three competitors, find the probability that:

  1. all three drop out in the same round;
  2. two of them drop out in round \(r\) (with \(r \ge 2\)) and the third in an earlier round;
  3. the grand prize is awarded.


Solution:

  1. This is the sum over all rounds of the probability that all three drop out in that round, i.e. \begin{align*} \mathbb{P}(\text{all drop in same round}) &= \sum_{k=0}^\infty \mathbb{P}(\text{all drop out in round }k+1) \\ &= \sum_{k=0}^{\infty}(p^k(1-p))^3 \\ &= (1-p)^3 \sum_{k=0}^{\infty}p^{3k} \\ &= \frac{(1-p)^3}{1-p^3} \\ &= \frac{(1-p)^2}{1+p+p^2} \end{align*}
  2. There are \(3\) ways to choose the player who drops out earlier, and that player can drop out in any of rounds \(1, \dots, r-1\) (round \(k+1\) below): \begin{align*} \mathbb{P}(\text{exactly two drop out in round }r\text{ and one before}) &= 3\sum_{k=0}^{r-2} (p^{r-1}(1-p))^2p^k(1-p) \\ &= 3p^{2r-2}(1-p)^3 \sum_{k=0}^{r-2}p^k \\ &= 3p^{2r-2}(1-p)^3 \frac{1-p^{r-1}}{1-p} \\ &= 3p^{2r-2}(1-p)^2(1-p^{r-1}) \end{align*}
  3. The grand prize is not awarded exactly when the last remaining competitors all fail in the same round: either all three drop out together (part 1), or one drops out first and the remaining two drop out together later (part 2, summed over \(r\)). For the latter, \begin{align*} \mathbb{P}(\text{two drop out together, after the third}) &= \sum_{r=2}^{\infty}\mathbb{P}(\text{exactly two drop out in round }r\text{ and one before}) \\ &= \sum_{r=2}^{\infty}3p^{2r-2}(1-p)^2(1-p^{r-1}) \\ &= 3(1-p)^2p^{-2}\sum_{r=2}^{\infty}(p^{2r}-p^{3r-1}) \\ &= 3(1-p)^2p^{-2} \left( \frac{p^4}{1-p^2} - \frac{p^5}{1-p^3} \right) \\ &= \frac{3(1-p)^2(p^2(1-p^3)-p^3(1-p^2))}{(1-p^2)(1-p^3)}\\ &= \frac{3(1-p)^3p^2}{(1-p^2)(1-p^3)} \end{align*} Therefore the probability that the grand prize is awarded is \begin{align*} P &= 1 - \frac{(1-p)^3}{1-p^3} - \frac{3(1-p)^3p^2}{(1-p^2)(1-p^3)} \\ &= \frac{(1-p^3)(1-p^2) - (1-p)^3(1-p^2)-3(1-p)^3p^2}{(1-p^2)(1-p^3)} \\ &= \frac{(1-p^3)(1-p^2) - (1-p)^3(1+2p^2)}{(1-p^2)(1-p^3)} \end{align*}
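A direct Monte Carlo simulation of the competition (an illustrative sketch of ours, with an arbitrary \(p\)) agrees with the final formula:

```python
import random

def prize_awarded(p):
    """Play one competition with three starters; True if the prize is awarded."""
    alive = 3
    while alive > 1:
        survivors = sum(1 for _ in range(alive) if random.random() < p)
        if survivors == 0:
            return False          # all remaining fail together: no prize
        alive = survivors
    return True                   # exactly one made it through a round

random.seed(6)
p, trials = 0.6, 200_000
q = 1 - p
est = sum(prize_awarded(p) for _ in range(trials)) / trials
exact = 1 - q**3 / (1 - p**3) - 3 * q**3 * p**2 / ((1 - p**2) * (1 - p**3))
print(f"P(grand prize) ~ {est:.4f} (formula {exact:.4f})")
```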

2002 Paper 3 Q13

A continuous random variable is said to have an exponential distribution with parameter \(\lambda\) if its density function is \(\f(t) = \lambda \e ^{- \lambda t} \; \l 0 \le t < \infty \r\,\). If \(X_1\) and \(X_2\), which are independent random variables, have exponential distributions with parameters \(\lambda_1\) and \(\lambda_2\) respectively, find an expression for the probability that either \(X_1\) or \(X_2\) (or both) is less than \(x\). Prove that if \(X\) is the random variable whose value is the lesser of the values of \(X_1\) and \(X_2\), then \(X\) also has an exponential distribution. Route A and Route B buses run from my house to my college. The time between buses on each route has an exponential distribution and the mean time between buses is 15 minutes for Route A and 30 minutes for Route B. The timings of the buses on the two routes are independent. If I emerge from my house one day to see a Route A bus and a Route B bus just leaving the stop, show that the median wait for the next bus to my college will be approximately 7 minutes.
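No solution is recorded here, but the final claim is quick to check numerically: the wait is the minimum of two independent exponentials with means \(15\) and \(30\) minutes, hence (by the result proved in the question) exponential with mean \(10\) minutes and median \(10\ln 2 \approx 6.93\). A simulation sketch of ours:

```python
import math
import random

random.seed(7)
# expovariate takes the rate, so means 15 and 30 give rates 1/15 and 1/30
waits = sorted(min(random.expovariate(1 / 15), random.expovariate(1 / 30))
               for _ in range(100_001))
print(f"simulated median ~ {waits[50_000]:.2f} min")
print(f"exact median = 10 ln 2 = {10 * math.log(2):.2f} min")
```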

2000 Paper 1 Q14

The random variable \(X\) is uniformly distributed on the interval \([-1,1]\). Find \(\E(X^2)\) and \(\var (X^2)\). A second random variable \(Y\), independent of \(X\), is also uniformly distributed on \([-1,1]\), and \(Z=Y-X\). Find \(\E(Z^2)\) and show that \(\var (Z^2) = 7 \var (X^2)\).


Solution: \(X \sim U(-1,1)\) \begin{align*} \E[X^2] &= \int_{-1}^1 \frac12 x^2 \, dx \\ &= \frac{1}{6} \left [ x^3 \right]_{-1}^1 \\ &= \frac{1}{3} \end{align*} \begin{align*} \E[X^4] &= \int_{-1}^1 \frac12 x^4 \, dx \\ &= \frac{1}{10} \left [ x^5 \right]_{-1}^1 \\ &= \frac{1}{5} \end{align*} \begin{align*} \var[X^2] &=\E[X^4] - \E[X^2]^2 \\ &= \frac{1}{5} - \frac{1}{9} \\ &= \frac{4}{45} \end{align*} \begin{align*} \E(Z^2) &= \E(Y^2 - 2XY+X^2) \\ &= \E(Y^2) - 2\E(X)\E(Y)+\E(X^2) \\ &= \frac{1}{3} - 0 + \frac{1}{3} \\ &= \frac{2}{3} \end{align*} \begin{align*} \E[Z^4] &= \E[Y^4 -4Y^3X+6Y^2X^2-4YX^3+X^4] \\ &= \E[Y^4]-4\E[Y^3]\E[X]+6\E[Y^2]\E[X^2]-4\E[Y]\E[X^3]+\E[X^4] \\ &= \frac{1}{5}+6 \cdot \frac{1}{3} \cdot \frac13 + \frac{1}{5} \\ &= \frac{2}{5} + \frac{2}{3} \\ &= \frac{16}{15} \end{align*} \begin{align*} \var(Z^2) &= \E(Z^4) - \E(Z^2)^2 \\ &= \frac{16}{15} - \frac{4}{9} \\ &= \frac{28}{45} \\ &= 7 \var(X^2) \end{align*}
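A short Monte Carlo check of both variances (our own sketch, not part of the solution):

```python
import random
from statistics import pvariance

random.seed(8)
xs = [random.uniform(-1, 1) for _ in range(200_000)]
ys = [random.uniform(-1, 1) for _ in range(200_000)]
print(f"Var(X^2) ~ {pvariance([x * x for x in xs]):.4f} (exact {4 / 45:.4f})")
print(f"Var(Z^2) ~ {pvariance([(y - x) ** 2 for x, y in zip(xs, ys)]):.4f} "
      f"(exact {28 / 45:.4f})")
```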

2000 Paper 3 Q14

The random variable \(X\) takes only the values \(x_1\) and \(x_2\) (where \( x_1 \not= x_2 \)), and the random variable \(Y\) takes only the values \(y_1\) and \(y_2\) (where \(y_1 \not= y_2\)). Their joint distribution is given by $$ \P ( X = x_1 , Y = y_1 ) = a \ ; \ \ \P ( X = x_1 , Y = y_2 ) = q - a \ ; \ \ \P ( X = x_2 , Y = y_1 ) = p - a \ . $$ Show that if \(\E(X Y) = \E(X)\E(Y)\) then $$ (a - p q ) ( x_1 - x_2 ) ( y_1 - y_2 ) = 0 . $$ Hence show that two random variables each taking only two distinct values are independent if \(\E(X Y) = \E(X) \E(Y)\). Give a joint distribution for two random variables \(A\) and \(B\), each taking the three values \(- 1\), \(0\) and \(1\) with probability \({1 \over 3}\), which have \(\E(A B) = \E( A)\E (B)\), but which are not independent.


Solution: \begin{align*} \mathbb{P}(X = x_1) &= a + q - a = q \\ \mathbb{P}(X = x_2) &= 1 - q \\ \mathbb{P}(Y = y_1) & = a + p - a = p \\ \mathbb{P}(Y = y_2) & = 1 - p \end{align*} (The fourth joint probability is \(\P(X = x_2, Y = y_2) = 1 + a - p - q\).) \begin{align*} \mathbb{E}(X)\mathbb{E}(Y) &= \l qx_1 + (1-q)x_2 \r \l p y_1 + (1-p)y_2\r \\ &= qpx_1y_1 + q(1-p)x_1y_2 + (1-q)px_2y_1 + (1-q)(1-p)x_2y_2 \\ \mathbb{E}(XY) &= ax_1y_1 + (q-a)x_1y_2 + (p-a)x_2y_1 + (1 + a - p - q)x_2y_2 \end{align*} Therefore \(D = \mathbb{E}(XY) - \mathbb{E}(X)\mathbb{E}(Y)\) is a polynomial of degree one in each of \(x_1, x_2, y_1, y_2\). If \(x_1 = x_2\) then: \begin{align*} \mathbb{E}(X)\mathbb{E}(Y) &=x_1 \l p y_1 + (1-p)y_2\r \\ \mathbb{E}(XY) &= x_1(ay_1 + (q-a)y_2 + (p-a)y_1 + (1 + a - p - q)y_2) \\ &= x_1 (py_1 + (1-p)y_2) \end{align*} so \(D\) vanishes and \(x_1 - x_2\) is a factor; by symmetry \(y_1 - y_2\) is also a factor. Comparing the coefficients of \(x_1y_1\) gives the remaining factor \(a - pq\), completing the factorisation \(D = (a - pq)(x_1 - x_2)(y_1 - y_2)\). Any pair of random variables each taking two distinct values has a joint distribution of the form above for some \(a, p, q\). If \(\E(XY) = \E(X)\E(Y)\), then since \(x_1 \neq x_2\) and \(y_1 \neq y_2\), the factorisation forces \(a = pq\). But if \(a = pq\), then \(\mathbb{P}(X = x_1, Y = y_1) = \mathbb{P}(X = x_1)\mathbb{P}(Y = y_1)\), and the other three relations follow similarly, so \(X\) and \(Y\) are independent. For the final part, consider \begin{align*} \mathbb{P}(A = -1, B = 1) &= \frac{1}{6} \\ \mathbb{P}(A = -1, B = -1) &= \frac{1}{6} \\ \mathbb{P}(A = 0, B = 0) &= \frac{1}{3} \\ \mathbb{P}(A = 1, B = -1) &= \frac{1}{6} \\ \mathbb{P}(A = 1, B = 1) &= \frac{1}{6} \end{align*} Then \(A\) and \(B\) each take the values \(-1\), \(0\) and \(1\) with probability \(\frac13\), and \(\E(AB) = \frac16(-1+1-1+1) = 0 = \E(A)\E(B)\), but \(\mathbb{P}(A = 0, B = 0) = \frac13 \neq \frac19 = \mathbb{P}(A = 0)\mathbb{P}(B = 0)\), so \(A\) and \(B\) are not independent.
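The counterexample can be verified mechanically; the sketch below (our own) checks both the product-moment identity and the failure of independence:

```python
from fractions import Fraction

# The joint distribution given in the solution.
joint = {(-1, 1): Fraction(1, 6), (-1, -1): Fraction(1, 6),
         (0, 0): Fraction(1, 3),
         (1, -1): Fraction(1, 6), (1, 1): Fraction(1, 6)}

e_ab = sum(p * a * b for (a, b), p in joint.items())
pa = {a: sum(p for (x, _), p in joint.items() if x == a) for a in (-1, 0, 1)}
pb = {b: sum(p for (_, y), p in joint.items() if y == b) for b in (-1, 0, 1)}
e_a = sum(a * p for a, p in pa.items())
e_b = sum(b * p for b, p in pb.items())
print(f"E(AB) = {e_ab}, E(A)E(B) = {e_a * e_b}")          # both 0
print(f"P(A=0, B=0) = {joint[(0, 0)]}, "
      f"P(A=0)P(B=0) = {pa[0] * pb[0]}")                  # 1/3 vs 1/9
```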