Problems


2021 Paper 3 Q12

  1. In a game, each member of a team of \(n\) players rolls a fair six-sided die. The total score of the team is the number of pairs of players rolling the same number. For example, if \(7\) players roll \(3, 3, 3, 3, 6, 6, 2\) the total score is \(7\), as six different pairs of players both roll \(3\) and one pair of players both roll \(6\). Let \(X_{ij}\), for \(1 \leqslant i < j \leqslant n\), be the random variable that takes the value \(1\) if players \(i\) and \(j\) roll the same number and the value \(0\) otherwise. Show that \(X_{12}\) is independent of \(X_{23}\). Hence find the mean and variance of the team's total score.
  2. Show that, if \(Y_i\), for \(1 \leqslant i \leqslant m\), are random variables with mean zero, then \[ \mathrm{Var}(Y_1 + Y_2 + \cdots + Y_m) = \sum_{i=1}^{m} \mathrm{E}(Y_i^2) + 2\sum_{i=1}^{m-1}\sum_{j=i+1}^{m} \mathrm{E}(Y_i Y_j). \]
  3. In a different game, each member of a team of \(n\) players rolls a fair six-sided die. The total score of the team is the number of pairs of players rolling the same even number minus the number of pairs of players rolling the same odd number. For example, if \(7\) players roll \(3, 3, 3, 3, 6, 6, 2\) the total score is \(-5\). Let \(Z_{ij}\), for \(1 \leqslant i < j \leqslant n\), be the random variable that takes the value \(1\) if players \(i\) and \(j\) roll the same even number, the value \(-1\) if players \(i\) and \(j\) roll the same odd number and the value \(0\) otherwise. Show that \(Z_{12}\) is not independent of \(Z_{23}\). Find the mean of the team's total score and show that the variance of the team's total score is \(\dfrac{1}{36}n(n^2 - 1)\).


Solution:

  1. First note that \(\mathbb{P}(X_{ij} = 1) = \frac16\): whatever player \(i\) rolls, player \(j\) matches it with probability \(\frac16\). \begin{align*} && \mathbb{P}(X_{12} = 1, X_{23} = 1) &= \mathbb{P}(1, 2\text{ and }3\text{ all roll the same})\\ &&&= \frac{6}{6^3}= \frac1{6^2} \\ &&&= \mathbb{P}(X_{12} = 1)\mathbb{P}(X_{23} = 1) \\ && \mathbb{P}(X_{12} = 1, X_{23} = 0) &= \mathbb{P}(1, 2\text{ roll the same and }3\text{ rolls differently from }2) \\ &&&= \frac{6 \cdot 1 \cdot 5}{6^3} = \frac{5}{6^2} \\ &&&= \mathbb{P}(X_{12} = 1)\mathbb{P}(X_{23} = 0) \\ && \mathbb{P}(X_{12} = 0, X_{23} = 0) &= \mathbb{P}(2\text{ rolls differently from }1\text{ and }3\text{ rolls differently from }2)\\ &&&= \frac{6 \cdot 5 \cdot 5}{6^3}= \frac{5^2}{6^2} \\ &&&= \mathbb{P}(X_{12} = 0)\mathbb{P}(X_{23} = 0) \end{align*} The remaining case, \(\mathbb{P}(X_{12} = 0, X_{23} = 1)\), follows from the second by symmetry, so \(X_{12}\) and \(X_{23}\) are independent. The total score is \(S = \sum_{i < j} X_{ij}\), so \begin{align*} && \E[S] &= \E \left [ \sum_{i < j} X_{ij} \right] \\ &&&= \sum_{i < j} \E \left [ X_{ij} \right] \\ &&&= \sum_{i < j} \frac16 \\ &&&= \binom{n}{2} \frac16 = \frac{n(n-1)}{12} \\ \\ && \var[S] &= \var \left [ \sum_{i < j} X_{ij} \right] \\ &&&= \sum_{i < j} \var \left [X_{ij} \right] \tag{pairwise ind.} \\ &&&= \binom{n}{2} \frac{5}{36} = \frac{5n(n-1)}{72} \end{align*} (Indicators sharing a player are independent by the argument above, and indicators using four distinct players clearly are, so the \(X_{ij}\) are pairwise independent; also \(\var[X_{ij}] = \frac16 - \frac1{36} = \frac{5}{36}\).)
  2. Since each \(Y_i\) has mean zero, \(Y_1 + Y_2 + \cdots + Y_m\) also has mean zero, so by linearity of expectation \begin{align*} && \mathrm{Var}(Y_1 + Y_2 + \cdots + Y_m) &= \E \left [ (Y_1 + Y_2 + \cdots + Y_m)^2 \right] \\ &&&= \E \left [ \sum_{i=1}^m Y_i^2 + 2\sum_{i=1}^{m-1}\sum_{j=i+1}^{m} Y_i Y_j \right] \\ &&&= \sum_{i=1}^{m} \E(Y_i^2) + 2\sum_{i=1}^{m-1}\sum_{j=i+1}^{m} \E(Y_i Y_j) \end{align*}
  3. Note that \(\mathbb{P}(Z_{ij} = 1)=\mathbb{P}(Z_{ij} = -1) = \frac{3}{6^2} = \frac{1}{12}\), so \(\E[Z_{ij}] = 0\), but \(\mathbb{P}(Z_{12} = 1, Z_{23} = -1) = 0 \neq \frac1{12} \cdot \frac1{12} = \mathbb{P}(Z_{12} = 1)\mathbb{P}(Z_{23} = -1)\), so \(Z_{12}\) is not independent of \(Z_{23}\). Notice that \(Z_{12}Z_{23}\) is either \(1\) or \(0\): if both factors are non-zero then players \(1\), \(2\) and \(3\) all roll the same number, so \(Z_{12} = Z_{23}\) (player \(2\) cannot roll both an even and an odd number). Hence \(\E[Z_{12}Z_{23}] = \mathbb{P}(Z_{12}Z_{23} = 1) = \mathbb{P}(1, 2, 3\text{ all roll the same}) = \frac{1}{36}\), and the same holds for any two index-pairs sharing a player. If \(i\), \(j\), \(k\), \(l\) are all distinct then \(Z_{ij}\) and \(Z_{kl}\) involve different players, so they are independent and \(\E[Z_{ij}Z_{kl}] = \E[Z_{ij}]\E[Z_{kl}] = 0\). Writing \(T = \sum_{i<j} Z_{ij}\) for the total score, \(\E[T] = \sum_{i<j} \E[Z_{ij}] = 0\). Since the \(Z_{ij}\) have mean zero, part 2 gives \begin{align*} && \var[T] &= \sum_{i<j} \E \left [ Z_{ij}^2 \right] + 2\sum \E \left[ Z_{ij}Z_{kl} \right] \\ &&&= \binom{n}{2} \frac{1}{6} + 2 \cdot \frac{n(n-1)(n-2)}{2} \cdot \frac{1}{36} + 0 \\ &&&= \frac{n(n-1)}{12} + \frac{n(n-1)(n-2)}{36} \\ &&&= \frac{n(n-1)\left[3 + (n-2)\right]}{36} \\ &&&= \frac{n(n^2-1)}{36} \end{align*} where the second sum runs over unordered pairs of distinct index-pairs: \(\E[Z_{ij}^2] = \mathbb{P}(Z_{ij} \neq 0) = \frac16\), there are \(\frac{n(n-1)(n-2)}{2}\) pairs of index-pairs sharing exactly one player (choose the shared player, then the other two), each contributing \(\frac1{36}\), and disjoint index-pairs contribute \(0\). (The simulation sketch after this solution checks the means and variances from parts 1 and 3 numerically.)
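A minimal Monte Carlo sketch (not part of the original solution) that checks the formulas above; the function name `simulate`, the choice \(n = 7\) and the trial count are illustrative assumptions.

```python
# Monte Carlo sketch (illustrative only): estimate the mean and variance of the team
# scores from parts 1 and 3 and compare them with the derived formulas.
import random
from itertools import combinations

def simulate(n, signed, trials=100_000):
    """Return the sample mean and variance of the team score for n players."""
    scores = []
    for _ in range(trials):
        rolls = [random.randint(1, 6) for _ in range(n)]
        score = 0
        for i, j in combinations(range(n), 2):
            if rolls[i] == rolls[j]:
                score += (1 if rolls[i] % 2 == 0 else -1) if signed else 1
        scores.append(score)
    mean = sum(scores) / trials
    var = sum((s - mean) ** 2 for s in scores) / trials
    return mean, var

n = 7
print(simulate(n, signed=False), (n * (n - 1) / 12, 5 * n * (n - 1) / 72))  # part 1
print(simulate(n, signed=True), (0, n * (n ** 2 - 1) / 36))                 # part 3
```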

2013 Paper 3 Q12

A list consists only of letters \(A\) and \(B\) arranged in a row. In the list, there are \(a\) letter \(A\)s and \(b\) letter \(B\)s, where \(a\ge2\) and \(b\ge2\), and \(a+b=n\). Each possible ordering of the letters is equally probable. The random variable \(X_1\) is defined by \[ X_1 = \begin{cases} 1 & \text{if the first letter in the row is \(A\)};\\ 0 & \text{otherwise.} \end{cases} \] The random variables \(X_k\) (\(2 \le k \le n\)) are defined by \[ X_k = \begin{cases} 1 & \text{if the \((k-1)\)th letter is \(B\) and the \(k\)th is \(A\)};\\ 0 & \text{otherwise.} \end{cases} \] The random variable \(S\) is defined by \(S = \sum\limits_ {i=1}^n X_i\,\).

  1. Find expressions for \(\E(X_i)\), distinguishing between the cases \(i=1\) and \(i\ne1\), and show that \(\E(S)= \dfrac{a(b+1)}n\,\).
  2. Show that:
    1. for \(j\ge3\), \(\E(X_1X_j) = \dfrac{a(a-1)b}{n(n-1)(n-2)}\,\);
    2. \[ \sum\limits_{i=2}^{n-2} \bigg( \sum\limits_{j=i+2}^n \E(X_iX_j)\bigg) = \dfrac{a(a-1)b(b-1)}{2n(n-1)}\,\]
    3. \(\var(S) = \dfrac {a(a-1)b(b+1)}{n^2(n-1)}\,\).


Solution:

  1. Notice that \(\E[X_1] = \mathbb{P}(\text{first letter is }A) = \frac{a}{n}\). For \(i > 1\), \(X_i = 1\) requires the \((i-1)\)th letter to be \(B\) and the \(i\)th to be \(A\), which happens with probability \(\frac{b}{n} \cdot \frac{a}{n-1}\), so \(\E[X_i] = \frac{ab}{n(n-1)}\). Hence \begin{align*} && \E[S] &= \E[X_1] + \sum_{i=2}^n \E[X_i] \\ &&&= \frac{a}{n} + (n-1) \frac{ab}{n(n-1)} \\ &&&= \frac{a}{n} + \frac{ab}{n} = \frac{a(b+1)}{n} \end{align*}
    1. For \(j\ge3\) the positions \(1\), \(j-1\) and \(j\) are distinct, and the order in which we reveal them does not matter: the first letter is an \(A\) with probability \(\frac{a}{n}\); given this, the \((j-1)\)th is a \(B\) with probability \(\frac{b}{n-1}\); given both, the \(j\)th is one of the remaining \(a-1\) \(A\)s among the \(n-2\) remaining letters, with probability \(\frac{a-1}{n-2}\). Therefore \(\E[X_1X_j] = \mathbb{P}(X_1X_j = 1) = \frac{a}{n} \cdot \frac{b}{n-1} \cdot \frac{a-1}{n-2} = \frac{a(a-1)b}{n(n-1)(n-2)}\).
    2. For \(i \ge 2\) and \(j \ge i+2\) the pairs of positions \(\{i-1, i\}\) and \(\{j-1, j\}\) do not overlap, so \(\E(X_iX_j) = \frac{a}{n} \cdot \frac{b}{n-1} \cdot \frac{a-1}{n-2} \cdot \frac{b-1}{n-3} = \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)}\), and so \begin{align*} && \sum\limits_{i=2}^{n-2} \bigg( \sum\limits_{j=i+2}^n \E(X_iX_j)\bigg) &= \sum\limits_{i=2}^{n-2} \bigg( \sum\limits_{j=i+2}^n \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)}\bigg) \\ &&&= \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)}\sum\limits_{i=2}^{n-2} \bigg( \sum\limits_{j=i+2}^n 1\bigg) \\ &&&= \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)}\sum\limits_{i=2}^{n-2} \big(n-(i+1)\big) \\ &&&= \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)} \left ((n-1)(n-3)-\frac{(n-2)(n-1)}{2}+1 \right) \\ &&&= \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)} \left ( \frac{2n^2-8n+6-n^2+3n-2+2}{2}\right) \\ &&&= \frac{a(a-1)b(b-1)}{n(n-1)(n-2)(n-3)} \cdot \frac{(n-2)(n-3)}{2} \\ &&&= \frac{a(a-1)b(b-1)}{2n(n-1)} \end{align*}
    3. We also need the remaining cross terms, those where the pairs of positions overlap. Here \(X_iX_{i+1}=0\), since \(X_i = 1\) requires the \(i\)th letter to be \(A\) while \(X_{i+1} = 1\) requires the \(i\)th letter to be \(B\); similarly \(X_1X_2 = 0\). So all the cross terms are accounted for. Since \(X_i^2 = X_i\), we also have \(\E\left[\sum_i X_i^2\right] = \E[S] = \frac{a(b+1)}{n}\). Therefore \begin{align*} && \E[S^2] &= \E \left [\sum_i X_i^2 + 2\sum_{i < j} X_i X_j \right] \\ &&&= \frac{a(b+1)}{n} +2(n-2)\frac{a(a-1)b}{n(n-1)(n-2)}+ 2 \cdot \frac{a(a-1)b(b-1)}{2n(n-1)} \\ &&&= \frac{a(b+1)}{n} +\frac{2a(a-1)b}{n(n-1)} + \frac{a(a-1)b(b-1)}{n(n-1)} \\ &&&= \frac{a(b+1)}{n} +\frac{a(a-1)b(b+1)}{n(n-1)} \\ && \var[S] &= \E[S^2] - \left ( \E[S] \right)^2 \\ &&&= \frac{a(b+1)}{n} + \frac{a(a-1)b(b+1)}{n(n-1)} - \frac{a^2(b+1)^2}{n^2} \\ &&&= \frac{a(b+1) \left (n(n-1) + (a-1)b n -a(b+1)(n-1) \right)}{n^2(n-1)} \\ &&&= \frac{a(b+1)(n-a)(n-b-1)}{n^2(n-1)} \\ &&&= \frac{a(a-1)b(b+1)}{n^2(n-1)} \end{align*} using \(n - a = b\) and \(n - b - 1 = a - 1\) in the last step. (The sketch below checks \(\E(S)\) and \(\var(S)\) numerically for small \(a\) and \(b\).)
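A minimal sketch (not part of the original solution) that checks \(\E(S)\) and \(\var(S)\) by enumerating every ordering of a small list; the values \(a = 3\), \(b = 4\) are arbitrary illustrative choices.

```python
# Exhaustive check (illustrative only): enumerate all orderings of a A's and b B's,
# compute S for each, and compare the mean and variance with the closed forms.
from itertools import permutations

def stats(a, b):
    n = a + b
    letters = "A" * a + "B" * b
    orderings = set(permutations(letters))      # each distinct ordering equally likely
    values = []
    for row in orderings:
        s = (row[0] == "A") + sum(row[k - 1] == "B" and row[k] == "A" for k in range(1, n))
        values.append(s)
    mean = sum(values) / len(values)
    var = sum(v * v for v in values) / len(values) - mean ** 2
    return mean, var

a, b = 3, 4
n = a + b
print(stats(a, b))
print(a * (b + 1) / n, a * (a - 1) * b * (b + 1) / (n ** 2 * (n - 1)))
```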

2008 Paper 3 Q13

A box contains \(n\) pieces of string, each of which has two ends. I select two string ends at random and tie them together. This creates either a ring (if the two ends are from the same string) or a longer piece of string. I repeat the process of tying together string ends chosen at random until there are none left. Find the expected number of rings created at the first step and hence obtain an expression for the expected number of rings created by the end of the process. Find also an expression for the variance of the number of rings created. Given that \(\ln 20 \approx 3\) and that \(1+ \frac12 + \cdots + \frac 1n \approx \ln n\) for large \(n\), determine approximately the expected number of rings created in the case \(n=40\,000\).


Solution: Let \(X_i\) be the indicator variable of the event that a ring is formed at the step when \(i\) pieces of string remain in the box. At that step there are \(2i\) free ends; whichever end is picked first, exactly one of the remaining \(2i-1\) ends completes a ring, so \(\mathbb{P}(X_i = 1) = \frac{1}{2i-1}\), independently of what happened at earlier steps. In particular the expected number of rings created at the first step is \(\frac{1}{2n-1}\). Let \(Y_n\) be the total number of rings created starting from \(n\) pieces of string. Then \begin{align*} && Y_n &= X_n + Y_{n-1} \\ \Rightarrow && Y_n &= X_n + X_{n-1} + \cdots + X_1 \\ \Rightarrow && \E[Y_n] &= \frac{1}{2n-1} + \frac{1}{2n-3} + \cdots + \frac{1}{1} \\ && \var[Y_n] &= \sum_{i=1}^n \frac{1}{2i-1} \cdot \frac{2i-2}{2i-1} \tag{independent indicators} \\ &&&= 2\sum_{i=1}^n \frac{i-1}{(2i-1)^2} \end{align*} For the approximation, \begin{align*} && \E[Y_{n}] &= 1 + \frac13 + \cdots + \frac{1}{2n-1} \\ &&&= \left(1 + \frac12 + \cdots + \frac1{2n}\right) - \frac12\left (1 + \frac12 + \cdots + \frac1n \right) \\ &&&\approx \ln 2n -\frac12 \ln n \\ &&&= \ln\left(2 \sqrt{n}\right) \\ \\ && \E[Y_{40\,000}] &= \ln\left(2 \sqrt{40\,000}\right) \\ &&&= \ln 400 \\ &&&= 2 \ln 20 \approx 6 \end{align*}
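A minimal simulation sketch (not part of the original solution) of the actual tying process, used to check \(\E[Y_n] = \sum_{k=1}^{n} \frac{1}{2k-1}\); the value \(n = 40\) and the trial count are illustrative assumptions.

```python
# Simulation sketch (illustrative only): tie random pairs of string ends until none
# remain, count the rings, and compare with the exact expectation.
import random

def rings(n):
    piece = [e // 2 for e in range(2 * n)]   # ends 2k and 2k+1 start on string k
    ends = list(range(2 * n))
    count = 0
    while ends:
        random.shuffle(ends)
        a, b = ends.pop(), ends.pop()        # two string ends chosen at random
        if piece[a] == piece[b]:
            count += 1                       # both ends of the same piece: a ring
        else:
            old = piece[b]                   # otherwise relabel b's piece as a's piece
            for e in ends:
                if piece[e] == old:
                    piece[e] = piece[a]
    return count

n, trials = 40, 5_000
estimate = sum(rings(n) for _ in range(trials)) / trials
exact = sum(1 / (2 * k - 1) for k in range(1, n + 1))
print(estimate, exact)
```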

2004 Paper 3 Q12

A team of \(m\) players, numbered from \(1\) to \(m\), puts on a set of \(m\) shirts, similarly numbered from \(1\) to \(m\). The players change in a hurry, so that the shirts are assigned to them randomly, one to each player. Let \(C_i\) be the random variable that takes the value \(1\) if player \(i\) is wearing shirt \(i\), and \(0\) otherwise. Show that \(\mathrm{E}\left(C_1\right)={1 \over m}\) and find \(\var \left(C_1\right)\) and \(\mathrm{Cov}\left(C_1 \, , \; C_2 \right) \,\). Let \(\, N = C_1 + C_2 + \cdots + C_m \,\) be the random variable whose value is the number of players who are wearing the correct shirt. Show that \(\mathrm{E}\left(N\right)= \var \left(N\right) = 1 \,\). Explain why a Normal approximation to \(N\) is not likely to be appropriate for any \(m\), but that a Poisson approximation might be reasonable. In the case \(m = 4\), find, by listing equally likely possibilities or otherwise, the probability that no player is wearing the correct shirt and verify that an appropriate Poisson approximation to \(N\) gives this probability with a relative error of about \(2\%\). [Use \(\e \approx 2\frac{72}{100} \,\).]


Solution: There are \(m!\) equally likely ways of assigning the shirts, and in \((m-1)!\) of them player \(1\) gets their own shirt, i.e. \(\mathbb{E}(C_1) = \mathbb{P}(\text{player }1\text{ gets own shirt}) = \frac{(m-1)!}{m!} = \frac{1}{m}\). Since \(C_1^2 = C_1\), \(\var(C_1) = \mathbb{E}(C_1^2) - [\mathbb{E}(C_1)]^2 = \frac{1}{m} - \frac{1}{m^2} = \frac{m-1}{m^2}\). There are \((m-2)!\) assignments in which players \(1\) and \(2\) both get their own shirts, therefore \(\textrm{Cov}(C_1,C_2) = \mathbb{E}(C_1C_2) - \mathbb{E}(C_1)\mathbb{E}(C_2) = \frac{(m-2)!}{m!} - \frac{1}{m^2} = \frac{1}{m(m-1)} - \frac{1}{m^2} = \frac{m-(m-1)}{m^2(m-1)} = \frac{1}{m^2(m-1)}\). \begin{align*} \mathbb{E}(N) &= \mathbb{E}(C_1 + C_2 + \cdots + C_m) \\ &= \mathbb{E}(C_1) + \mathbb{E}(C_2) + \cdots + \mathbb{E}(C_m) \\ &= \frac{1}{m} + \frac{1}{m} +\cdots+ \frac1m \\ &= 1 \\ \\ \var(N) &= \sum_{r=1}^m \var(C_r) + 2\sum_{r=1}^{m-1} \sum_{s=r+1}^{m} \textrm{Cov}(C_r,C_s) \\ &= m \cdot \frac{m-1}{m^2} + 2 \cdot \frac{m(m-1)}{2} \cdot \frac{1}{m^2(m-1)} \\ &=\frac{m-1}{m} + \frac{1}{m} \\ &= 1 \end{align*} A Normal approximation is not likely to be appropriate for any \(m\): the mean and variance of \(N\) are \(1\) however large \(m\) is, so \(N\) stays concentrated on small non-negative integers, whereas a Normal distribution with mean \(1\) and variance \(1\) gives, for example, an appreciable probability to a negative number of correct shirts. A Poisson approximation is more reasonable: \(N\) counts many events (matches) that each have small probability \(\frac1m\) and are only weakly dependent (the covariances are of order \(\frac1{m^3}\)), and a \(\mathrm{Po}(1)\) distribution has the same mean and variance as \(N\). For \(m = 4\), writing the shirts worn by players \(1, 2, 3, 4\) as a word in \(A, B, C, D\), the orderings with no player in the correct shirt are \[ BADC,\ BCDA,\ BDAC,\ CADB,\ CDAB,\ CDBA,\ DABC,\ DCAB,\ DCBA, \] i.e. \(9\) of the \(24\) equally likely orderings, so the probability is \(\frac{9}{24}\). The \(\mathrm{Po}(1)\) approximation gives \(\mathbb{P}(N=0) \approx e^{-1}\), a relative error of \begin{align*} \frac{e^{-1}-\frac{9}{24}}{\frac9{24}} &\approx \frac{\frac{100}{272} - \frac{9}{24}}{\frac9{24}} \\ &= -\frac{1}{51} \\ &\approx -2\% \end{align*}
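A short enumeration sketch (not part of the original solution) checking the \(m = 4\) probability and the quoted relative error of the Poisson approximation; only standard-library functions are used.

```python
# Exact check (illustrative only): enumerate the 24 assignments of 4 shirts, count the
# derangements, and compare 9/24 with the Po(1) value e^(-1).
from itertools import permutations
from math import exp

m = 4
assignments = list(permutations(range(m)))
no_match = sum(all(shirt != player for player, shirt in enumerate(p)) for p in assignments)
p_exact = no_match / len(assignments)                        # 9/24
p_poisson = exp(-1)                                          # Po(1) gives P(N = 0) = e^(-1)
print(p_exact, p_poisson, (p_poisson - p_exact) / p_exact)   # relative error ~ -2%
```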

1999 Paper 1 Q13

Bar magnets are placed randomly end-to-end in a straight line. If adjacent magnets have ends of opposite polarities facing each other, they join together to form a single unit. If they have ends of the same polarity facing each other, they stand apart. Find the expectation and variance of the number of separate units in terms of the total number \(N\) of magnets.


Solution: There are \(N-1\) junctions between adjacent magnets. Whatever the orientation of one magnet, the next magnet presents an end of the same polarity (a gap) or of the opposite polarity (a join) with probability \(\frac12\) each, independently of the other junctions. Therefore the number of gaps is \(X \sim \mathrm{Binomial}(N-1, \frac12)\). The number of separate units is one more than the number of gaps, \(U = X + 1\), so \begin{align*} \mathbb{E}(U) &= \frac{N-1}{2} + 1 = \frac{N+1}{2} \\ \textrm{Var}(U) &= \textrm{Var}(X) = \frac{N-1}{4} \end{align*}
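A minimal simulation sketch (not part of the original solution) checking \(\E(U) = \frac{N+1}{2}\) and \(\var(U) = \frac{N-1}{4}\); the value \(N = 10\), the trial count and the polarity convention in the comment are illustrative assumptions.

```python
# Simulation sketch (illustrative only): orient N magnets at random and count the
# separate units, then compare the sample mean and variance with the formulas.
import random

def units(N):
    orient = [random.randint(0, 1) for _ in range(N)]
    # With this convention, adjacent magnets with the same orientation present
    # opposite poles and join; differing orientations leave a gap.
    gaps = sum(orient[i] != orient[i + 1] for i in range(N - 1))
    return gaps + 1

N, trials = 10, 100_000
vals = [units(N) for _ in range(trials)]
mean = sum(vals) / trials
var = sum((v - mean) ** 2 for v in vals) / trials
print(mean, var, (N + 1) / 2, (N - 1) / 4)
```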

1994 Paper 1 Q13

I have a bag containing \(M\) tokens, \(m\) of which are red. I remove \(n\) tokens from the bag at random without replacement. Let \[ X_{i}=\begin{cases} 1 & \mbox{ if the ith token I remove is red;}\\ 0 & \mbox{ otherwise.} \end{cases} \] Let \(X\) be the total number of red tokens I remove.

  1. Explain briefly why \(X=X_{1}+X_{2}+\cdots+X_{n}.\)
  2. Find the expectation \(\mathrm{E(}X_{i}).\)
  3. Show that \(\mathrm{E}(X)=mn/M\).
  4. Find \(\mathrm{P}(X=k)\) for \(k=0,1,2,\ldots,n\).
  5. Deduce that \[ \sum_{k=1}^{n}k\binom{m}{k}\binom{M-m}{n-k}=m\binom{M-1}{n-1}. \]


Solution:

  1. \(X\) counts the total number of red tokens removed. Each of the \(n\) removals contributes \(1\) to the sum \(X_{1}+X_{2}+\cdots+X_{n}\) if that token is red and \(0\) otherwise, so the sum also counts each red token removed exactly once. Therefore \(X=X_{1}+X_{2}+\cdots+X_{n}\).
  2. \(\E[X_i] = \mathbb{P}(i\text{th token removed is red}) = \frac{m}{M}\), since by symmetry the \(i\)th token removed is equally likely to be any of the \(M\) tokens.
  3. Therefore, by linearity of expectation, \(\E[X] = \E[X_1 + \cdots + X_n] = n \cdot \frac{m}{M} = \frac{mn}{M}\).
  4. \(\mathbb{P}(X=k) = \binom{m}{k}\binom{M-m}{n-k}\Big/\binom{M}{n}\), since the numerator is the number of ways to choose \(k\) of the \(m\) red tokens and \(n-k\) of the \(M-m\) non-red tokens, and the denominator is the total number of ways to choose the \(n\) tokens removed.
  5. \(\,\) \begin{align*} && \frac{mn}{M} &= \E[X] \\ &&&= \sum_{k=1}^n k \mathbb{P}(X=k) \\ &&&= \sum_{k=1}^n k \binom{m}{k}\binom{M-m}{n-k}/\binom{M}{n} \\ \Rightarrow && \sum_{k=1}^n k \binom{m}{k}\binom{M-m}{n-k} &= m \frac{n}{M} \binom{M}{n} = m \binom{M-1}{n-1} \end{align*}
This question is a nice example of how to find the mean of the hypergeometric distribution (the distribution of \(X\) in part 4).
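A small sketch (not part of the original solution) checking the identity from part 5 by direct computation for a few arbitrary values of \(M\), \(m\) and \(n\).

```python
# Direct check (illustrative only) of sum_k k*C(m,k)*C(M-m,n-k) = m*C(M-1,n-1).
from math import comb

for M, m, n in [(10, 4, 3), (12, 5, 7), (9, 9, 4)]:
    lhs = sum(k * comb(m, k) * comb(M - m, n - k) for k in range(1, n + 1))
    rhs = m * comb(M - 1, n - 1)
    print(M, m, n, lhs, rhs, lhs == rhs)
```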

1988 Paper 3 Q16

Balls are chosen at random without replacement from an urn originally containing \(m\) red balls and \(M-m\) green balls. Find the probability that exactly \(k\) red balls will be chosen in \(n\) choices \((0\leqslant k\leqslant m,0\leqslant n\leqslant M).\) The random variables \(X_{i}\) \((i=1,2,\ldots,n)\) are defined for \(n\leqslant M\) by \[ X_{i}=\begin{cases} 0 & \mbox{ if the \(i\)th ball chosen is green}\\ 1 & \mbox{ if the \(i\)th ball chosen is red. } \end{cases} \] Show that

  1. \(\mathrm{P}(X_{i}=1)=\dfrac{m}{M}.\)
  2. \(\mathrm{P}(X_{i}=1\mbox{ and }X_{j}=1)=\dfrac{m(m-1)}{M(M-1)}\), for \(i\neq j\).
Find the mean and variance of the random variable \(X\) defined by \[ X=\sum_{i=1}^{n}X_{i}. \]


Solution: There are \(\displaystyle \binom{m}{k} \binom{M-m}{n-k}\) ways to choose \(k\) red and \(n-k\) green balls, out of a total of \(\displaystyle \binom{M}{n}\) ways to choose \(n\) balls. Therefore the probability is: \[ \mathbb{P}(\text{exactly }k\text{ red balls in }n\text{ choices}) = \frac{\binom{m}{k} \binom{M-m}{n-k}}{ \binom{M}{n}}\]

  1. Note that there is nothing special about the \(i\)th ball chosen. (We could consider all sequences of draws and look at the \(i\)th ball, or apply a permutation to every sequence to make the \(i\)th ball the first ball; either way we see identically distributed sequences.) Therefore \(\mathbb{P}(X_i = 1) = \mathbb{P}(X_1 = 1) = \frac{m}{M}\).
  2. Similarly we could apply a permutation to all sequences which takes the \(i\)th ball to the first ball and the \(j\)th ball to the second ball, therefore: \begin{align*} \mathbb{P}(X_i = 1, X_j = 1) &= \mathbb{P}(X_1 = 1, X_2 = 1) \\ &= \mathbb{P}(X_1 = 1) \cdot \mathbb{P}(X_2 = 1 | X_1 = 1) \\ &= \frac{m}{M} \cdot \frac{m-1}{M-1} \\ &= \frac{m(m-1)}{M(M-1)} \end{align*}
So: \begin{align*} \mathbb{E}(X) &= \mathbb{E}\left(\sum_{i=1}^{n}X_{i}\right) \\ &= \sum_{i=1}^{n}\mathbb{E}(X_{i}) \\ &= \sum_{i=1}^{n} 1\cdot\mathbb{P}(X_i = 1) \\ &= \sum_{i=1}^{n} \frac{m}{M} \\ &= \frac{mn}{M} \end{align*} and \begin{align*} \mathbb{E}(X^2) &= \mathbb{E}\left[\left(\sum_{i=1}^{n}X_{i} \right)^2 \right] \\ &= \mathbb{E}\left[\sum_{i=1}^n X_i^2 + 2 \sum_{i < j} X_i X_j \right] \\ &= \sum_{i=1}^n \mathbb{E}(X_i^2) + 2 \sum_{i < j} \mathbb{E}(X_i X_j) \\ &= \frac{nm}{M} + n(n-1) \frac{m(m-1)}{M(M-1)} \\ \textrm{Var}(X) &= \mathbb{E}(X^2) - (\mathbb{E}(X))^2 \\ &= \frac{nm}{M} + n(n-1) \frac{m(m-1)}{M(M-1)} - \frac{n^2m^2}{M^2} \\ &= \frac{nm}{M} \left (1-\frac{nm}{M}+(n-1)\frac{m-1}{M-1} \right) \\ &= \frac{nm}{M} \left ( \frac{M(M-1)-(M-1)nm+(n-1)(m-1)M}{M(M-1)} \right) \\ &= \frac{nm}{M} \left ( \frac{M^2 - mM - nM + nm}{M(M-1)} \right) \\ &= \frac{nm}{M} \cdot \frac{(M-m)(M-n)}{M(M-1)} \\ &= n \cdot \frac{m}{M} \cdot \frac{M-m}{M} \cdot \frac{M-n}{M-1} \end{align*} Note: this is a very nice way of deriving the mean and variance of the hypergeometric distribution.
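A small sketch (not part of the original solution): compute the mean and variance of \(X\) directly from the probabilities \(\mathbb{P}(X = k)\) above and compare them with the closed forms; the values of \(M\), \(m\) and \(n\) are arbitrary.

```python
# Direct check (illustrative only): moments of the hypergeometric pmf versus the
# closed-form mean nm/M and variance n*(m/M)*((M-m)/M)*((M-n)/(M-1)).
from math import comb

def moments(M, m, n):
    probs = [comb(m, k) * comb(M - m, n - k) / comb(M, n) for k in range(n + 1)]
    mean = sum(k * p for k, p in enumerate(probs))
    var = sum(k * k * p for k, p in enumerate(probs)) - mean ** 2
    return mean, var

M, m, n = 20, 7, 5
print(moments(M, m, n))
print(n * m / M, n * (m / M) * ((M - m) / M) * ((M - n) / (M - 1)))
```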