Problems


2025 Paper 2 Q11

  1. By considering the sum of a geometric series, or otherwise, show that \[\sum_{r=1}^{\infty} rx^{r-1} = \frac{1}{(1-x)^2} \quad \text{for } |x| < 1.\]
  2. Ali plays a game with a fair \(2k\)-sided die. He rolls the die until the first \(2k\) appears. Ali wins if all the numbers he rolls are even.
    1. Find the probability that Ali wins the game. If Ali wins the game, he earns £1 for each roll, including the final one. If he loses, he earns nothing.
    2. Find Ali's expected earnings from playing the game.
  3. Find a simplified expression for \[1 + 2\binom{n}{1}x + 3\binom{n}{2}x^2 + \ldots + (n+1)x^n,\] where \(n\) is a positive integer.
  4. Zen plays a different game with a fair \(2k\)-sided die. She rolls the die until the first \(2k\) appears, and wins if the numbers rolled are strictly increasing in size. For example, if \(k = 3\), she wins if she rolls 2, 6 or 1, 4, 5, 6, but not if she rolls 1, 4, 2, 6 or 1, 3, 3, 6. If Zen wins the game, she earns £1 for each roll, including the final one. If she loses, she earns nothing. Find Zen's expected earnings from playing the game.
  5. Using the approximation \[\left(1 + \frac{1}{n}\right)^n \approx e \quad \text{for large } n,\] show that, when \(k\) is large, Zen's expected earnings are a little over 35\% more than Ali's expected earnings.


Solution:

  1. Note that, \begin{align*} && \sum_{r = 0}^\infty x^r &= \frac{1}{1-x} && |x| < 1\\ \underbrace{\Rightarrow}_{\frac{\d}{\d x}} && \sum_{r = 0}^\infty rx^{r-1} &= \frac{1}{(1-x)^2} && |x| < 1\\ && \sum_{r = 1}^\infty rx^{r-1} &= \frac{1}{(1-x)^2} && |x| < 1\\ \end{align*}
    1. \begin{align*} && \mathbb{P}(\text{Ali wins in }s\text{ rounds}) &= \left ( \frac{k-1}{2k} \right)^{s-1} \frac{1}{2k} \\ \Rightarrow && \mathbb{P}(\text{Ali wins}) &= \sum_{s=1}^\infty \mathbb{P}(\text{Ali wins in }s\text{ rounds}) \\ &&&=\sum_{s=1}^\infty \left ( \frac{k-1}{2k} \right)^{s-1} \frac{1}{2k} \\ &&&= \frac{1}{2k} \sum_{s=0}^\infty \left ( \frac{k-1}{2k} \right)^{s} \\ &&&= \frac{1}{2k} \frac{1}{1 - \frac{k-1}{2k}} \\ &&&= \frac{1}{2k - (k-1)} \\ &&&= \frac{1}{k+1} \end{align*}
    2. \begin{align*} \mathbb{E}(\text{Ali score}) &= \sum_{s=1}^{\infty} s \mathbb{P}(\text{Ali wins in }s\text{ rounds}) \\ &= \sum_{s=1}^{\infty} s \left ( \frac{k-1}{2k} \right)^{s-1} \frac{1}{2k} \\ &= \frac{1}{2k} \frac{1}{\left (1 - \frac{k-1}{2k} \right)^2} \\ &= \frac{2k}{(k+1)^2} \end{align*}
  2. \begin{align*} && (1+x)^{n} &= \sum_{k=0}^n \binom{n}{k} x^k \\ \Rightarrow && x(1+x)^n &= \sum_{k=0}^n \binom{n}{k} x^{k+1} \\ \Rightarrow && (1+x)^n + nx(1+x)^{n-1} &= \sum_{k=0}^n (k+1)\binom{n}{k} x^k \\ \Rightarrow && (1+x)^{n-1}(1+(n+1)x) &= 1 + 2\binom{n}{1}x + 3\binom{n}{2}x^2 + \ldots + (n+1)x^n \end{align*}
  3. \begin{align*} \mathbb{E}(\text{Zen score}) &= \sum_{s=1}^{2k} s \mathbb{P} \left ( \text{Zen gets }s\text{ numbers in increasing order ending with }2k \right) \\ &= \sum_{s=1}^{2k} s \binom{2k-1}{s-1} \frac{1}{(2k)^s} \\ &= \frac{1}{2k}\sum_{s=0}^{2k-1} (s+1) \binom{2k-1}{s} \frac{1}{(2k)^s} \\ &= \frac{1}{2k} \left ( 1 + \frac{1}{2k} \right)^{2k-2} \left ( 1 + (2k-1+1) \frac{1}{2k} \right) \\ &= \frac{1}{k}\left ( 1 + \frac{1}{2k} \right)^{2k-2} \end{align*}
  4. Therefore as \(k \to \infty\) \begin{align*} \frac{\mathbb{E}(\text{Zen score})}{\mathbb{E}(\text{Ali score}) } &= \frac{1}{k}\left ( 1 + \frac{1}{2k} \right)^{2k-2} \big / \frac{2k}{(k+1)^2} \\ &= \frac{(k+1)^2}{2k^2} \cdot \left ( 1 + \frac{1}{2k} \right)^{2k} \cdot \left ( 1 + \frac{1}{2k} \right)^{-2} \\ &\to \frac12 e \approx 2.7/2 = 1.35 \end{align*} ie Zen's expected earnings are \(\approx 35\%\) more.
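Not part of the original solution: a quick Python sanity check, using only the closed forms derived above, that simulates Ali's game and confirms the limiting ratio \(\frac12 e\).

```python
import math
import random

def ali_expected(k):
    # closed form from the solution: 2k / (k+1)^2
    return 2 * k / (k + 1) ** 2

def zen_expected(k):
    # closed form from the solution: (1/k) * (1 + 1/(2k))^(2k-2)
    return (1 + 1 / (2 * k)) ** (2 * k - 2) / k

def ali_monte_carlo(k, trials=200_000, seed=0):
    # simulate Ali's game: roll a fair 2k-sided die until 2k appears;
    # he earns one pound per roll iff every number rolled was even
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        rolls = 0
        while True:
            rolls += 1
            x = rng.randint(1, 2 * k)
            if x == 2 * k:     # game over; every roll so far was even, Ali wins
                total += rolls
                break
            if x % 2 == 1:     # an odd number: Ali has lost, earns nothing
                break
    return total / trials
```

For \(k = 3\) the formula gives \(6/16 = 0.375\), which the simulation reproduces; for large \(k\) the ratio of the two expectations approaches \(e/2 \approx 1.359\).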

2025 Paper 2 Q12
D: 1500.0 B: 1500.0

Let \(X\) be a Poisson random variable with mean \(\lambda\) and let \(p_r = P(X = r)\), for \(r = 0, 1, 2, \ldots\). Neither \(\lambda\) nor \(\lambda + \frac{1}{2} + \sqrt{\lambda + \frac{1}{4}}\) is an integer.

  1. Show, by considering the sequence \(d_r \equiv p_r - p_{r-1}\) for \(r = 1, 2, \ldots\), that there is a unique integer \(m\) such that \(P(X = r) \leq P(X = m)\) for all \(r = 0, 1, 2, \ldots\), and that \[\lambda - 1 < m < \lambda.\]
  2. Show that the minimum value of \(d_r\) occurs at \(r = k\), where \(k\) is such that \[k < \lambda + \frac{1}{2} + \sqrt{\lambda + \frac{1}{4}} < k + 1.\]
  3. Show that the condition for the maximum value of \(d_r\) to occur at \(r = 1\) is \[1 < \lambda < 2 + \sqrt{2}.\]
  4. In the case \(\lambda = 3.36\), sketch a graph of \(p_r\) against \(r\) for \(r = 0, 1, 2, \ldots, 6, 7\).


Solution:

  1. Suppose \(d_r = p_r - p_{r-1}\) then \begin{align*} d_r &= p_r - p_{r-1} \\ &= \mathbb{P}(X = r) - \mathbb{P}(X = r-1) \\ &= e^{-\lambda} \left ( \frac{\lambda^r}{r!} - \frac{\lambda^{r-1}}{(r-1)!} \right) \\ &= e^{-\lambda} \frac{\lambda^{r-1}}{(r-1)!} \left ( \frac{\lambda}{r} - 1\right) \end{align*} Therefore \(d_r > 0 \Leftrightarrow \lambda > r\), ie \(p_r\) is strictly increasing while \(r < \lambda\) and strictly decreasing once \(r > \lambda\) (strictly, since \(\lambda\) is not an integer), so there is a unique maximum at \(m = \lfloor \lambda \rfloor\), which satisfies \(\lambda - 1 < m < \lambda\).
  2. Let \(dd_r = d_r - d_{r-1}\), so: \begin{align*} dd_r &= d_r - d_{r-1} \\ &= p_r - 2p_{r-1} + p_{r-2} \\ &= e^{-\lambda} \frac{\lambda^{r-2}}{r!} \left ( \lambda^2 - 2 \lambda r + r(r-1)\right ) \end{align*} Therefore \(dd_r < 0 \Leftrightarrow \lambda^2 - 2\lambda r +r(r-1) < 0 \Leftrightarrow r^2 -(1+2\lambda)r + \lambda^2 < 0\), but this has roots \(r = \frac{(1+2\lambda) \pm \sqrt{(1+2\lambda)^2-4\lambda^2}}{2} = \lambda + \frac12 \pm \sqrt{\lambda + \frac14}\). Therefore \(d_r\) is decreasing when \(r \in \left (\lambda + \frac12 -\sqrt{\lambda + \frac14},\lambda + \frac12 + \sqrt{\lambda + \frac14} \right)\), so the possible minima are \(d_1\) and \(d_k\), where \(k < \lambda + \frac{1}{2} + \sqrt{\lambda + \frac{1}{4}} < k + 1\). If \(\lambda \leq 2\) then \(1 \geq \lambda + \frac12 - \sqrt{\lambda + \frac14}\), so \(r = 1\) already lies in the decreasing region and \(d_k \leq d_1\). If \(\lambda > 2\) then \(d_1 = e^{-\lambda}(\lambda - 1) > 0 > d_k = e^{-\lambda} \frac{\lambda^{k-1}}{(k-1)!}\left(\frac{\lambda}{k}-1\right)\), since \(k > \lambda\). Either way the minimum value of \(d_r\) occurs at \(r = k\).
  3. If the maximum value of \(d_r\) occurs at \(r = 1\) then \(d_r\) must be decreasing at \(r = 2\), ie \(dd_2 < 0\), which gives \(\lambda^2 -4\lambda + 2< 0 \Leftrightarrow 2 - \sqrt{2} < \lambda < 2 + \sqrt{2}\). It must also be the case that \(d_1\) doesn't get beaten as \(r \to \infty\): in this limit \(d_r \to 0\), so we need \(d_1 > 0\), ie \(\lambda > 1\). Since \(2 - \sqrt{2} < 1\), the condition is \(1 < \lambda < 2 + \sqrt{2}\).
  4. TikZ diagram
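The sketch itself is not reproduced above; as a numerical stand-in (not part of the original solution), the following computes the \(\lambda = 3.36\) probabilities and confirms the mode from part 1 and the location of the minimum of \(d_r\) from part 2.

```python
import math

lam = 3.36  # the case sketched in part 4

# Poisson probabilities p_r and first differences d_r = p_r - p_{r-1}
p = [math.exp(-lam) * lam**r / math.factorial(r) for r in range(40)]
d = [p[r] - p[r - 1] for r in range(1, 40)]

mode = max(range(40), key=lambda r: p[r])          # part 1 predicts floor(lambda)
k = math.floor(lam + 0.5 + math.sqrt(lam + 0.25))  # part 2's predicted argmin of d_r
argmin = 1 + min(range(len(d)), key=lambda i: d[i])
```

Here the mode is \(3 = \lfloor 3.36 \rfloor\) and the minimum of \(d_r\) falls at \(k = 5\), since \(\lambda + \frac12 + \sqrt{\lambda + \frac14} = 5.76\).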

2025 Paper 3 Q12

  1. Show that, for any functions \(f\) and \(g\), and for any \(m \geq 0\), $$\sum_{r=1}^{m+1} f(r)\sum_{s=r-1}^m g(s) = \sum_{s=0}^m g(s)\sum_{r=1}^{s+1} f(r)$$
  2. The random variables \(X_0, X_1, X_2, \ldots\) are defined as follows:
    • \(X_0\) takes the value \(0\) with probability \(1\);
    • \(X_{n+1}\) takes the values \(0, 1, \ldots, X_n + 1\) with equal probability, for \(n = 0, 1, \ldots\)
    1. Write down \(E(X_1)\). Find \(P(X_2 = 0)\) and \(P(X_2 = 1)\) and show that \(P(X_2 = 2) = \frac{1}{6}\). Hence calculate \(E(X_2)\).
    2. For \(n \geq 1\), show that $$P(X_n = 0) = \sum_{s=0}^{n-1} \frac{P(X_{n-1} = s)}{s+2}$$ and find a similar expression for \(P(X_n = r)\), for \(r = 1, 2, \ldots, n\).
    3. Hence show that \(E(X_n) = \frac{1}{2}(1 + E(X_{n-1}))\). Find an expression for \(E(X_n)\) in terms of \(n\), for \(n = 1, 2, \ldots\)


Solution:

  1. \begin{align*} \sum_{r=1}^{m+1} \left (f(r) \sum_{s=r-1}^m g(s) \right) &= \sum_{r=1}^{m+1} \sum_{s=r-1}^m f(r)g(s) \\ &= \sum_{(r,s) \in \{(r,s) : 1 \leq r \leq m+1, 0 \leq s \leq m, s \geq r-1\}} f(r)g(s) \\ &= \sum_{(r,s) \in \{(r,s) : 0 \leq s \leq m, 1 \leq r \leq m+1, r \leq s+1\}} f(r)g(s) \\ &= \sum_{s=0}^m \sum_{r=1}^{s+1} f(r)g(s) \\ &= \sum_{s=0}^m \left ( g(s) \sum_{r=1}^{s+1} f(r) \right) \end{align*}
  2. \(X_1\) takes the values \(0, 1\) with equal probabilities (since \(X_0 = 0\)). Therefore \(\mathbb{E}(X_1) = \frac12\).
    1. \begin{align*} \mathbb{P}(X_2 = 0) &= \mathbb{P}(X_2 = 0 | X_1 = 0) \mathbb{P}(X_1 = 0) + \mathbb{P}(X_2 = 0 | X_1 = 1) \mathbb{P}(X_1 = 1) \\ &= \frac12 \cdot \frac12 + \frac13 \cdot \frac12 \\ &= \frac5{12} \\ \\ \mathbb{P}(X_2 = 1) &= \mathbb{P}(X_2 = 1 | X_1 = 0) \mathbb{P}(X_1 = 0) + \mathbb{P}(X_2 = 1 | X_1 = 1) \mathbb{P}(X_1 = 1) \\ &= \frac12 \cdot \frac12 + \frac13 \cdot \frac12 \\ &= \frac5{12} \\ \\ \mathbb{P}(X_2 = 2) &= 1 - \mathbb{P}(X_2 = 0) - \mathbb{P}(X_2 = 1) \\ &= 1 - \frac{10}{12} = \frac16 \\ \\ \mathbb{E}(X_2) &= \frac{5}{12} + 2\cdot \frac{1}{6} \\ &= \frac34 \end{align*}
    2. \begin{align*} \mathbb{P}(X_n = 0) &= \sum_{s=0}^{n-1} \mathbb{P}(X_n = 0 | X_{n-1} = s)\mathbb{P}(X_{n-1} = s) \\ &= \sum_{s=0}^{n-1} \frac{1}{s+2}\mathbb{P}(X_{n-1} = s) \\ \end{align*} as required, where \(\mathbb{P}(X_n = 0 | X_{n-1} = s) = \frac{1}{s+2}\) since if \(X_{n-1} = s\) then \(X_n\) takes the \(s+2\) values \(0, 1, \ldots, s + 1\) with equal chance. Similarly, \begin{align*} \mathbb{P}(X_n = r) &= \sum_{s=0}^{n-1} \mathbb{P}(X_n = r | X_{n-1} = s)\mathbb{P}(X_{n-1} = s) \\ &= \sum_{s=r-1}^{n-1} \frac{\mathbb{P}(X_{n-1}=s)}{s+2} \end{align*}
    3. \begin{align*} \mathbb{E}(X_n) &= \sum_{r=1}^{n} r \cdot \mathbb{P}(X_n = r) \\ &= \sum_{r=1}^{n} r \cdot \sum_{s=r-1}^{n-1} \frac{\mathbb{P}(X_{n-1}=s)}{s+2} \\ &= \sum_{s=0}^{n-1} \frac{\mathbb{P}(X_{n-1}=s)}{s+2} \sum_{r=1}^{s+1} r \\ &= \sum_{s=0}^{n-1} \frac{\mathbb{P}(X_{n-1}=s)}{s+2} \frac{(s+1)(s+2)}{2} \\ &= \frac12 \sum_{s=0}^{n-1} (s+1)\mathbb{P}(X_{n-1}=s) \\ &= \frac12 \sum_{s=0}^{n-1} s\mathbb{P}(X_{n-1}=s) + \frac12 \sum_{s=0}^{n-1} \mathbb{P}(X_{n-1}=s) \\\\ &= \frac12 \left ( \mathbb{E}(X_{n-1}) + 1 \right) \end{align*} Suppose \(\mathbb{E}(X_n) = 1-2^{-n}\); this matches for \(n = 0, 1, 2\), and \(\frac12(1 - 2^{-n} + 1) = 1-2^{-n-1}\), so the expression satisfies the recursive formula. Therefore by induction \(\mathbb{E}(X_n) = 1- 2^{-n}\).
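As an independent check (not part of the paper), the transition rule can be iterated exactly with rationals to confirm \(\mathbb{E}(X_n) = 1 - 2^{-n}\).

```python
from fractions import Fraction

def dist(n):
    # distribution of X_n, built from the rule that, given X_{n-1} = s,
    # X_n is uniform on the s+2 values {0, 1, ..., s+1}
    d = {0: Fraction(1)}          # X_0 = 0 with probability 1
    for _ in range(n):
        new = {}
        for s, ps in d.items():
            w = ps / (s + 2)
            for r in range(s + 2):
                new[r] = new.get(r, Fraction(0)) + w
        d = new
    return d

def expectation(n):
    return sum(r * p for r, p in dist(n).items())
```

This reproduces \(\mathbb{P}(X_2 = 2) = \frac16\) and \(\mathbb{E}(X_2) = \frac34\) from part (b)(i) exactly.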

2023 Paper 2 Q11

  1. \(X_1\) and \(X_2\) are both random variables which take values \(x_1, x_2, \ldots, x_n\), with probabilities \(a_1, a_2, \ldots, a_n\) and \(b_1, b_2, \ldots, b_n\) respectively. The value of random variable \(Y\) is defined to be that of \(X_1\) with probability \(p\) and that of \(X_2\) with probability \(q = 1-p\). If \(X_1\) has mean \(\mu_1\) and variance \(\sigma_1^2\), and \(X_2\) has mean \(\mu_2\) and variance \(\sigma_2^2\), find the mean of \(Y\) and show that the variance of \(Y\) is \(p\sigma_1^2 + q\sigma_2^2 + pq(\mu_1 - \mu_2)^2\).
  2. To find the value of random variable \(B\), a fair coin is tossed and a fair six-sided die is rolled. If the coin shows heads, then \(B = 1\) if the die shows a six and \(B = 0\) otherwise; if the coin shows tails, then \(B = 1\) if the die does not show a six and \(B = 0\) if it does. The value of \(Z_1\) is the sum of \(n\) independent values of \(B\), where \(n\) is large. Show that \(Z_1\) is a Binomial random variable with probability of success \(\frac{1}{2}\). Using a Normal approximation, show that the probability that \(Z_1\) is within \(10\%\) of its mean tends to \(1\) as \(n \longrightarrow \infty\).
  3. To find the value of random variable \(Z_2\), a fair coin is tossed and \(n\) fair six-sided dice are rolled, where \(n\) is large. If the coin shows heads, then the value of \(Z_2\) is the number of dice showing a six; if the coin shows tails, then the value of \(Z_2\) is the number of dice not showing a six. Use part (i) to write down the mean and variance of \(Z_2\). Explain why a Normal distribution with this mean and variance will not be a good approximation to the distribution of \(Z_2\). Show that the probability that \(Z_2\) is within \(10\%\) of its mean tends to \(0\) as \(n \longrightarrow \infty\).
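No solution is included for this question in this collection; as a purely numerical illustration (not from the paper) of how part (i) applies to part (iii), the sketch below simulates \(Z_2\) for \(n = 30\) and compares its sample variance with \(p\sigma_1^2 + q\sigma_2^2 + pq(\mu_1-\mu_2)^2\), using the Binomial means \(n/6\), \(5n/6\) and common variance \(5n/36\).

```python
import random

def mixture_variance(p, mu1, v1, mu2, v2):
    # part (i): Var(Y) = p*sigma1^2 + q*sigma2^2 + p*q*(mu1 - mu2)^2
    q = 1 - p
    return p * v1 + q * v2 + p * q * (mu1 - mu2) ** 2

def z2_sample(n, rng):
    # part (iii): a fair coin decides whether we count sixes or non-sixes
    sixes = sum(rng.randint(1, 6) == 6 for _ in range(n))
    return sixes if rng.random() < 0.5 else n - sixes

n, trials = 30, 50_000
rng = random.Random(0)
samples = [z2_sample(n, rng) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((z - mean) ** 2 for z in samples) / trials
predicted = mixture_variance(0.5, n / 6, 5 * n / 36, 5 * n / 6, 5 * n / 36)
```

The dominant \(pq(\mu_1-\mu_2)^2\) term is also why a single Normal with this mean and variance fits \(Z_2\) badly: the distribution is bimodal, with peaks near \(n/6\) and \(5n/6\).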

2020 Paper 2 Q11

A coin is tossed repeatedly. The probability that a head appears is \(p\) and the probability that a tail appears is \(q = 1 - p\).

  1. A and B play a game. The game ends if two successive heads appear, in which case A wins, or if two successive tails appear, in which case B wins. Show that the probability that the game never ends is \(0\). Given that the first toss is a head, show that the probability that A wins is \(\dfrac{p}{1 - pq}\). Find and simplify an expression for the probability that A wins.
  2. A and B play another game. The game ends if three successive heads appear, in which case A wins, or if three successive tails appear, in which case B wins. Show that \[\mathrm{P}(\text{A wins} \mid \text{the first toss is a head}) = p^2 + (q + pq)\,\mathrm{P}(\text{A wins} \mid \text{the first toss is a tail})\] and give a similar result for \(\mathrm{P}(\text{A wins} \mid \text{the first toss is a tail})\). Show that \[\mathrm{P}(\text{A wins}) = \frac{p^2(1-q^3)}{1-(1-p^2)(1-q^2)}.\]
  3. A and B play a third game. The game ends if \(a\) successive heads appear, in which case A wins, or if \(b\) successive tails appear, in which case B wins, where \(a\) and \(b\) are integers greater than \(1\). Find the probability that A wins this game. Verify that your result agrees with part (i) when \(a = b = 2\).
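This collection has no solution for this question; as a hedged numerical aid (not from the paper), the sketch below plays the part (ii) game directly and compares the observed win frequency with the closed form to be shown.

```python
import random

def p_A_wins_formula(p):
    # closed form stated in part (ii)
    q = 1 - p
    return p**2 * (1 - q**3) / (1 - (1 - p**2) * (1 - q**2))

def p_A_wins_sim(p, trials=100_000, seed=1):
    # play the game: the first run of three equal faces decides the winner
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        run, last = 0, None
        while run < 3:
            c = rng.random() < p        # True = head
            run = run + 1 if c == last else 1
            last = c
        wins += last                    # final run was heads, so A won
    return wins / trials
```

At \(p = \frac12\) the formula gives exactly \(\frac12\), as symmetry demands.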

2019 Paper 1 Q11

  1. Two people adopt the following procedure for deciding where to go for a cup of tea: either to a hotel or to a tea shop. Each person has a coin which has a probability \(p\) of showing heads and \(q\) of showing tails (where \(p+q = 1\)). In each round of the procedure, both people toss their coins once. If both coins show heads, then both people go to the hotel; if both coins show tails, then both people go to the tea shop; otherwise, they continue to the next round. This process is repeated until a decision is made. Show that the probability that they make a decision on the \(n\)th round is $$(q^2 + p^2)(2qp)^{n-1}.$$ Show also that the probability that they make a decision on or before the \(n\)th round is at least $$1 - \frac{1}{2^n}$$ whatever the value of \(p\).
  2. Three people adopt the following procedure for deciding where to go for a cup of tea: either to a hotel or to a tea shop. Each person has a coin which has a probability \(p\) of showing heads and \(q\) of showing tails (where \(p + q = 1\)). In the first round of the procedure, all three people toss their coins once. If all three coins show heads, then all three people go to the hotel; if all three coins show tails, then all three people go to the tea shop; otherwise, they continue to the next round. In the next round the two people whose coins showed the same face toss again, but the third person just turns over his or her coin. If all three coins show heads, then all three people go to the hotel; if all three coins show tails, then all three people go to the tea shop; otherwise, they go to the third round. Show that the probability that they make a decision on or before the second round is at least \(\frac{7}{16}\), whatever the value of \(p\).


Solution:

  1. The probability they don't make a decision in a round is \(qp + pq = 2qp\) (TH and HT). The probability they make a decision in a round is \(q^2+p^2\) (TT and HH). Therefore the probability they make a decision in the \(n\)th round is: \[ (q^2+p^2)(2qp)^{n-1} \] by having \(n-1\) failures and one success. The probability they make a decision on or before the \(n\)th round is \(1\) minus the probability they fail to decide in all of the first \(n\) rounds, ie \(1 - (2qp)^n\). Notice that by AM-GM \(\sqrt{qp} \leq \frac{p+q}{2} = \frac12 \Rightarrow qp \leq \frac14\), so \(1-(2pq)^n \geq 1 - \frac1{2^n}\)
  2. The probability it's decided in the first round is \(p^3 + q^3\) (HHH, TTT). The probability it's decided in the second round is \(3p^2q \cdot p^2 + 3pq^2 \cdot q^2 = 3pq(p^3+q^3)\) (HHT \(\to\) HHH and TTH \(\to\) TTT, with reorderings). Therefore the probability of making a decision in the first or second round is \((p^3+q^3)(1 + 3pq) = (1-3pq)(1+3pq) = 1 - 9(pq)^2\), using \(p^3+q^3 = (p+q)^3 - 3pq(p+q) = 1 - 3pq\). Since \(pq \leq \frac14\), this is at least \(1 - \frac{9}{16} = \frac{7}{16}\), with equality when \(p = q = \frac12\).
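The identity \((p^3+q^3)(1+3pq) = 1-9(pq)^2\), which drives the bound via \(pq \leq \frac14\), can be spot-checked numerically (not part of the original solution):

```python
def decision_prob(p):
    # probability the three-person procedure of part (ii) ends within two rounds
    q = 1 - p
    return (p**3 + q**3) * (1 + 3 * p * q)

# verify the identity and the 7/16 lower bound on a grid of p values
checks = []
for i in range(101):
    p = i / 100
    pq = p * (1 - p)
    checks.append(abs(decision_prob(p) - (1 - 9 * pq**2)) < 1e-12
                  and decision_prob(p) >= 7 / 16 - 1e-12)
```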

2019 Paper 2 Q11

  1. The three integers \(n_1\), \(n_2\) and \(n_3\) satisfy \(0 < n_1 < n_2 < n_3\) and \(n_1 + n_2 > n_3\). Find the number of ways of choosing the pair of numbers \(n_1\) and \(n_2\) in the cases \(n_3 = 9\) and \(n_3 = 10\). Given that \(n_3 = 2n + 1\), where \(n\) is a positive integer, write down an expression (which you need not prove is correct) for the number of ways of choosing the pair of numbers \(n_1\) and \(n_2\). Simplify your expression. Write down and simplify the corresponding expression when \(n_3 = 2n\), where \(n\) is a positive integer.
  2. You have \(N\) rods, of lengths \(1, 2, 3, \ldots, N\) (one rod of each length). You take the rod of length \(N\), and choose two more rods at random from the remainder, each choice of two being equally likely. Show that, in the case \(N = 2n + 1\) where \(n\) is a positive integer, the probability that these three rods can form a triangle (of non-zero area) is $$\frac{n - 1}{2n - 1}.$$ Find the corresponding probability in the case \(N = 2n\), where \(n\) is a positive integer.
  3. You have \(2M + 1\) rods, of lengths \(1, 2, 3, \ldots, 2M + 1\) (one rod of each length), where \(M\) is a positive integer. You choose three at random, each choice of three being equally likely. Show that the probability that the rods can form a triangle (of non-zero area) is $$\frac{(4M + 1)(M - 1)}{2(2M + 1)(2M - 1)}.$$ Note: \(\sum_{k=1}^{K} k^2 = \frac{1}{6}K(K + 1)(2K + 1)\).


Solution:

  1. If \(n_3 = 9\) and we are looking for \(0 < n_1 < n_2 < n_3\) we can consider values for each \(n_2\). \begin{array}{clc|c} n_2 & \text{range} & \text{count} \\ \hline 6 & 4-5 & 2 \\ 7 & 3-6 & 4 \\ 8 & 2-7 & 6 \\ \hline & & 12 \end{array} When \(n_3 = 10\) \begin{array}{clc|c} n_2 & \text{range} & \text{count} \\ \hline 6 & 5 & 1 \\ 7 & 4-6 & 3 \\ 8 & 3-7 & 5 \\ 9 & 2-8 & 7 \\ \hline & & 16 \end{array} When \(n_3 = 2n+1\) we can have \(2 + 4 + \cdots + 2n-2 = n(n-1)\) When \(n_3 = 2n\) we can have \(1 + 3 + \cdots + 2n-3 = (n-1)^2\)
  2. For the 3 rods to form a triangle, it is necessary and sufficient that the sum of the lengths of the two shorter rods is larger than \(N\). When \(N = 2n+1\) there are \(n(n-1)\) ways this can happen, out of \(\binom{2n}{2}\) ways to choose the numbers, ie \begin{align*} && P &= \frac{n(n-1)}{\frac{2n(2n-1)}{2}} \\ &&&= \frac{n-1}{2n-1} \end{align*} When \(N = 2n\) there are \((n-1)^2\) ways this can happen, out of \(\binom{2n-1}{2}\) ways, ie \begin{align*} && P &= \frac{(n-1)^2}{\frac{(2n-1)(2n-2)}{2}} \\ &&&= \frac{n-1}{2n-1} \end{align*}
  3. The number of ways this can happen is: \begin{align*} C &= \sum_{k=3}^{2M+1} \# \{ \text{triangles where }k\text{ is largest} \} \\ &= \sum_{k=1}^{M} \# \{ \text{triangles where }2k+1\text{ is largest} \} +\sum_{k=1}^{M} \# \{ \text{triangles where }2k\text{ is largest} \}\\ &= \sum_{k=1}^{M} k(k-1)+\sum_{k=1}^{M} (k-1)^2\\ &= \sum_{k=1}^{M} (2k^2-3k+1)\\ &= \frac26M(M+1)(2M+1) - \frac32M(M+1) + M \\ &= \frac16 M(4M+1)(M-1) \end{align*} Therefore the probability is \begin{align*} && P &= \frac{M(4M+1)(M-1)}{6 \binom{2M+1}{3}} \\ &&&= \frac{M(4M+1)(M-1)}{(2M+1)2M(2M-1)} \\ &&&= \frac{(4M+1)(M-1)}{2(2M+1)(2M-1)} \end{align*}
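A brute-force count (not in the original) agrees with the closed form from part 3 for small \(M\):

```python
from itertools import combinations

def triangle_count(M):
    # count triples of distinct rods from 1..2M+1 forming a non-degenerate
    # triangle; combinations yields sorted triples, so c is always the largest
    rods = range(1, 2 * M + 2)
    return sum(a + b > c for a, b, c in combinations(rods, 3))

def closed_form(M):
    # the count C = M(4M+1)(M-1)/6 derived in part 3
    return M * (4 * M + 1) * (M - 1) // 6
```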

2019 Paper 2 Q12

The random variable \(X\) has the probability density function on the interval \([0, 1]\): $$f(x) = \begin{cases} nx^{n-1} & 0 \leq x \leq 1, \\ 0 & \text{elsewhere}, \end{cases}$$ where \(n\) is an integer greater than 1.

  1. Let \(\mu = E(X)\). Find an expression for \(\mu\) in terms of \(n\), and show that the variance, \(\sigma^2\), of \(X\) is given by $$\sigma^2 = \frac{n}{(n + 1)^2(n + 2)}.$$
  2. In the case \(n = 2\), show without using decimal approximations that the interquartile range is less than \(2\sigma\).
  3. Write down the first three terms and the \((k + 1)\)th term (where \(0 \leq k \leq n\)) of the binomial expansion of \((1 + x)^n\) in ascending powers of \(x\). By setting \(x = \frac{1}{n}\), show that \(\mu\) is less than the median and greater than the lower quartile. Note: You may assume that $$1 + \frac{1}{1!} + \frac{1}{2!} + \frac{1}{3!} + \cdots < 4.$$


Solution:

  1. \(\,\) \begin{align*} && \mu &= \E[X] \\ &&&= \int_0^1 x f(x) \d x \\ &&&= \int_0^1 nx^n \d x \\ &&&= \frac{n}{n+1} \\ \\ && \var[X] &= \sigma^2 \\ &&&= \E[X^2] - \mu^2 \\ &&&= \int_0^1 x^2 f(x) \d x - \mu^2 \\ &&&= \int_0^1 nx^{n+1} \d x - \mu^2 \\ &&&= \frac{n}{n+2} - \frac{n^2}{(n+1)^2} \\ &&&= \frac{n(n+1)^2 - n^2(n+2)}{(n+1)^2(n+2)} \\ &&&= \frac{n}{(n+1)^2(n+2)} \end{align*}
  2. \(\,\) \begin{align*} && \frac14 &= \int_0^{Q_1} 2x \d x \\ &&&= Q_1^2 \\ \Rightarrow && Q_1 &= \frac12 \\ && \frac34 &= \int_0^{Q_3} 2x \d x \\ &&&= Q_3^2 \\ \Rightarrow && Q_3 &= \frac{\sqrt{3}}2 \\ \\ \Rightarrow && IQR &= Q_3 - Q_1 = \frac{\sqrt{3}-1}{2} \\ && 2 \sigma &= 2\sqrt{\frac{2}{3^2 \cdot 4}} \\ &&&= \frac{\sqrt{2}}{3} \\ \\ && 2\sigma - IQR &= \frac{\sqrt{2}}{3} - \frac{\sqrt{3}-1}{2} \\ &&&= \frac{2\sqrt{2}-3\sqrt{3}+3}{6} \\ && (3+2\sqrt{2})^2 &= 17+12\sqrt{2} > 29 > 27 = (3\sqrt{3})^2 \end{align*} Therefore \(3 + 2\sqrt{2} > 3\sqrt{3}\), so \(2\sigma > IQR\)
  3. \[ (1+x)^n = 1 + nx + \frac{n(n-1)}2 x^2 + \cdots + \binom{n}{k} x^k+ \cdots \] \begin{align*} && Q_1^{-n} &= 4 \\ && Q_2^{-n} &= 2\\ && \mu &=\frac{n}{n+1} \\ \Rightarrow && \mu^{-n} &= \left (1 + \frac1n \right)^n\\ &&&\geq 1 + n \frac1n + \cdots > 2 \\ \Rightarrow && \mu &< Q_2 \\ \\ && \mu^{-n} &= \left (1 + \frac1n \right)^n\\ &&&= 1 + n \frac1n + \frac{n(n-1)}{2!} \frac{1}{n^2} + \cdots + \frac{n(n-1) \cdots (n-k+1)}{k!} \frac{1}{n^k} + \cdots \\ &&&= 1 + 1 + \left (1 - \frac1n \right ) \frac1{2!} + \cdots + \left (1 - \frac1n \right)\cdot\left (1 - \frac2n \right) \cdots \left (1 - \frac{k-1}n \right) \frac{1}{k!} + \cdots \\ &&&< 1 + 1 + \frac1{2!} + \cdots + \frac1{k!} \\ &&&< 4 \\ \Rightarrow && \mu &> Q_1 \end{align*}
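As a numeric companion (not part of the solution): since the CDF is \(x^n\), the lower quartile is \(Q_1 = 4^{-1/n}\) and the median is \(Q_2 = 2^{-1/n}\), so the inequalities of parts 2 and 3 can be spot-checked directly.

```python
def mu(n):
    return n / (n + 1)          # mean of the distribution

def q1(n):
    return (1 / 4) ** (1 / n)   # lower quartile: Q1^n = 1/4

def median(n):
    return (1 / 2) ** (1 / n)   # median: Q2^n = 1/2

# part 2 (n = 2): interquartile range versus 2*sigma
iqr = (3**0.5 - 1) / 2
two_sigma = 2 * (2 / (3**2 * 4)) ** 0.5
```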

2019 Paper 3 Q11

The number of customers arriving at a builders' merchants each day follows a Poisson distribution with mean \(\lambda\). Each customer is offered some free sand. The probability of any given customer taking the free sand is \(p\).

  1. Show that the number of customers each day who take sand follows a Poisson distribution with mean \(p\lambda\).
  2. The merchant has a mass \(S\) of sand at the beginning of the day. Each customer who takes the free sand gets a proportion \(k\) of the remaining sand, where \(0 \leq k < 1\). Show that by the end of the day the expected mass of sand taken is $$\left(1 - e^{-kp\lambda}\right)S.$$
  3. At the beginning of the day, the merchant's bag of sand contains a large number of grains, exactly one of which is made from solid gold. At the end of the day, the merchant's assistant takes a proportion \(k\) of the remaining sand. Find the probability that the assistant takes the golden grain. Comment on the case \(k = 0\) and on the limit \(k \to 1\). In the case \(p\lambda > 1\) find the value of \(k\) which maximises the probability that the assistant takes the golden grain.


Solution:

  1. Let \(X\) be the number of people arriving on a given day, and \(Y\) be the number taking sand, then \begin{align*} && \mathbb{P}(Y = k) &= \sum_{x=k}^{\infty} \mathbb{P}(x \text{ arrive and }k\text{ of them take sand}) \\ &&&= \sum_{x=k}^{\infty} \mathbb{P}(X=x)\mathbb{P}(k \text{ out of }x\text{ of them take sand})\\ &&&= \sum_{x=k}^{\infty} e^{-\lambda} \frac{\lambda^x}{x!}\binom{x}{k}p^k(1-p)^{x-k}\\ &&&= e^{-\lambda} \left ( \frac{p}{1-p} \right)^k \sum_{x=k}^{\infty} \frac{((1-p)\lambda)^x}{k!(x-k)!} \\ &&&= e^{-\lambda} \left ( \frac{p}{1-p} \right)^k \frac{((1-p)\lambda)^k}{k!} \sum_{x=0}^{\infty} \frac{((1-p)\lambda)^x}{x!} \\ &&&= e^{-\lambda} \left ( \frac{p}{1-p} \right)^k \frac{((1-p)\lambda)^k}{k!}e^{(1-p)\lambda} \\ &&&= e^{-p\lambda} \frac{(p\lambda)^k}{k!} \end{align*} which is precisely a Poisson with parameter \(p\lambda\). Alternatively, \(Y = B_1 + B_2 + \cdots + B_X\) where \(B_i \sim Bernoulli(p)\) so \(G_Y(t) = G_X(G_B(t)) = G_X(1-p+pt) = e^{-\lambda(1-(1-p+pt))} = e^{-p\lambda(1-t)}\) so \(Y \sim Po(p\lambda)\). As a third alternative, let \(Z\) be the number of people not taking sand, so \begin{align*} && \mathbb{P}(Y = y, Z= z) &= \mathbb{P}(X=y+z) \cdot \binom{y+z}{y} p^y(1-p)^z \\ &&&= e^{-\lambda} \frac{\lambda^{y+z}}{(y+z)!} \frac{(y+z)!}{y!z!} p^y(1-p)^z \\ &&&=\left ( e^{-p\lambda} \frac{(p\lambda)^y}{y!} \right) \cdot \left ( e^{-(1-p)\lambda} \frac{((1-p)\lambda)^z}{z!}\right) \end{align*} So clearly \(Y\) and \(Z\) are both (independent!) Poisson with parameters \(p\lambda \) and \((1-p)\lambda\).
  2. The amount taken is \(Sk + S(1-k)k + \cdots +Sk(1-k)^{Y-1} = Sk\cdot \frac{1-(1-k)^Y}{k} = S(1-(1-k)^Y)\) so \begin{align*} \E[\text{taken sand}] &= \E \left [ S(1-(1-k)^Y)\right] \\ &= S-S\E\left [(1-k)^Y \right] \\ &= S - SG_Y(1-k)\\ &=S - Se^{-p\lambda(1-(1-k))} \tag{pgf for Poisson} \\ &= S\left (1-e^{-kp\lambda} \right) \end{align*}
  3. The fraction of grains the assistant takes home is: \((1-k)^Yk\), which has expected value \(ke^{-kp\lambda}\). This is the probability he takes home the golden grain. When \(k = 0\) the probability is \(0\) which makes sense (no-one takes home any sand, including the merchant's assistant). As \(k \to 1\) we get \(e^{-p\lambda}\) which is the probability that no-one gets any sand other than him. \begin{align*} && \frac{\d }{\d k} \left ( ke^{-kp\lambda} \right) &= e^{-kp\lambda} - (p\lambda)ke^{-kp\lambda} \\ &&&= e^{-kp\lambda}(1 - (p\lambda)k) \end{align*} Therefore maximised at \(k = \frac{1}{p\lambda}\). (Clearly this is a maximum just by sketching the function)
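A simulation (not in the original; Poisson sampling done via Knuth's product-of-uniforms method) confirms both the thinned mean \(p\lambda\) from part 1 and the expected mass \(S(1-e^{-kp\lambda})\) from part 2:

```python
import math
import random

def poisson(rng, lam):
    # Knuth's method: count uniforms until their running product drops below e^-lam
    limit, count, prod = math.exp(-lam), 0, rng.random()
    while prod >= limit:
        count += 1
        prod *= rng.random()
    return count

def day(rng, lam, p, k, S=1.0):
    # one day: Poisson(lam) customers, each takes sand with probability p;
    # each taker removes a fraction k of the sand remaining at that moment,
    # leaving S*(1-k)^takers at the end of the day
    takers = sum(rng.random() < p for _ in range(poisson(rng, lam)))
    return takers, S * (1 - (1 - k) ** takers)

lam, p, k, trials = 4.0, 0.5, 0.3, 100_000
rng = random.Random(3)
results = [day(rng, lam, p, k) for _ in range(trials)]
mean_takers = sum(t for t, _ in results) / trials
mean_sand = sum(s for _, s in results) / trials
```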

2019 Paper 3 Q12

The set \(S\) is the set of all integers from 1 to \(n\). The set \(T\) is the set of all distinct subsets of \(S\), including the empty set \(\emptyset\) and \(S\) itself. Show that \(T\) contains exactly \(2^n\) sets. The sets \(A_1, A_2, \ldots, A_m\), which are not necessarily distinct, are chosen randomly and independently from \(T\), and for each \(k\) \((1 \leq k \leq m)\), the set \(A_k\) is equally likely to be any of the sets in \(T\).

  1. Write down the value of \(P(1 \in A_1)\).
  2. By considering each integer separately, show that \(P(A_1 \cap A_2 = \emptyset) = \left(\frac{3}{4}\right)^n\). Find \(P(A_1 \cap A_2 \cap A_3 = \emptyset)\) and \(P(A_1 \cap A_2 \cap \cdots \cap A_m = \emptyset)\).
  3. Find \(P(A_1 \subseteq A_2)\), \(P(A_1 \subseteq A_2 \subseteq A_3)\) and \(P(A_1 \subseteq A_2 \subseteq \cdots \subseteq A_m)\).


Solution: For every element in \(S\) we can choose whether or not it appears in a subset of \(S\), therefore there are \(2^n\) choices so \(2^n\) distinct subsets.

  1. \(\mathbb{P}(1 \in A_1) = \frac12\) (since \(1\) is in exactly half the subsets)
  2. \(\,\) \begin{align*} && \mathbb{P}(A_1 \cap A_2 = \emptyset) &= \mathbb{P}(i \not \in (A_1 \cap A_2) \forall i) \\ &&&= \prod_{i=1}^n \left ( 1-\mathbb{P}(i \in A_1 \cap A_2) \right) \\ &&&= \prod_{i=1}^n \left ( 1-\mathbb{P}(i \in A_1)\mathbb{P}(i \in A_2) \right) \\ &&&= \prod_{i=1}^n \left ( 1-\frac12 \cdot \frac12\right) \\ &&&= \left (\frac34 \right)^n \end{align*}
  3. \(\,\) \begin{align*} && \mathbb{P}(A_1 \cap A_2 \cap A_3 = \emptyset) &= \mathbb{P}(i \not \in (A_1 \cap A_2 \cap A_3) \forall i) \\ &&&= \prod_{i=1}^n \left ( 1-\mathbb{P}(i \in A_1 \cap A_2 \cap A_3) \right) \\ &&&= \prod_{i=1}^n \left ( 1-\mathbb{P}(i \in A_1)\mathbb{P}(i \in A_2)\mathbb{P}(i \in A_3) \right) \\ &&&= \prod_{i=1}^n \left ( 1-\frac12 \cdot \frac12 \cdot \frac12\right) \\ &&&= \left (\frac78 \right)^n \end{align*} Similarly, \(\displaystyle \mathbb{P}(A_1 \cap A_2 \cap \cdots \cap A_m = \emptyset) = \left ( \frac{2^m-1}{2^m} \right)^n\)
  4. \(\,\) \begin{align*} && \mathbb{P}(A_1 \subseteq A_2) &= \mathbb{P}(A_1 \cap A_2^c = \emptyset) \\ &&&= \left (\frac34 \right)^n \\ \\ && \mathbb{P}(A_1 \subseteq A_2 \subseteq A_3) &= \prod_{i=1}^n \mathbb{P}(\text{once }i\text{ appears it keeps appearing}) \\ &&&= \prod_{i=1}^n \frac{\#\{(0,0,0), (0,0,1), (0,1,1), (1,1,1) \}}{2^3} \\ &&&= \prod_{i=1}^n \frac{4}{8} \\ &&&= \frac{1}{2^n} \\ \\ && \mathbb{P}(A_1 \subseteq A_2 \subseteq \cdots \subseteq A_m) &= \prod_{i=1}^n \frac{m+1}{2^m} \\ &&&= \left ( \frac{m+1}{2^m} \right)^n \end{align*}
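Exhaustive enumeration over bitmask-encoded subsets (not part of the solution) confirms the counts behind all three answers for small \(n\) and \(m\):

```python
from itertools import product

def empty_and_chain_counts(n, m):
    # enumerate all m-tuples of subsets of {1,...,n}, encoded as n-bit masks
    full = (1 << n) - 1
    empty = chain = 0
    for As in product(range(1 << n), repeat=m):
        inter = full
        for A in As:
            inter &= A
        empty += (inter == 0)
        # A ⊆ B for bitmasks: A & ~B == 0
        chain += all(As[i] & ~As[i + 1] == 0 for i in range(m - 1))
    return empty, chain
```

The per-element products in the solution predict \((2^m-1)^n\) tuples with empty intersection and \((m+1)^n\) increasing chains, out of \(2^{mn}\) tuples in total.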

2018 Paper 1 Q11

A bag contains three coins. The probabilities of their showing heads when tossed are \(p_1\), \(p_2\) and \(p_3\).

  1. A coin is taken at random from the bag and tossed. What is the probability that it shows a head?
  2. A coin is taken at random from the bag (containing three coins) and tossed; the coin is returned to the bag and again a coin is taken at random from the bag and tossed. Let \(N_1\) be the random variable whose value is the number of heads shown on the two tosses. Find the expectation of \(N_1\) in terms of \(p\), where \(p = \frac{1}{3}(p_1+p_2+p_3)\,\), and show that \(\var(N_1) =2p(1-p)\,\).
  3. Two of the coins are taken at random from the bag (containing three coins) and tossed. Let \(N_2\) be the random variable whose value is the number of heads showing on the two coins. Find \(\E(N_2)\) and \(\var(N_2)\).
  4. Show that \(\var(N_2)\le \var(N_1)\), with equality if and only if \(p_1=p_2=p_3\,\).


Solution:

  1. \(\mathbb{P}(\text{head}) = \mathbb{P}(\text{head}|1)\mathbb{P}(\text{coin 1}) + \mathbb{P}(\text{head}|2)\mathbb{P}(\text{coin 2})+\mathbb{P}(\text{head}|3)\mathbb{P}(\text{coin 3}) = \frac13(p_1+p_2+p_3)\)
  2. \(N_1 = X_1 + X_2\) where \(X_i \sim Bernoulli(p)\), therefore \(\mathbb{E}(N_1) = 2p\) and \(\textrm{Var}(N_1) = \textrm{Var}(X_1)+ \textrm{Var}(X_2) = p(1-p)+p(1-p) = 2p(1-p)\)
  3. Let \(Y_i\) be the indicator for the \(i\)th coin is heads. Then \(\mathbb{E}(Y_i) = p\) and so \(\mathbb{E}(N_2) = 2p\). \begin{align*} && \textrm{Var}(N_2) &= \mathbb{E}(N_2^2) - [\mathbb{E}(N_2)]^2\\ &&&= 2^2 \cdot \left (\frac13 \left (p_1p_2+p_2p_3+p_3p_1 \right) \right) + 1 \cdot \left (\frac13 \left (p_1 (1-p_2) + (1-p_1)p_2 + p_2(1-p_3) +(1-p_2)p_3 + p_3(1-p_1) + (1-p_3)p_1 \right) \right) - [\mathbb{E}(N_2)]^2 \\ &&&= \frac43\left (p_1p_2+p_2p_3+p_3p_1 \right) + \frac13 \left ( 2(p_1+p_2+p_3) - 2(p_1p_2+p_2p_3+p_3p_1)\right)-[\mathbb{E}(N_2)]^2 \\ &&&= \frac23\left (p_1p_2+p_2p_3+p_3p_1 \right) + \frac23 \left ( p_1+p_2+p_3 \right)-[\mathbb{E}(N_2)]^2\\ &&&= \frac23\left (p_1p_2+p_2p_3+p_3p_1 \right) + \frac23 \left ( p_1+p_2+p_3 \right)-\left[\frac23(p_1+p_2+p_3)\right]^2\\ &&&= \frac23\left (p_1p_2+p_2p_3+p_3p_1 \right) +2p(1-2p)\\ \end{align*}
  4. \(\,\) \begin{align*} && \textrm{Var}(N_1) - \textrm{Var}(N_2) &= 2p(1-p) - \left (\frac23\left (p_1p_2+p_2p_3+p_3p_1 \right) +2p(1-2p) \right) \\ &&&= 2p^2-\frac23\left (p_1p_2+p_2p_3+p_3p_1 \right) \\ &&&= \frac23 \left ( \frac13(p_1+p_2+p_3)^2 -\left (p_1p_2+p_2p_3+p_3p_1 \right)\right)\\ &&&= \frac29 \left (p_1^2+p_2^2+p_3^2 -(p_1p_2+p_2p_3+p_3p_1) \right)\\ &&&= \frac19 \left ((p_1-p_2)^2+(p_2-p_3)^2+(p_3-p_1)^2 \right) &\geq 0 \end{align*} with equality iff \(p_1 = p_2 = p_3\)
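A direct computation (not in the original) of both variances confirms part 4's identity \(\var(N_1)-\var(N_2) = \frac19\sum (p_i-p_j)^2\):

```python
def var_n1(p1, p2, p3):
    # part 2: N1 counts heads on two independent tosses of uniformly chosen coins
    p = (p1 + p2 + p3) / 3
    return 2 * p * (1 - p)

def var_n2(p1, p2, p3):
    # part 3: N2 comes from one of the three equally likely unordered pairs
    pairs = [(p1, p2), (p2, p3), (p3, p1)]
    e1 = sum(a + b for a, b in pairs) / 3                           # E(N2)
    e2 = sum(4*a*b + a*(1 - b) + (1 - a)*b for a, b in pairs) / 3   # E(N2^2)
    return e2 - e1 * e1
```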

2018 Paper 1 Q12

A multiple-choice test consists of five questions. For each question, \(n\) answers are given (\(n\ge2\)) only one of which is correct and candidates either attempt the question by choosing one of the \(n\) given answers or do not attempt it. For each question attempted, candidates receive two marks for the correct answer and lose one mark for an incorrect answer. No marks are gained or lost for questions that are not attempted. The pass mark is five. Candidates A, B and C don't understand any of the questions so, for any question which they attempt, they each choose one of the \(n\) given answers at random, independently of their choices for any other question.

  1. Candidate A chooses in advance to attempt exactly \(k\) of the five questions, where \(k=0, 1, 2, 3, 4\) or \(5\). Show that, in order to have the greatest probability of passing the test, she should choose \(k=4\,\).
  2. Candidate B chooses at random the number of questions he will attempt, the six possibilities being equally likely. Given that Candidate B passed the test find, in terms of \(n\), the probability that he attempted exactly four questions. [Not on original test: Show that this probability is an increasing function of \(n\).]
  3. For each of the five questions Candidate C decides whether to attempt the question by tossing a biased coin. The coin has a probability of \(\frac n{n+1}\) of showing a head, and she attempts the question if it shows a head. Find the probability, in terms of \(n\), that Candidate C passes the test.


Solution:

  1. Her probability of passing if she attempts \(k \leq 2\) questions is \(0\), since she can attain at most \(4\) marks. If she attempts \(3\) questions, she needs to get all of them right, hence \(\mathbb{P}(\text{gets all }3\text{ correct}) = \frac{1}{n^3}\). If she attempts \(4\) questions, she can afford to get one wrong: \begin{align*} && \mathbb{P}(\text{passes}|\text{attempts }4) &=\mathbb{P}(4/4) +\mathbb{P}(3/4) \\ &&&= \frac{1}{n^4} + 4\cdot\frac{1}{n^3} \cdot \frac{n-1}{n} \\ &&&= \frac{4n-3}{n^4} \end{align*} If she attempts \(5\) questions she can pass with \(5\) right (10 marks) or \(4\) right and \(1\) wrong (7 marks), but \(3\) right will not do (\(6 - 2 = 4 < 5\)), hence: \begin{align*} && \mathbb{P}(\text{passes}|\text{attempts }5) &=\mathbb{P}(5/5) +\mathbb{P}(4/5) \\ &&&= \frac{1}{n^5} + 5\cdot\frac{1}{n^4} \cdot \frac{n-1}{n} \\ &&&= \frac{5n-4}{n^5} \end{align*} Comparing \(4\) attempts with \(3\): \(\frac{4n-3}{n^4} > \frac{1}{n^3} \Leftrightarrow 4n-3 > n \Leftrightarrow n > 1\), which holds since \(n \geq 2\). Comparing \(4\) with \(5\): \(\frac{4n-3}{n^4} > \frac{5n-4}{n^5} \Leftrightarrow 4n^2-3n > 5n-4 \Leftrightarrow 4n^2-8n+4 = 4(n-1)^2 > 0 \Leftrightarrow n \neq 1\), which again holds since \(n \geq 2\). So \(k = 4\) gives the greatest probability of passing.
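The closed forms above can be checked by computing \(\mathbb{P}(\text{passes}\mid k \text{ attempts})\) directly from the binomial distribution (a sketch, assuming the scoring rule as stated: \(+2\) per correct answer, \(-1\) per wrong answer, pass mark \(5\); `p_pass` is a name chosen here):

```python
from math import comb

def p_pass(k, n):
    """P(pass | exactly k questions attempted), each answer guessed uniformly."""
    # c correct out of k attempted scores 2c - (k - c); passing needs >= 5
    return sum(comb(k, c) * (1 / n) ** c * ((n - 1) / n) ** (k - c)
               for c in range(k + 1)
               if 2 * c - (k - c) >= 5)

for n in range(2, 10):
    assert p_pass(2, n) == 0
    assert abs(p_pass(3, n) - 1 / n**3) < 1e-12
    assert abs(p_pass(4, n) - (4 * n - 3) / n**4) < 1e-12
    assert abs(p_pass(5, n) - (5 * n - 4) / n**5) < 1e-12
    # k = 4 is the unique maximiser for every n >= 2
    assert max(range(6), key=lambda k: p_pass(k, n)) == 4
```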
  2. \(\,\) \begin{align*} && \mathbb{P}(\text{passes}) &= \frac16 \cdot 0 + \frac16 \cdot 0 + \frac16 \cdot 0 + \frac16 \cdot \frac1{n^3} + \frac16 \cdot \frac{4n-3}{n^4} + \frac16 \cdot \frac{5n-4}{n^5} \\ &&&= \frac{n^2+4n^2-3n+5n-4}{6n^5} \\ &&&= \frac{5n^2+2n-4}{6n^5} \\ && \mathbb{P}(\text{attempted }4|\text{passes}) &= \frac{\mathbb{P}(\text{attempted }4\text{ and passes})}{ \mathbb{P}(\text{passes})} \\ &&&= \frac{\frac16 \cdot \frac{4n-3}{n^4}}{\frac{5n^2+2n-4}{6n^5} } \\ &&&= \frac{4n^2-3n}{5n^2+2n-4} \end{align*} To see that this is increasing in \(n\), note that the numerator of the derivative of \(\frac{4n^2-3n}{5n^2+2n-4}\) is \((8n-3)(5n^2+2n-4)-(4n^2-3n)(10n+2) = 23n^2-32n+12\), whose discriminant is \(32^2-4\cdot23\cdot12 = -80 < 0\), so it is positive for all \(n\). Since the denominator \(5n^2+2n-4\) is also positive for \(n \geq 1\), the function is increasing for \(n \geq 2\).
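A numeric cross-check of the Bayes computation and the monotonicity claim (this uses the part-1 closed forms directly, so it only checks internal consistency; `posterior`/`closed_form` are names chosen here):

```python
def posterior(n):
    """P(attempted 4 | passed), via Bayes with the part-1 pass probabilities."""
    p3, p4, p5 = 1 / n**3, (4 * n - 3) / n**4, (5 * n - 4) / n**5
    return (p4 / 6) / ((p3 + p4 + p5) / 6)

def closed_form(n):
    return (4 * n**2 - 3 * n) / (5 * n**2 + 2 * n - 4)

for n in range(2, 100):
    assert abs(posterior(n) - closed_form(n)) < 1e-12
    assert closed_form(n + 1) > closed_form(n)  # increasing in n
```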
  3. \(\,\) \begin{align*} &&\mathbb{P}(C \text{ passes}) &= \binom{5}{3} \left ( \frac{n}{n+1} \right)^3 \left ( \frac{1}{n+1}\right)^2 \frac{1}{n^3} + \binom{5}{4} \left ( \frac{n}{n+1} \right)^4 \left ( \frac{1}{n+1}\right) \frac{4n-3}{n^4} +\\ &&&\quad \quad + \binom{5}{5} \left ( \frac{n}{n+1} \right)^5 \frac{5n-4}{n^5} \\ &&&= \frac{10}{(n+1)^5} + \frac{5(4n-3)}{(n+1)^5} + \frac{(5n-4)}{(n+1)^5} \\ &&&= \frac{10+20n-15+5n-4}{(n+1)^5}\\ &&&= \frac{25n-9}{(n+1)^5}\\ \end{align*}
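Combining the coin-toss distribution for the number of attempts with the part-1 pass probabilities gives a direct check of the final formula (a sketch; `p_C_passes` is a name chosen here):

```python
from math import comb

def p_C_passes(n):
    """P(C passes): Binomial(5, n/(n+1)) attempts, then part-1 pass probabilities."""
    pass_given_k = {3: 1 / n**3, 4: (4 * n - 3) / n**4, 5: (5 * n - 4) / n**5}
    head = n / (n + 1)
    return sum(comb(5, k) * head**k * (1 - head) ** (5 - k) * pass_given_k.get(k, 0)
               for k in range(6))

for n in range(2, 20):
    assert abs(p_C_passes(n) - (25 * n - 9) / (n + 1) ** 5) < 1e-12
```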

2018 Paper 2 Q12
D: 1600.0 B: 1500.0

In a game, I toss a coin repeatedly. The probability, \(p\), that the coin shows Heads on any given toss is given by \[ p= \frac N{N+1} \,, \] where \(N\) is a positive integer. The outcomes of any two tosses are independent. The game has two versions. In each version, I can choose to stop playing after any number of tosses, in which case I win £\(H\), where \(H\) is the number of Heads I have tossed. However, the game may end before that, in which case I win nothing.

  1. In version 1, the game ends when the coin first shows Tails (if I haven't stopped playing before that). I decide from the start to toss the coin until a total of \(h\) Heads have been shown, unless the game ends before then. Find, in terms of \(h\) and \(p\), an expression for my expected winnings and show that I can maximise my expected winnings by choosing \(h=N\).
  2. In version 2, the game ends when the coin shows Tails on two consecutive tosses (if I haven't stopped playing before that). I decide from the start to toss the coin until a total of \(h\) Heads have been shown, unless the game ends before then. Show that my expected winnings are \[ \frac{ hN^h (N+2)^h}{(N+1)^{2h}} \,.\] In the case \(N=2\,\), use the approximation \(\log_3 2 \approx 0.63\) to show that the maximum value of my expected winnings is approximately £3.


Solution:

  1. Since the game ends at the first Tail, we win £\(h\) precisely when the first \(h\) tosses are all Heads, and nothing otherwise, therefore: \begin{align*} && \mathbb{E}(\text{winnings}) &= E_h = hp^h \\ &&&= h \cdot \left ( \frac{N}{N+1} \right)^h \\ \Rightarrow && \frac{E_{h+1}}{E_h} &= \frac{h+1}{h }\left ( \frac{N}{N+1} \right) \end{align*} This ratio is at least \(1\) exactly when \((h+1)N \geq h(N+1)\), i.e. \(h \leq N\), so we can maximise our expected winnings by taking \(h = N\). (In fact the ratio equals \(1\) at \(h = N\), so \(h = N+1\) does equally well, but arguably \(h = N\) is better as we have the same expected value but lower variance.)
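Since \(E_h = h\,p^h\) with \(p = N/(N+1)\), the maximisation is easy to confirm numerically, including the exact tie between \(h = N\) and \(h = N+1\) (an illustrative sketch; `expected_winnings` is a name chosen here):

```python
def expected_winnings(h, N):
    # we win h pounds iff the first h tosses are all Heads
    return h * (N / (N + 1)) ** h

for N in range(1, 15):
    best = expected_winnings(N, N)
    # h = N is optimal over a generous range of h ...
    assert all(best >= expected_winnings(h, N) - 1e-9
               for h in range(1, 10 * N + 10))
    # ... and h = N + 1 ties with it
    assert abs(best - expected_winnings(N + 1, N)) < 1e-9
```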
  2. We stop as soon as the \(h\)th Head appears, so in a winning sequence each Tail occupies one of the \(h\) slots immediately before a Head, with at most one Tail per slot (two consecutive Tails would end the game): \(\underbrace{\_H\_H\_H\_\cdots\_H}_{h\text{ slots and }h\, H}\). So we have \begin{align*} && \mathbb{P}(\text{wins}) &= \sum_{t = 0}^h \mathbb{P}(\text{wins and } t\text{ tails}) \\ &&&= \sum_{t = 0}^h\binom{h}{t} \left ( \frac{N}{N+1} \right)^h\left ( \frac{1}{N+1} \right)^t \\ &&&= \left ( \frac{N}{N+1} \right)^h \sum_{t = 0}^h\binom{h}{t}\left ( \frac{1}{N+1} \right)^t \cdot 1^{h-t} \\ &&&= \left ( \frac{N}{N+1} \right)^h \left ( 1 + \left ( \frac{1}{N+1} \right) \right)^h \\ &&&= \left ( \frac{N}{N+1} \right)^h \left ( \frac{N+2}{N+1}\right)^h \\ &&&= \frac{N^h(N+2)^h}{(N+1)^{2h}} \\ \Rightarrow && \E(\text{winnings}) &= h \cdot \frac{N^h(N+2)^h}{(N+1)^{2h}} \end{align*} If \(N = 2\), we have \begin{align*} && \E(\text{winnings}) &= E_h \\ &&&= h \cdot \frac{2^h\cdot2^{2h}}{3^{2h}}\\ &&&= h \cdot \frac{2^{3h}}{3^{2h}} \\ \Rightarrow && \frac{E_{h+1}}{E_h} &= \frac{h+1}{h} \cdot \frac{8}{9} \\ \end{align*} The ratio is at least \(1\) exactly when \(8(h+1) \geq 9h\), i.e. \(h \leq 8\), so to maximise the winnings we should take \(h = 8\) (in fact \(E_9 = E_8\)), and the expected winnings will be: \begin{align*} && E_8 &= 8 \cdot \frac{2^{24}}{3^{16}} \\ \Rightarrow && \log_3 E_8 &= 27 \log_3 2 - 16 \\ &&&\approx 27 \cdot 0.63 - 16 \\ &&&\approx 17.01 - 16 \\ &&&\approx 1 \\ \Rightarrow && E_8 &\approx 3 \end{align*}
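The win probability can be verified independently of the slot-counting argument by a direct recursion on the game state (Heads so far, whether the last toss was a Tail); exact rational arithmetic avoids rounding issues (`p_win` is a name chosen here):

```python
from fractions import Fraction
from functools import lru_cache

def p_win(h, N):
    """P(reach h Heads before two consecutive Tails), by state recursion."""
    p = Fraction(N, N + 1)

    @lru_cache(maxsize=None)
    def f(heads, last_tail):
        if heads == h:
            return Fraction(1)
        if last_tail:
            # another Tail now would end the game with nothing
            return p * f(heads + 1, False)
        return p * f(heads + 1, False) + (1 - p) * f(heads, True)

    return f(0, False)

for N in range(1, 6):
    for h in range(1, 8):
        assert p_win(h, N) == Fraction(N**h * (N + 2) ** h, (N + 1) ** (2 * h))

# N = 2: maximum expected winnings, E_8 = 8 * 2^24 / 3^16 ≈ 3.12
assert abs(float(8 * p_win(8, 2)) - 3.118) < 0.001
```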

2018 Paper 2 Q13
D: 1600.0 B: 1502.8

Four children, \(A\), \(B\), \(C\) and \(D\), are playing a version of the game `pass the parcel'. They stand in a circle, so that \(ABCDA\) is the clockwise order. Each time a whistle is blown, the child holding the parcel is supposed to pass the parcel immediately exactly one place clockwise. In fact each child, independently of any other past event, passes the parcel clockwise with probability \(\frac{1}{4}\), passes it anticlockwise with probability \(\frac{1}{4}\) and fails to pass it at all with probability \(\frac{1}{2}\). At the start of the game, child \(A\) is holding the parcel. The probability that child \(A\) is holding the parcel just after the whistle has been blown for the \(n\)th time is \(A_n\), and \(B_n\), \(C_n\) and \(D_n\) are defined similarly.

  1. Find \(A_1\), \(B_1\), \(C_1\) and \(D_1\). Find also \(A_2\), \(B_2\), \(C_2\) and \(D_2\).
  2. By first considering \(B_{n+1}+D_{n+1}\), or otherwise, find \(B_n\) and \(D_n\). Find also expressions for \(A_n\) and \(C_n\) in terms of \(n\).


Solution:

  1. \(\,\) \begin{align*} && A_1 &= \frac12 \\ && B_1 &= \frac14 \\ && C_1 &= 0 \\ && D_1 &= \frac14 \end{align*} \begin{align*} && A_2 &= \frac12 \cdot \frac12 + 2 \cdot \frac14 \cdot \frac14 = \frac38 \\ && B_2 &= \frac14 \cdot \frac12 + \frac12 \cdot \frac14 = \frac14 \\ && C_2 &=2 \cdot \frac14 \cdot \frac14 =\frac18 \\ && D_2 &= B_2 = \frac14 \end{align*}
  2. \begin{align*} && A_{n+1} &= \frac12 A_n+ \frac14(B_n + D_n) \\ && B_{n+1} &= \frac12 B_n+ \frac14(A_n + C_n) \\ && C_{n+1} &= \frac12 C_n+ \frac14(D_n +B_n) \\ && D_{n+1} &= \frac12 D_n+ \frac14(C_n +A_n) \\ \\ \Rightarrow && B_{n+1}+D_{n+1} &= \frac12 (B_n+D_n) + \frac12(A_n+C_n) \\ &&&= \frac12 \end{align*} since the four probabilities sum to \(1\). Moreover \(B_{n+1}\) and \(D_{n+1}\) satisfy the same recurrence and \(B_1 = D_1\), so \(B_n = D_n\) for all \(n\), giving \(B_n = D_n = \frac14\). Then \begin{align*} && C_{n+1} &= \frac12C_n + \frac14 \cdot \frac12 \\ &&&= \frac12 C_n + \frac18\\ &&&= \frac14 C_{n-1} + \frac1{8} + \frac1{16} \\ &&&\;\;\vdots \\ &&&= \frac1{8} + \frac{1}{16} + \cdots + \frac{1}{8 \cdot 2^{n-1}} && (\text{since } C_1 = 0)\\ &&&= \frac18 \left (1 + \frac12 + \cdots + \frac1{2^{n-1}} \right) \\ &&&= \frac18\left ( \frac{1-\frac1{2^n}}{1-\frac12} \right) \\ &&&= \frac18 \left (2 - \frac{1}{2^{n-1}} \right) \\ &&&= \frac14 - \frac{1}{2^{n+2}} \\ \Rightarrow && C_n &= \frac14 - \frac{1}{2^{n+1}} \\ \Rightarrow && A_n &= 1 - B_n - C_n - D_n = \frac14 + \frac1{2^{n+1}} \end{align*}
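Iterating the one-step transition probabilities with exact rational arithmetic confirms the closed forms \(B_n = D_n = \frac14\), \(C_n = \frac14 - \frac1{2^{n+1}}\) and \(A_n = \frac14 + \frac1{2^{n+1}}\) (a verification sketch, not part of the exam solution):

```python
from fractions import Fraction

half, quarter = Fraction(1, 2), Fraction(1, 4)
# before any whistle, A holds the parcel with certainty
A, B, C, D = Fraction(1), Fraction(0), Fraction(0), Fraction(0)
for n in range(1, 25):
    A, B, C, D = (half * A + quarter * (B + D),
                  half * B + quarter * (A + C),
                  half * C + quarter * (B + D),
                  half * D + quarter * (A + C))
    assert B == D == quarter
    assert C == quarter - Fraction(1, 2 ** (n + 1))
    assert A == quarter + Fraction(1, 2 ** (n + 1))
```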

2018 Paper 3 Q12
D: 1700.0 B: 1516.0

A random process generates, independently, \(n\) numbers each of which is drawn from a uniform (rectangular) distribution on the interval 0 to 1. The random variable \(Y_k\) is defined to be the \(k\)th smallest number (so there are \(k-1\) smaller numbers).

  1. Show that, for \(0\le y\le1\,\), \[ {\rm P}\big(Y_k\le y) =\sum^{n}_{m=k}\binom{n}{m}y^{m}\left(1-y\right)^{n-m} . \tag{\(*\)} \]
  2. Show that \[ m\binom n m = n \binom {n-1}{m-1} \] and obtain a similar expression for \(\displaystyle (n-m) \, \binom n m\,\). Starting from \((*)\), show that the probability density function of \(Y_k\) is \[ n\binom{ n-1}{k-1} y^{k-1}\left(1-y\right)^{ n-k} \,.\] Deduce an expression for \(\displaystyle \int_0^1 y^{k-1}(1-y)^{n-k} \, \d y \,\).
  3. Find \(\E(Y_k) \) in terms of \(n\) and \(k\).


Solution:

  1. Each of the \(n\) numbers is, independently, less than \(y\) with probability \(y\), and \(Y_k \leq y\) exactly when at least \(k\) of the \(n\) numbers are less than \(y\), so \begin{align*} && \mathbb{P}(Y_k \leq y) &= \sum_{m=k}^n\mathbb{P}(\text{exactly }m \text{ values less than }y) \\ &&&= \sum_{m=k}^n \binom{n}{m} y^{m}(1-y)^{n-m} \end{align*}
  2. Both sides count the number of ways to choose, from \(n\) people, a committee of \(m\) people together with a chair from among those \(m\). This can be done in two ways. First: choose the committee in \(\binom{n}{m}\) ways and then the chair in \(m\) ways, giving \(m \binom{n}{m}\). Alternatively, choose the chair in \(n\) ways and then the remaining \(m-1\) committee members in \(\binom{n-1}{m-1}\) ways. Therefore \(m \binom{n}{m} = n \binom{n-1}{m-1}\). Similarly, \begin{align*} (n-m) \binom{n}{m} &= (n-m) \binom{n}{n-m} \\ &= n \binom{n-1}{n-m-1} \\ &= n \binom{n-1}{m} \end{align*} \begin{align*} f_{Y_k}(y) &= \frac{\d }{\d y} \l \sum^{n}_{m=k}\binom{n}{m}y^{m}\left(1-y\right)^{n-m} \r \\ &= \sum^{n}_{m=k} \l \binom{n}{m}my^{m-1}\left(1-y\right)^{n-m} -\binom{n}{m}(n-m)y^{m}\left(1-y\right)^{n-m-1} \r \\ &= \sum^{n}_{m=k} \l n \binom{n-1}{m-1}y^{m-1}\left(1-y\right)^{n-m} -n \binom{n-1}{m} y^{m}\left(1-y\right)^{n-m-1} \r \\ &= n\sum^{n}_{m=k} \binom{n-1}{m-1}y^{m-1}\left(1-y\right)^{n-m} -n\sum^{n+1}_{m=k+1} \binom{n-1}{m-1} y^{m-1}\left(1-y\right)^{n-m} \\ &= n \binom{n-1}{k-1} y^{k-1}(1-y)^{n-k} \end{align*} (After re-indexing, the two sums cancel term by term: the \(m = n+1\) term of the second vanishes since \(\binom{n-1}{n} = 0\), leaving only the \(m = k\) term of the first.) \begin{align*} &&1 &= \int_0^1 f_{Y_k}(y) \d y \\ &&&= \int_0^1 n \binom{n-1}{k-1} y^{k-1}(1-y)^{n-k} \d y \\ &&&= n \binom{n-1}{k-1} \int_0^1 y^{k-1}(1-y)^{n-k} \d y \\ \Rightarrow && \frac{1}{n \binom{n-1}{k-1}} &= \int_0^1 y^{k-1}(1-y)^{n-k} \d y \\ \end{align*}
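The two counting identities, and the integral just deduced, can be spot-checked exactly. The check compares against the standard Beta-integral value \(\int_0^1 y^{k-1}(1-y)^{n-k}\,\d y = \frac{(k-1)!\,(n-k)!}{n!}\), a known fact imported from outside this solution:

```python
from fractions import Fraction
from math import comb, factorial

for n in range(1, 12):
    for m in range(1, n + 1):
        # committee-with-chair identities
        assert m * comb(n, m) == n * comb(n - 1, m - 1)
        assert (n - m) * comb(n, m) == n * comb(n - 1, m)
    for k in range(1, n + 1):
        # standard Beta-integral value vs the expression deduced above
        beta = Fraction(factorial(k - 1) * factorial(n - k), factorial(n))
        assert beta == Fraction(1, n * comb(n - 1, k - 1))
```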
  3. \begin{align*} && \mathbb{E}(Y_k) &= \int_0^1 y f_{Y_k}(y) \d y \\ &&&= \int_0^1 n \binom{n-1}{k-1} y^{k}(1-y)^{n-k} \d y \\ &&&= n \binom{n-1}{k-1}\int_0^1 y^{k}(1-y)^{n-k} \d y \\ &&&= n \binom{n-1}{k-1}\int_0^1 y^{k+1-1}(1-y)^{n+1-(k+1)} \d y \\ &&&= n \binom{n-1}{k-1} \frac{1}{(n+1) \binom{n}{k}}\\ &&&= \frac{n}{n+1} \cdot \frac{k}{n} \\ &&&= \frac{k}{n+1} \end{align*}
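A seeded Monte Carlo simulation agrees with \(\mathbb{E}(Y_k) = \frac{k}{n+1}\) (illustrative only; the loose tolerance reflects sampling noise):

```python
import random

random.seed(0)
n, trials = 5, 100_000
totals = [0.0] * n
for _ in range(trials):
    ys = sorted(random.random() for _ in range(n))  # order statistics of n uniforms
    for i, y in enumerate(ys):
        totals[i] += y

for k in range(1, n + 1):
    # sample mean of the k-th smallest of n uniforms vs k/(n+1)
    assert abs(totals[k - 1] / trials - k / (n + 1)) < 0.01
```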