2025 Paper 3 Q12

Year: 2025
Paper: 3
Question Number: 12

Course: LFM Stats And Pure
Section: Discrete Probability Distributions

Difficulty: 1500.0 Banger: 1484.0

Problem

  1. Show that, for any functions \(f\) and \(g\), and for any \(m \geq 0\), $$\sum_{r=1}^{m+1} f(r)\sum_{s=r-1}^m g(s) = \sum_{s=0}^m g(s)\sum_{r=1}^{s+1} f(r)$$
  2. The random variables \(X_0, X_1, X_2, \ldots\) are defined as follows:
    • \(X_0\) takes the value \(0\) with probability \(1\);
    • \(X_{n+1}\) takes the values \(0, 1, \ldots, X_n + 1\) with equal probability, for \(n = 0, 1, \ldots\)
    1. Write down \(E(X_1)\). Find \(P(X_2 = 0)\) and \(P(X_2 = 1)\) and show that \(P(X_2 = 2) = \frac{1}{6}\). Hence calculate \(E(X_2)\).
    2. For \(n \geq 1\), show that $$P(X_n = 0) = \sum_{s=0}^{n-1} \frac{P(X_{n-1} = s)}{s+2}$$ and find a similar expression for \(P(X_n = r)\), for \(r = 1, 2, \ldots, n\).
    3. Hence show that \(E(X_n) = \frac{1}{2}(1 + E(X_{n-1}))\). Find an expression for \(E(X_n)\) in terms of \(n\), for \(n = 1, 2, \ldots\)

Solution

  1. \begin{align*} \sum_{r=1}^{m+1} \left (f(r) \sum_{s=r-1}^m g(s) \right) &= \sum_{r=1}^{m+1} \sum_{s=r-1}^m f(r)g(s) \\ &= \sum_{(r,s) \in \{(r,s) : 1 \leq r \leq m+1, 0 \leq s \leq m, s \geq r-1\}} f(r)g(s) \\ &= \sum_{(r,s) \in \{(r,s) : 0 \leq s \leq m, 1 \leq r \leq m+1, r \leq s+1\}} f(r)g(s) \\ &= \sum_{s=0}^m \sum_{r=1}^{s+1} f(r)g(s) \\ &= \sum_{s=0}^m \left ( g(s) \sum_{r=1}^{s+1} f(r) \right) \end{align*}
  2. \(X_1\) takes the values \(0, 1\) with equal probabilities (since \(X_0 = 0\)). Therefore \(\mathbb{E}(X_1) = \frac12\).
    1. \begin{align*} \mathbb{P}(X_2 = 0) &= \mathbb{P}(X_2 = 0 | X_1 = 0) \mathbb{P}(X_1 = 0) + \mathbb{P}(X_2 = 0 | X_1 = 1) \mathbb{P}(X_1 = 1) \\ &= \frac12 \cdot \frac12 + \frac13 \cdot \frac12 \\ &= \frac5{12} \\ \\ \mathbb{P}(X_2 = 1) &= \mathbb{P}(X_2 = 1 | X_1 = 0) \mathbb{P}(X_1 = 0) + \mathbb{P}(X_2 = 1 | X_1 = 1) \mathbb{P}(X_1 = 1) \\ &= \frac12 \cdot \frac12 + \frac13 \cdot \frac12 \\ &= \frac5{12} \\ \\ \mathbb{P}(X_2 = 2) &= 1 - \mathbb{P}(X_2 = 0) - \mathbb{P}(X_2 = 1) \\ &= 1 - \frac{10}{12} = \frac16 \\ \\ \mathbb{E}(X_2) &= 0 \cdot \frac{5}{12} + 1 \cdot \frac{5}{12} + 2 \cdot \frac{1}{6} \\ &= \frac34 \end{align*}
    2. \begin{align*} \mathbb{P}(X_n = 0) &= \sum_{s=0}^{n-1} \mathbb{P}(X_n = 0 | X_{n-1} = s)\mathbb{P}(X_{n-1} = s) \\ &= \sum_{s=0}^{n-1} \frac{1}{s+2}\mathbb{P}(X_{n-1} = s) \end{align*} as required. (Here \(\mathbb{P}(X_n = 0 | X_{n-1} = s) = \frac{1}{s+2}\), since if \(X_{n-1} = s\) then \(X_n\) takes each of the \(s+2\) values \(0, 1, \ldots, s+1\) with equal probability.) Similarly, noting that \(\mathbb{P}(X_n = r | X_{n-1} = s) = 0\) for \(s < r-1\) (since \(X_n \leq X_{n-1} + 1\)), \begin{align*} \mathbb{P}(X_n = r) &= \sum_{s=0}^{n-1} \mathbb{P}(X_n = r | X_{n-1} = s)\mathbb{P}(X_{n-1} = s) \\ &= \sum_{s=r-1}^{n-1} \frac{\mathbb{P}(X_{n-1}=s)}{s+2} \end{align*}
    3. \begin{align*} \mathbb{E}(X_n) &= \sum_{r=1}^{n} r \cdot \mathbb{P}(X_n = r) \\ &= \sum_{r=1}^{n} r \sum_{s=r-1}^{n-1} \frac{\mathbb{P}(X_{n-1}=s)}{s+2} \\ &= \sum_{s=0}^{n-1} \frac{\mathbb{P}(X_{n-1}=s)}{s+2} \sum_{r=1}^{s+1} r && \text{by part (i) with } m = n-1,\ f(r) = r \\ &= \sum_{s=0}^{n-1} \frac{\mathbb{P}(X_{n-1}=s)}{s+2} \cdot \frac{(s+1)(s+2)}{2} \\ &= \frac12 \sum_{s=0}^{n-1} (s+1)\mathbb{P}(X_{n-1}=s) \\ &= \frac12 \sum_{s=0}^{n-1} s\,\mathbb{P}(X_{n-1}=s) + \frac12 \sum_{s=0}^{n-1} \mathbb{P}(X_{n-1}=s) \\ &= \frac12 \left( \mathbb{E}(X_{n-1}) + 1 \right) \end{align*} Conjecture \(\mathbb{E}(X_n) = 1 - 2^{-n}\): this matches the computed values for \(n = 0, 1, 2\), and since \(\frac12\left((1 - 2^{-n}) + 1\right) = 1 - 2^{-(n+1)}\), the formula satisfies the recursive relation. Therefore, by induction, \(\mathbb{E}(X_n) = 1 - 2^{-n}\).
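As a sanity check (not part of the original solution), the summation identity of part (i) can be verified numerically for small \(m\). The test functions `f` and `g` below are arbitrary choices, since the identity holds for any \(f\) and \(g\); exact rational arithmetic avoids floating-point noise.

```python
from fractions import Fraction

# Arbitrary test functions -- the identity should hold for any choice.
f = lambda r: Fraction(1, r)        # e.g. f(r) = 1/r
g = lambda s: Fraction(s * s + 1)   # e.g. g(s) = s^2 + 1

for m in range(6):
    # LHS: sum over r = 1..m+1 of f(r) * sum over s = r-1..m of g(s)
    lhs = sum(f(r) * sum(g(s) for s in range(r - 1, m + 1))
              for r in range(1, m + 2))
    # RHS: sum over s = 0..m of g(s) * sum over r = 1..s+1 of f(r)
    rhs = sum(g(s) * sum(f(r) for r in range(1, s + 2))
              for s in range(m + 1))
    assert lhs == rhs
```

Both sides range over the same triangular set of pairs \((r, s)\), so equality holds term by term, which is exactly the region-swapping argument in part (i).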
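Likewise, a short sketch (again not part of the original solution) can iterate the conditional-probability recursion of part (ii)(b) exactly and confirm the values found in parts (ii)(a) and (ii)(c): given \(X_{n-1} = s\), \(X_n\) is uniform on the \(s+2\) values \(\{0, 1, \ldots, s+1\}\).

```python
from fractions import Fraction

dist = {0: Fraction(1)}                # X_0 = 0 with probability 1
means = []
for n in range(1, 11):
    new = {}
    for s, p in dist.items():
        for v in range(s + 2):         # uniform on {0, ..., s + 1}
            new[v] = new.get(v, Fraction(0)) + p / (s + 2)
    dist = new
    means.append(sum(v * p for v, p in dist.items()))
    if n == 2:                         # values from part (ii)(a)
        assert dist[0] == Fraction(5, 12)
        assert dist[1] == Fraction(5, 12)
        assert dist[2] == Fraction(1, 6)

# E(X_n) = 1 - 2^{-n}, as in part (ii)(c)
assert all(means[n - 1] == 1 - Fraction(1, 2**n) for n in range(1, 11))
```

The dictionary `dist` holds the exact distribution of \(X_n\) at each step, so the final assertion checks the closed form \(\mathbb{E}(X_n) = 1 - 2^{-n}\) for \(n = 1, \ldots, 10\) with no rounding error.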
Examiner's report
— 2025 STEP 3, Question 12
Mean: ~15/20 (inferred), above average. Basis: 'most candidates answered all parts well, many earning full or close to full marks'; highest-scoring question on the paper.

Most candidates answered all parts of this question well, with many candidates earning full or close to full marks. In part (i), a small number of candidates erroneously believed that the sum of f(r) times the sum of g(s) with the given limits could be split into a product of two independent sums. Such attempts earned no credit. In part (ii) (a), a significant number of candidates did not understand the concept of X_{n+1} being uniformly distributed on {0, 1, …, X_n + 1}, usually leading to the incorrect values. In part (ii) (b), a number of candidates gave no justification of the written result, simply writing P(X_n = 0) = (1/2)P(X_{n-1} = 0) + (1/3)P(X_{n-1} = 1) + … + (1/(n+1))P(X_{n-1} = n−1) = Σ P(X_{n-1} = s)/(s+2); such attempts earned no credit. In part (ii) (c), most candidates solved this part either by inductively proving that E(X_n) = 1 − 2^{−n} or by noting that E(X_n) − 1 = (1/2)[E(X_{n-1}) − 1] and applying recursion. A smaller number of candidates applied recursion directly to the formula E(X_n) = (1/2)[E(X_{n-1}) + 1] leading to a correct solution via geometric series.

The majority of candidates focused solely on the pure questions, with questions 1, 2 and 8 the most popular. The statistics questions were more popular than the mechanics questions in this exam series.

Candidates who did well on this paper generally:

  • were careful to explain and justify the steps in their arguments, explaining what they had done rather than expecting the examiner to infer what had been done from disjointed groups of calculations;
  • paid close attention to what was required by the questions;
  • made fewer unnecessary mistakes with calculations;
  • thought carefully about how to present rigorous arguments involving trig functions and their inverse functions, especially in relation to domain considerations;
  • understood that questions set on the STEP papers require sufficient justification to earn full credit;
  • knew the difference between 'positive' and 'non-negative';
  • attempted all parts of a question, picking up marks for later parts even when they had not necessarily attempted or completed previous parts.

Candidates who did less well on this paper generally:

  • did not pay attention to 'Hence' instructions: this means that you must use the previous part;
  • presented explanations that were not precise enough (e.g. in Question 3 describing the transformations but not in the context of the graphs, or in Question 8 not explaining use of trigonometric relationships sufficiently well);
  • made additional assumptions, e.g. that a function was differentiable when this had not been given;
  • tried to present if-and-only-if arguments in a single argument when dealing with each direction separately would have been more appropriate and safer (note that this is not always the case; in general candidates need to consider what is the most appropriate presentation of an if-and-only-if argument);
  • tried to carry out too many steps in one go, resulting in them not justifying the key steps sufficiently;
  • did not take sufficient care with graphs/curve sketching.

Source: Cambridge STEP 2025 Examiner's Report · 2025-p3.pdf
Rating Information

Difficulty Rating: 1500.0

Difficulty Comparisons: 0

Banger Rating: 1484.0

Banger Comparisons: 1

Show LaTeX source
Problem source
\begin{questionparts}
\item Show that, for any functions $f$ and $g$, and for any $m \geq 0$,
$$\sum_{r=1}^{m+1} f(r)\sum_{s=r-1}^m g(s) = \sum_{s=0}^m g(s)\sum_{r=1}^{s+1} f(r)$$
\item The random variables $X_0, X_1, X_2, \ldots$ are defined as follows:
\begin{itemize}
\item $X_0$ takes the value $0$ with probability $1$;
\item $X_{n+1}$ takes the values $0, 1, \ldots, X_n + 1$ with equal probability, for $n = 0, 1, \ldots$
\end{itemize}
\begin{enumerate}
\item Write down $E(X_1)$.
Find $P(X_2 = 0)$ and $P(X_2 = 1)$ and show that $P(X_2 = 2) = \frac{1}{6}$.
Hence calculate $E(X_2)$.
\item For $n \geq 1$, show that 
$$P(X_n = 0) = \sum_{s=0}^{n-1} \frac{P(X_{n-1} = s)}{s+2}$$
and find a similar expression for $P(X_n = r)$, for $r = 1, 2, \ldots, n$.
\item Hence show that $E(X_n) = \frac{1}{2}(1 + E(X_{n-1}))$.
Find an expression for $E(X_n)$ in terms of $n$, for $n = 1, 2, \ldots$
\end{enumerate}
\end{questionparts}
Solution source
\begin{questionparts}
\item \begin{align*}
\sum_{r=1}^{m+1} \left (f(r) \sum_{s=r-1}^m g(s) \right) &= \sum_{r=1}^{m+1} \sum_{s=r-1}^m f(r)g(s) \\
&= \sum_{(r,s) \in \{(r,s) : 1 \leq r \leq m+1, 0 \leq s \leq m, s \geq r-1\}} f(r)g(s)  \\
&= \sum_{(r,s) \in \{(r,s) : 0 \leq s \leq m, 1 \leq r \leq m+1,  r \leq s+1\}} f(r)g(s)  \\
&= \sum_{s=0}^m \sum_{r=1}^{s+1} f(r)g(s) \\
&= \sum_{s=0}^m \left ( g(s) \sum_{r=1}^{s+1} f(r) \right)
\end{align*}

\item $X_1$ takes the values $0, 1$ with equal probabilities (since $X_0 = 0$). Therefore $\mathbb{E}(X_1) = \frac12$.
\begin{enumerate}
\item 
\begin{align*}
\mathbb{P}(X_2 = 0) &= \mathbb{P}(X_2 = 0 | X_1 = 0) \mathbb{P}(X_1 = 0) +  \mathbb{P}(X_2 = 0 | X_1 = 1) \mathbb{P}(X_1 = 1) \\
&= \frac12 \cdot \frac12 + \frac13 \cdot \frac12 \\
&= \frac5{12} \\
\\
\mathbb{P}(X_2 = 1) &= \mathbb{P}(X_2 = 1 | X_1 = 0) \mathbb{P}(X_1 = 0) +  \mathbb{P}(X_2 = 1 | X_1 = 1) \mathbb{P}(X_1 = 1) \\
&= \frac12 \cdot \frac12 + \frac13 \cdot \frac12 \\
&= \frac5{12} \\
\\
\mathbb{P}(X_2 = 2) &= 1 - \mathbb{P}(X_2 = 0) - \mathbb{P}(X_2 = 1) \\
&= 1 - \frac{10}{12} = \frac16 \\
\\
\mathbb{E}(X_2) &= 0 \cdot \frac{5}{12} + 1 \cdot \frac{5}{12} + 2 \cdot \frac{1}{6} \\
&= \frac34
\end{align*}
\item \begin{align*}
\mathbb{P}(X_n = 0) &= \sum_{s=0}^{n-1} \mathbb{P}(X_n = 0 | X_{n-1} = s)\mathbb{P}(X_{n-1} = s) \\
&=  \sum_{s=0}^{n-1} \frac{1}{s+2}\mathbb{P}(X_{n-1} = s)
\end{align*} as required. (Here $\mathbb{P}(X_n = 0 | X_{n-1} = s) = \frac{1}{s+2}$, since if $X_{n-1} = s$ then $X_n$ takes each of the $s+2$ values $0, 1, \ldots, s + 1$ with equal probability.)

Similarly, noting that $\mathbb{P}(X_n = r | X_{n-1} = s) = 0$ for $s < r-1$ (since $X_n \leq X_{n-1} + 1$),

\begin{align*}
\mathbb{P}(X_n = r) &= \sum_{s=0}^{n-1} \mathbb{P}(X_n = r | X_{n-1} = s)\mathbb{P}(X_{n-1} = s) \\
&= \sum_{s=r-1}^{n-1} \frac{\mathbb{P}(X_{n-1}=s)}{s+2}
\end{align*}

\item \begin{align*}
\mathbb{E}(X_n) &= \sum_{r=1}^{n} r \cdot \mathbb{P}(X_n = r) \\
&= \sum_{r=1}^{n} r \cdot  \sum_{s=r-1}^{n-1} \frac{\mathbb{P}(X_{n-1}=s)}{s+2} \\ 
&= \sum_{s=0}^{n-1} \frac{\mathbb{P}(X_{n-1}=s)}{s+2} \sum_{r=1}^{s+1} r && \text{by part (i) with } m = n-1,\ f(r) = r \\
&= \sum_{s=0}^{n-1} \frac{\mathbb{P}(X_{n-1}=s)}{s+2} \cdot \frac{(s+1)(s+2)}{2} \\
&= \frac12 \sum_{s=0}^{n-1} (s+1)\mathbb{P}(X_{n-1}=s) \\
&= \frac12 \sum_{s=0}^{n-1} s\,\mathbb{P}(X_{n-1}=s) + \frac12 \sum_{s=0}^{n-1} \mathbb{P}(X_{n-1}=s) \\
&= \frac12 \left ( \mathbb{E}(X_{n-1}) + 1 \right)
\end{align*}

Conjecture $\mathbb{E}(X_n) = 1-2^{-n}$: this matches the computed values for $n = 0, 1, 2$, and since

$\frac12\left((1 - 2^{-n}) + 1\right) = 1-2^{-(n+1)}$, the formula satisfies the recursive relation. Therefore, by induction, $\mathbb{E}(X_n) = 1- 2^{-n}$.
\end{enumerate}
\end{questionparts}