Problems


2022 Paper 3 Q11

A fair coin is tossed \(N\) times and the random variable \(X\) records the number of heads. The mean deviation, \(\delta\), of \(X\) is defined by \[ \delta = \mathrm{E}\big(|X - \mu|\big) \] where \(\mu\) is the mean of \(X\).

  1. Let \(N = 2n\) where \(n\) is a positive integer.
    1. Show that \(\mathrm{P}(X \leqslant n-1) = \frac{1}{2}\big(1 - \mathrm{P}(X=n)\big)\).
    2. Show that \[ \delta = \sum_{r=0}^{n-1}(n-r)\binom{2n}{r}\frac{1}{2^{2n-1}}\,. \]
    3. Show that for \(r > 0\), \[ r\binom{2n}{r} = 2n\binom{2n-1}{r-1}\,. \] Hence show that \[ \delta = \frac{n}{2^{2n}}\binom{2n}{n}\,. \]
  2. Find a similar expression for \(\delta\) in the case \(N = 2n+1\).


Solution:

  2. When \(N = 2n+1\), \(\mu = n +\frac12\) and so \begin{align*} && \delta &= \E[|X-\mu|] \\ &&&= \sum_{i=0}^n (n + \tfrac12 - i) \frac{1}{2^{2n+1}} \binom{2n+1}{i} + \sum_{i=n+1}^{2n+1} (i-n - \tfrac12) \frac{1}{2^{2n+1}} \binom{2n+1}{i} \\ &&&= 2\sum_{i=0}^n (n + \tfrac12 - i) \frac{1}{2^{2n+1}} \binom{2n+1}{i} \\ &&&= \frac{(2n +1)}{2^{2n+1}}\sum_{i=0}^n \binom{2n+1}i - \frac{2}{2^{2n+1}}\sum_{i=0}^n i \binom{2n+1}{i} \\ &&&= \frac{(2n +1)}{2^{2n+1}}\sum_{i=0}^n \binom{2n+1}i - \frac{2}{2^{2n+1}}\sum_{i=1}^n (2n+1) \binom{2n}{i-1} \\ &&&= \frac{(2n +1)}{2^{2n+1}}2^{2n} - \frac{2(2n+1)}{2^{2n+1}}\sum_{i=0}^{n-1} \binom{2n}{i} \\ &&&= \frac{(2n +1)}{2} - \frac{2(2n+1)}{2^{2n+1}} \frac12\left (2^{2n} - \binom{2n}{n} \right) \\ &&&= \frac{2n+1}{2^{2n+1}}\binom{2n}{n} \end{align*}
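Both closed forms can be sanity-checked numerically. The sketch below (Python with exact rational arithmetic; not part of the original solution) computes \(\delta\) directly from the binomial pmf and compares it with \(\frac{n}{2^{2n}}\binom{2n}{n}\) and \(\frac{2n+1}{2^{2n+1}}\binom{2n}{n}\):

```python
from fractions import Fraction
from math import comb

def mean_deviation(N):
    """Exact E|X - mu| for X ~ Binomial(N, 1/2), straight from the pmf."""
    mu = Fraction(N, 2)
    return sum(abs(Fraction(r) - mu) * comb(N, r)
               for r in range(N + 1)) / Fraction(2**N)

# Closed forms from the solution:
#   N = 2n   : delta = n * C(2n, n) / 2^(2n)
#   N = 2n+1 : delta = (2n+1) * C(2n, n) / 2^(2n+1)
for n in range(1, 9):
    assert mean_deviation(2*n) == Fraction(n * comb(2*n, n), 2**(2*n))
    assert mean_deviation(2*n + 1) == Fraction((2*n + 1) * comb(2*n, n), 2**(2*n + 1))
```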

2014 Paper 1 Q12

A game in a casino is played with a fair coin and an unbiased cubical die whose faces are labelled \(1, 1, 1, 2, 2\) and \(3.\) In each round of the game, the die is rolled once and the coin is tossed once. The outcome of the round is a random variable \(X\). The value, \(x\), of \(X\) is determined as follows. If the result of the toss is heads then \(x= \vert ks -1\vert\), and if the result of the toss is tails then \(x=\vert k-s\vert\), where \(s\) is the number on the die and \(k\) is a given number.

Show that \(\mathbb{E}(X^2) = k +13(k-1)^2 /6\). Given that both \(\mathbb{E}(X^2)\) and \(\mathbb{E}(X)\) are positive integers, and that \(k\) is a single-digit positive integer, determine the value of \(k\), and write down the probability distribution of \(X\).

A gambler pays \(\pounds 1\) to play the game, which consists of two rounds. The gambler is paid:

  • \(\pounds w\), where \(w\) is an integer, if the sum of the outcomes of the two rounds exceeds \(25\);
  • \(\pounds 1\) if the sum of the outcomes equals \(25\);
  • nothing if the sum of the outcomes is less than \(25\).
Find, in terms of \(w\), an expression for the amount the gambler expects to be paid in a game, and deduce the maximum possible value of \(w\), given that the casino's owners choose \(w\) so that the game is in their favour.


Solution: \begin{align*} && \mathbb{E}(X^2) &= \frac12 \left (\frac16 \left ( 3(k -1)^2+2(2k-1)^2+(3k-1)^2 \right) +\frac16 \left ( 3(k -1)^2+2(k-2)^2+(k-3)^2 \right) \right) \\ &&&= \frac12 \left (\frac16 \left (20k^2-20k+6 \right) + \frac16 \left ( 6k^2-20k+20\right) \right) \\ &&&= \frac1{12} \left (26k^2-40k+ 26\right) \\ &&&= \frac{13}{6} (k^2+1) - \frac{10}{3}k \\ &&&= \frac{13}{6}(k-1)^2+k \end{align*} Since \(k\) is a single-digit positive integer and \(\mathbb{E}(X^2)\) is an integer, we need \(6 \mid 13(k-1)^2\), i.e. \(6 \mid k-1\), so \(k = 1\) or \(k = 7\). \begin{align*} \mathbb{E}(X \mid k=1) &= \frac12 \left (\frac16 \left ( 2+2 \right) +\frac16 \left ( 2+2 \right) \right) = \frac23 \not \in \mathbb{Z}\\ \mathbb{E}(X \mid k=7) &= \frac12 \left (\frac16 \left ( 3\cdot6+2\cdot13+20 \right) +\frac16 \left ( 3\cdot6+2\cdot5+4 \right) \right) = 8 \end{align*} Therefore \(k = 7\). The probability distribution is \begin{align*} && \mathbb{P}(X=4) = \frac1{12} \\ && \mathbb{P}(X=5) = \frac1{6} \\ && \mathbb{P}(X=6) = \frac12 \\ && \mathbb{P}(X=13) = \frac1{6} \\ && \mathbb{P}(X=20)= \frac1{12} \\ \end{align*} The only ways to score more than \(25\) are \(20+6\), \(20+13\), \(20+20\) and \(13+13\); the only way to score exactly \(25\) is \(20+5\). \begin{align*} \mathbb{P}(>25) &= \frac1{12} \cdot\left(2\cdot \frac12+2\cdot\frac16+\frac1{12}\right) + \frac{1}{6^2} \\ &= \frac{7}{48} \\ \mathbb{P}(=25) &= \frac{2}{12 \cdot 6} = \frac{1}{36} \\ \\ \mathbb{E}(\text{payout}) &= \frac{7}{48}w + \frac{1}{36} = \frac{21w+4}{144} \end{align*} The casino needs \(\frac{21w+4}{144} < 1 \Rightarrow 21w< 140 \Rightarrow w < \frac{20}{3}\), so the maximum possible value of \(w\) is \(6\).
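The value of \(k\), the distribution, and the two tail probabilities can all be confirmed by exhaustive enumeration. A short Python sketch with exact rational arithmetic (the helper name `distribution` is illustrative, not from the source):

```python
from fractions import Fraction

def distribution(k):
    """Pmf of X for the casino game: die shows s in {1,1,1,2,2,3}, fair coin.
    Heads gives x = |k*s - 1|, tails gives x = |k - s|."""
    die = {1: Fraction(3, 6), 2: Fraction(2, 6), 3: Fraction(1, 6)}
    pmf = {}
    for s, ps in die.items():
        for x in (abs(k*s - 1), abs(k - s)):   # heads value, tails value
            pmf[x] = pmf.get(x, Fraction(0)) + ps / 2
    return pmf

# E(X^2) matches 13(k-1)^2/6 + k for every single-digit k
for k in range(1, 10):
    ex2 = sum(p * x**2 for x, p in distribution(k).items())
    assert ex2 == Fraction(13 * (k - 1)**2, 6) + k

pmf = distribution(7)
assert pmf == {4: Fraction(1, 12), 5: Fraction(1, 6), 6: Fraction(1, 2),
               13: Fraction(1, 6), 20: Fraction(1, 12)}

# Tail probabilities over two independent rounds
p_gt = sum(p1 * p2 for x1, p1 in pmf.items() for x2, p2 in pmf.items() if x1 + x2 > 25)
p_eq = sum(p1 * p2 for x1, p1 in pmf.items() for x2, p2 in pmf.items() if x1 + x2 == 25)
assert (p_gt, p_eq) == (Fraction(7, 48), Fraction(1, 36))
```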

2009 Paper 3 Q12

  1. Albert tosses a fair coin \(k\) times, where \(k\) is a given positive integer. The number of heads he gets is \(X_1\). He then tosses the coin \(X_1\) times, getting \(X_2\) heads. He then tosses the coin \(X_2\) times, getting \(X_3\) heads. The random variables \(X_4\), \(X_5\), \(\ldots\) are defined similarly. Write down \(\E(X_1)\). By considering \(\E(X_2 \; \big\vert \; X_1 = x_1)\), or otherwise, show that \(\E(X_2) = \frac14 k\). Find \(\displaystyle \sum_{i=1}^\infty \E(X_i)\).
  2. Bertha has \(k\) fair coins. She tosses the first coin until she gets a tail. The number of heads she gets before the first tail is \(Y_1\). She then tosses the second coin until she gets a tail and the number of heads she gets with this coin before the first tail is \(Y_2\). The random variables \(Y_3, Y_4, \ldots\;\), \(Y_k\) are defined similarly, and \(Y= \sum\limits_{i=1}^k Y_i\,\). Obtain the probability generating function of \(Y\), and use it to find \(\E(Y)\), \(\var(Y)\) and \(\P(Y=r)\).


Solution:

  1. \(X_1 \sim B(k, \tfrac12)\), so \(\E[X_1] = \frac{k}{2}\). Note that \(X_2 \mid X_1 = x_1 \sim B(x_1, \tfrac12)\), so \(\E[X_2 \mid X_1 = x_1] = \frac{x_1}{2}\), i.e. \(\E[X_2 \mid X_1] = \frac12 X_1\). Therefore, by the tower law, \(\E[X_2] = \E[\E[X_2 \mid X_1]] = \E[\tfrac12 X_1] = \frac14 k\). By the same argument (induction), \(\E[X_n] = \frac1{2^n} k\) for every \(n\), and so \begin{align*} && \sum_{i=1}^\infty \E[X_i] &= \sum_{i=1}^{\infty} \frac1{2^i} k \\ &&&= \frac{\frac12 k}{1-\frac12} = k \end{align*}
  2. Note that \(Y_1 = G - 1\), where \(G \sim \mathrm{Geo}(\tfrac12)\) is the number of tosses up to and including the first tail, so \(Y_1\) has generating function \(\E[t^{Y_1}] = \E[t^{G-1}] = \frac{\frac12 t}{1-(1-\frac12)t}\frac1{t} = \frac{\frac12}{1-\frac12t} = \frac{1}{2-t}\). Since the \(Y_i\) are independent, \begin{align*} && \E \left [ t^Y \right] &= \E \left [ t^{\sum_{i=1}^kY_i} \right] \\ &&&= \prod_{i=1}^k \E[t^{Y_i}] \\ &&&= \frac{1}{(2-t)^k} \end{align*} Writing \(G_Y(t) = (2-t)^{-k}\), we get \(\E[Y] = G_Y'(1) = k(2-1)^{-(k+1)} = k\) and \(\E[Y^2] = (tG_Y'(t))'\big|_{t=1} = k(k+1)(2-1)^{-(k+2)}+k(2-1)^{-(k+1)} = k^2+2k\), so \(\var[Y] = k^2+2k - k^2 = 2k\). Finally, extracting the coefficient of \(t^r\) from \(2^{-k}\left(1-\tfrac{t}{2}\right)^{-k}\) gives \(\mathbb{P}(Y=r) = \binom{k+r-1}{k-1} \frac{1}{2^{r+k}}\)
[Note: this second distribution is a negative binomial distribution]
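Both parts can be checked numerically. The sketch below (Python with exact rational arithmetic; helper names are illustrative) propagates Albert's distribution to confirm \(\E(X_i) = k/2^i\), and convolves Bertha's geometric variables to confirm the negative binomial pmf, mean and variance:

```python
from fractions import Fraction
from math import comb

# --- Albert: X_i | X_{i-1} ~ Bin(X_{i-1}, 1/2), starting from k tosses ---
def albert_mean(k, i):
    """Exact E(X_i), found by propagating the full distribution of X_i."""
    dist = {k: Fraction(1)}
    for _ in range(i):
        new = {}
        for m, p in dist.items():
            for h in range(m + 1):
                new[h] = new.get(h, Fraction(0)) + p * Fraction(comb(m, h), 2**m)
        dist = new
    return sum(p * h for h, p in dist.items())

for i in range(1, 5):
    assert albert_mean(6, i) == Fraction(6, 2**i)   # E(X_i) = k / 2^i

# --- Bertha: Y_i = heads before the first tail, P(Y_i = j) = 1/2^(j+1) ---
def bertha_pmf(k, rmax):
    """Pmf of Y = Y_1 + ... + Y_k by convolution, truncated at rmax."""
    pmf = {0: Fraction(1)}
    for _ in range(k):
        new = {}
        for y, p in pmf.items():
            for j in range(rmax - y + 1):
                new[y + j] = new.get(y + j, Fraction(0)) + p * Fraction(1, 2**(j + 1))
        pmf = new
    return pmf

k, rmax = 4, 60                 # tail mass beyond rmax is negligible
pmfY = bertha_pmf(k, rmax)
for r in range(20):             # P(Y = r) = C(k+r-1, k-1) / 2^(k+r)
    assert pmfY[r] == Fraction(comb(k + r - 1, k - 1), 2**(k + r))
mean = sum(p * y for y, p in pmfY.items())
var = sum(p * (y - mean)**2 for y, p in pmfY.items())
assert abs(mean - k) < 1e-9 and abs(var - 2*k) < 1e-6   # E(Y) = k, Var(Y) = 2k
```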

2001 Paper 3 Q13

In a game for two players, a fair coin is tossed repeatedly. Each player is assigned a sequence of heads and tails and the player whose sequence appears first wins. Four players, \(A\), \(B\), \(C\) and \(D\) take turns to play the game. Each time they play, \(A\) is assigned the sequence TTH (i.e.~Tail then Tail then Head), \(B\) is assigned THH, \(C\) is assigned HHT and \(D\) is assigned~HTT.

  1. \(A\) and \(B\) play the game. Let \(p_{\mathstrut\mbox{\tiny HH}}\), \(p_{\mathstrut\mbox{\tiny HT}}\), \(p_{\mathstrut\mbox{\tiny TH}}\) and \(p_{\mathstrut\mbox{\tiny TT}}\) be the probabilities of \(A\) winning the game given that the first two tosses of the coin show HH, HT, TH and TT, respectively. Explain why \(p_{\mathstrut\mbox{\tiny TT}} = 1\,\), and why $p_{\mathstrut\mbox{\tiny HT}} = {1 \over 2} \, p_{\mathstrut\mbox{\tiny TH}} + {1\over 2} \, p_{\mathstrut\mbox{\tiny TT}}\,$. Show that $p_{\mathstrut\mbox{\tiny HH}} = p_{\mathstrut\mbox{\tiny HT}} = {2 \over 3}$ and that \(p_{\mathstrut\mbox{\tiny TH}} = {1\over 3}\,\). Deduce that the probability that A wins the game is \({2\over 3}\,\).
  2. \(B\) and \(C\) play the game. Find the probability that \(B\) wins.
  3. Show that if \(C\) plays \(D\), then \(C\) is more likely to win than \(D\), but that if \(D\) plays \(A\), then \(D\) is more likely to win than \(A\).
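No solution is recorded for this problem, but the claimed probabilities are easy to check by simulation. A Monte Carlo sketch (illustrative only; the part 2 value \(\frac34\) is what the simulation and the standard Penney's-game analysis suggest):

```python
import random

def winner(seq1, seq2, rng):
    """Toss a fair coin until seq1 or seq2 appears; return 1 or 2."""
    history = ""
    while True:
        history = (history + rng.choice("HT"))[-3:]   # both targets have length 3
        if history == seq1:
            return 1
        if history == seq2:
            return 2

def win_prob(seq1, seq2, trials=100_000, seed=0):
    """Estimated probability that seq1 appears before seq2."""
    rng = random.Random(seed)
    return sum(winner(seq1, seq2, rng) == 1 for _ in range(trials)) / trials

A, B, C, D = "TTH", "THH", "HHT", "HTT"
assert abs(win_prob(A, B) - 2/3) < 0.01   # part 1: A beats B with probability 2/3
assert abs(win_prob(B, C) - 3/4) < 0.01   # part 2: B beats C with probability 3/4
assert win_prob(C, D) > 0.5               # part 3: C is favoured over D ...
assert win_prob(D, A) > 0.5               # ... and D is favoured over A
```

For part 2 there is also a direct argument: \(C\) (HHT) can win only if the first two tosses are HH, since otherwise the first occurrence of HH is preceded by a T, at which point \(B\)'s THH has already appeared; so \(B\) wins with probability \(\frac34\).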

1991 Paper 1 Q15

A fair coin is thrown \(n\) times. On each throw, 1 point is scored for a head and 1 point is lost for a tail. Let \(S_{n}\) be the points total for the series of \(n\) throws, i.e. \(S_{n}=X_{1}+X_{2}+\cdots+X_{n},\) where \[ X_{j}=\begin{cases} 1 & \text{ if the }j \text{ th throw is a head}\\ -1 & \text{ if the }j\text{ th throw is a tail.} \end{cases} \]

  1. If \(n=10\,000,\) find an approximate value for the probability that \(S_{n}>100.\)
  2. Find an approximate value for the least \(n\) for which \(\mathrm{P}(S_{n}>0.01n)<0.01.\)
Suppose that instead no points are scored for the first throw, but that on each successive throw, 2 points are scored if both it and the first throw are heads, two points are deducted if both are tails, and no points are scored or lost if the throws differ. Let \(Y_{k}\) be the score on the \(k\)th throw, where \(2\leqslant k\leqslant n.\) Show that \(Y_{k}=X_{1}+X_{k}.\) Calculate the mean and variance of each \(Y_{k}\) and determine whether it is true that \[ \mathrm{P}(Y_{2}+Y_{3}+\cdots+Y_{n}>0.01(n-1))\rightarrow0\quad\mbox{ as }n\rightarrow\infty. \]


Solution: Notice that \(\mathbb{E}(X_i) = 0, \mathbb{E}(X_i^2) = 1\) and so \(\mathbb{E}(S_n) =0, \textrm{Var}(S_n) = n\).

  1. Then by the central limit theorem (or alternatively the normal approximation to the binomial), \begin{align*} && \mathbb{P}(S_n > 100) &\underbrace{\approx}_{\text{CLT}} \mathbb{P} \left (Z > \frac{100}{\sqrt{10\, 000}} \right) \\ &&&= \mathbb{P}(Z > 1) \\ &&&= 1-\Phi(1) \\ &&&\approx 15.9\% \end{align*}
  2. \begin{align*} &&\mathbb{P}(S_n > 0.01n) &\approx \mathbb{P} \left (Z > \frac{0.01n}{\sqrt{n}} \right) \\ &&&= \mathbb{P}(Z > 0.01 \sqrt{n}) \\ &&&= 1-\Phi(0.01\sqrt{n}) \\ &&&< 0.01 \\ && \Phi^{-1}(0.01) &= -2.3263\ldots \\ \Rightarrow && 0.01 \sqrt{n} &> 2.3263\ldots \\ \Rightarrow && n &> (232.63\ldots)^2 \approx 54\,100 \end{align*} so the least such \(n\) is approximately \(54\,100\).
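These normal approximations can be checked directly (a Python sketch, not part of the original solution; \(S_n > 100\) for \(n = 10\,000\) corresponds to at least \(5051\) heads):

```python
from fractions import Fraction
from math import comb
from statistics import NormalDist

# Part 1: exact P(S_n > 100) for n = 10000, where S_n = 2*heads - n,
# so S_n > 100 means heads >= 5051.
n = 10_000
tail = sum(comb(n, h) for h in range(5051, n + 1))
p1 = float(Fraction(tail, 2**n))
assert 0.15 < p1 < 0.165     # close to the CLT estimate 1 - Phi(1) ≈ 0.159

# Part 2: 1 - Phi(0.01*sqrt(n)) < 0.01 requires 0.01*sqrt(n) > Phi^{-1}(0.99)
z = NormalDist().inv_cdf(0.99)
assert abs(z - 2.3263) < 1e-3
assert 54_100 < (z / 0.01)**2 < 54_200   # so the least n is roughly 54 100
```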
\begin{array}{cc|cc} 1\text{st throw}& k\text{th throw} & X_1 + X_k & Y_k \\ \hline \text{head} & \text{head} & 1 + 1 & 2 \\ \text{head} & \text{tail} & 1 - 1 & 0 \\ \text{tail} & \text{head} & -1 + 1 & 0 \\ \text{tail} & \text{tail} & -1- 1 & -2 \\ \end{array} Across all possible cases \(Y_k = X_1 + X_k\), so these random variables are equal. Since \(X_1\) and \(X_k\) are independent for each fixed \(k \geqslant 2\), \begin{align*} \mathbb{E}(Y_k) &= \mathbb{E}(X_1) + \mathbb{E}(X_k) = 0 + 0 = 0 \\ \textrm{Var}(Y_k) &= \textrm{Var}(X_1)+\textrm{Var}(X_k) = 1 + 1 = 2 \end{align*} However, the \(Y_k\) are not independent of one another: each contains the same term \(X_1\), and \[ \sum_{k=2}^n Y_k = (n-1)X_1 + \sum_{k=2}^n X_k, \] so the variances do not simply add (indeed \(\textrm{Var}\left(\sum_{k=2}^n Y_k\right) = (n-1)^2 + (n-1) = n(n-1)\)). Condition on the first throw. If \(X_1 = 1\) (probability \(\frac12\)), the event \(\sum_{k=2}^n Y_k > 0.01(n-1)\) requires \(\sum_{k=2}^n X_k > -0.99(n-1)\), and by the central limit theorem (the sum has mean \(0\) and standard deviation \(\sqrt{n-1}\)) this probability tends to \(1\). If \(X_1 = -1\), the event requires \(\sum_{k=2}^n X_k > 1.01(n-1)\), which is impossible since that sum is at most \(n-1\). Therefore \[ \mathbb{P} \left (\sum_{k=2}^n Y_k > 0.01(n-1) \right) \to \frac12 \quad \text{as } n \to \infty, \] so the statement is not true.
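A quick Monte Carlo check of the final claim (a Python sketch, not part of the original solution; it uses the decomposition \(\sum_{k=2}^n Y_k = (n-1)X_1 + \sum_{k=2}^n X_k\)) shows the probability settling near \(\frac12\) rather than \(0\):

```python
import random

def prob_Y_sum_exceeds(n, trials=10_000, seed=1):
    """Estimate P(Y_2 + ... + Y_n > 0.01*(n-1)) by simulating the throws,
    using Y_2 + ... + Y_n = (n-1)*X_1 + (X_2 + ... + X_n)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x1 = rng.choice((-1, 1))
        rest = sum(rng.choice((-1, 1)) for _ in range(n - 1))
        if (n - 1) * x1 + rest > 0.01 * (n - 1):
            hits += 1
    return hits / trials

# Conditional on X_1 = 1 the event is essentially certain; conditional on
# X_1 = -1 it is impossible for large n. So the probability approaches 1/2.
assert abs(prob_Y_sum_exceeds(500) - 0.5) < 0.02
```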