Problems


2025 Paper 2 Q12
D: 1500.0 B: 1500.0

Let \(X\) be a Poisson random variable with mean \(\lambda\) and let \(p_r = P(X = r)\), for \(r = 0, 1, 2, \ldots\). Neither \(\lambda\) nor \(\lambda + \frac{1}{2} + \sqrt{\lambda + \frac{1}{4}}\) is an integer.

  1. Show, by considering the sequence \(d_r \equiv p_r - p_{r-1}\) for \(r = 1, 2, \ldots\), that there is a unique integer \(m\) such that \(P(X = r) \leq P(X = m)\) for all \(r = 0, 1, 2, \ldots\), and that \[\lambda - 1 < m < \lambda.\]
  2. Show that the minimum value of \(d_r\) occurs at \(r = k\), where \(k\) is such that \[k < \lambda + \frac{1}{2} + \sqrt{\lambda + \frac{1}{4}} < k + 1.\]
  3. Show that the condition for the maximum value of \(d_r\) to occur at \(r = 1\) is \[1 < \lambda < 2 + \sqrt{2}.\]
  4. In the case \(\lambda = 3.36\), sketch a graph of \(p_r\) against \(r\) for \(r = 0, 1, 2, \ldots, 6, 7\).


Solution:

  1. Suppose \(d_r = p_r - p_{r-1}\), then \begin{align*} d_r &= p_r - p_{r-1} \\ &= \mathbb{P}(X = r) - \mathbb{P}(X = r-1) \\ &= e^{-\lambda} \left ( \frac{\lambda^r}{r!} - \frac{\lambda^{r-1}}{(r-1)!} \right) \\ &= e^{-\lambda} \frac{\lambda^{r-1}}{(r-1)!} \left ( \frac{\lambda}{r} - 1\right) \end{align*} Therefore \(d_r > 0 \Leftrightarrow \lambda > r\), i.e. \(p_r\) is increasing while \(r < \lambda\) and decreasing once \(r > \lambda\), so it reaches a unique maximum at \(m = \lfloor \lambda \rfloor\). Since \(\lambda\) is not an integer, \(\lambda - 1 < m < \lambda\).
  2. Let \(dd_r = d_r - d_{r-1}\), so: \begin{align*} dd_r &= d_r - d_{r-1} \\ &= p_r - 2p_{r-1} + p_{r-2} \\ &= e^{-\lambda} \frac{\lambda^{r-2}}{r!} \left ( \lambda^2 - 2 \lambda r + r(r-1)\right ) \end{align*} Therefore \(dd_r < 0 \Leftrightarrow \lambda^2 - 2\lambda r +r(r-1) < 0 \Leftrightarrow r^2 -(1+2\lambda)r + \lambda^2 < 0\), and this quadratic has roots \(r = \frac{(1+2\lambda) \pm \sqrt{(1+2\lambda)^2-4\lambda^2}}{2} = \lambda + \frac12 \pm \sqrt{\lambda + \frac14}\). Therefore \(d_r\) is decreasing when \(r \in \left (\lambda + \frac12 -\sqrt{\lambda + \frac14},\lambda + \frac12 + \sqrt{\lambda + \frac14} \right)\) and increasing outside this interval, so the minimum occurs at the end of the decreasing run, \(r = k\), where \(k < \lambda + \frac{1}{2} + \sqrt{\lambda + \frac{1}{4}} < k + 1\). (Any \(r\) below the interval satisfies \(r < \lambda\), since \(\sqrt{\lambda + \frac14} > \frac12\), so \(d_r > 0\) there, whereas \(d_k = e^{-\lambda} \frac{\lambda^{k-1}}{(k-1)!}\left(\frac{\lambda}{k}-1\right) < 0\) because \(k > \lambda\).)
  3. If the maximum value of \(d_r\) occurs at \(r = 1\) then \(d_r\) must be decreasing there; considering \(dd_2 < 0\) we need \(\lambda^2 -4\lambda + 2< 0 \Leftrightarrow 2 - \sqrt{2} < \lambda < 2 + \sqrt{2}\). It must also be the case that \(d_1\) does not get beaten as \(r \to \infty\): since \(d_r \to 0\), we need \(d_1 > 0\), i.e. \(\lambda > 1\). Therefore \(1 < \lambda < 2 + \sqrt{2}\).
  4. TikZ diagram
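The claims in parts 1 and 2 can be checked numerically for \(\lambda = 3.36\) (the value in part 4). This is an illustrative sketch, not part of the required written solution:

```python
import math

# For lambda = 3.36: the mode of p_r should be m = floor(lambda) = 3,
# and d_r = p_r - p_{r-1} should be minimal at the integer k with
# k < lambda + 1/2 + sqrt(lambda + 1/4) < k + 1, i.e. k = 5.
lam = 3.36
p = [math.exp(-lam) * lam**r / math.factorial(r) for r in range(21)]
d = {r: p[r] - p[r - 1] for r in range(1, 21)}

m = max(range(21), key=lambda r: p[r])   # mode: unique maximum of p_r
k = min(d, key=d.get)                    # r at which d_r is minimal
```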

2023 Paper 3 Q11
D: 1500.0 B: 1500.0

Show that \[\sum_{k=1}^{\infty} \frac{k+1}{k!}\, x^k = (x+1)\mathrm{e}^x - 1\,.\] In the remainder of this question, \(n\) is a fixed positive integer.

  1. Random variable \(Y\) has a Poisson distribution with mean \(n\). One observation of \(Y\) is taken. Random variable \(D\) is defined as follows. If the observed value of \(Y\) is zero then \(D = 0\). If the observed value of \(Y\) is \(k\), where \(k \geqslant 1\), then a fair \(k\)-sided die (with sides numbered \(1\) to \(k\)) is rolled once and \(D\) is the number shown on the die.
    1. Write down \(\mathrm{P}(D = 0)\).
    2. Show, from the definition of the expectation of a random variable, that \[\mathrm{E}(D) = \sum_{d=1}^{\infty} \left[ d \sum_{k=d}^{\infty} \left( \frac{1}{k} \cdot \frac{n^k}{k!}\, \mathrm{e}^{-n} \right) \right].\] Show further that \[\mathrm{E}(D) = \sum_{k=1}^{\infty} \left( \frac{1}{k} \cdot \frac{n^k}{k!}\, \mathrm{e}^{-n} \sum_{d=1}^{k} d \right).\]
    3. Show that \(\mathrm{E}(D) = \frac{1}{2}(n + 1 - \mathrm{e}^{-n})\).
  2. Random variables \(X_1, X_2, \ldots, X_n\) all have Poisson distributions. For each \(k \in \{1, 2, \ldots, n\}\), the mean of \(X_k\) is \(k\). A fair \(n\)-sided die, with sides numbered \(1\) to \(n\), is rolled. When \(k\) is the number shown, one observation of \(X_k\) is recorded. Let \(Z\) be the number recorded.
    1. Find \(\mathrm{P}(Z = 0)\).
    2. Show that \(\mathrm{E}(Z) > \mathrm{E}(D)\).
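The closed form in part 1(iii) can be sanity-checked by simulation. The Knuth-style Poisson sampler and the choice \(n = 4\) below are mine, not from the question:

```python
import math
import random

# Monte Carlo check of E(D) = (n + 1 - e^{-n})/2 for the two-stage experiment:
# Y ~ Poisson(n); D = 0 if Y = 0, else D is a fair die roll on 1..Y.
def poisson(lam, rng):
    """Knuth-style Poisson sampler; fine for modest lam."""
    limit = math.exp(-lam)
    count, prod = 0, 1.0
    while True:
        count += 1
        prod *= rng.random()
        if prod <= limit:
            return count - 1

rng = random.Random(0)
n, trials = 4, 200_000
total = 0
for _ in range(trials):
    y = poisson(n, rng)
    total += rng.randint(1, y) if y > 0 else 0

mean_d = total / trials
exact = (n + 1 - math.exp(-n)) / 2   # about 2.4908 for n = 4
```

The inequality of part 2(ii) is also visible here: `exact` is strictly less than \((n+1)/2\), which is \(\mathrm{E}(Z)\) under the second scheme.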

2019 Paper 3 Q11
D: 1500.0 B: 1500.0

The number of customers arriving at a builders' merchants each day follows a Poisson distribution with mean \(\lambda\). Each customer is offered some free sand. The probability of any given customer taking the free sand is \(p\).

  1. Show that the number of customers each day who take sand follows a Poisson distribution with mean \(p\lambda\).
  2. The merchant has a mass \(S\) of sand at the beginning of the day. Each customer who takes the free sand gets a proportion \(k\) of the remaining sand, where \(0 \leq k < 1\). Show that by the end of the day the expected mass of sand taken is $$\left(1 - e^{-kp\lambda}\right)S.$$
  3. At the beginning of the day, the merchant's bag of sand contains a large number of grains, exactly one of which is made from solid gold. At the end of the day, the merchant's assistant takes a proportion \(k\) of the remaining sand. Find the probability that the assistant takes the golden grain. Comment on the case \(k = 0\) and on the limit \(k \to 1\). In the case \(p\lambda > 1\) find the value of \(k\) which maximises the probability that the assistant takes the golden grain.


Solution:

  1. Let \(X\) be the number of people arriving on a given day, and \(Y\) be the number taking sand, then \begin{align*} && \mathbb{P}(Y = k) &= \sum_{x=k}^{\infty} \mathbb{P}(x \text{ arrive and }k\text{ of them take sand}) \\ &&&= \sum_{x=k}^{\infty} \mathbb{P}(X=x)\mathbb{P}(k \text{ out of }x\text{ of them take sand})\\ &&&= \sum_{x=k}^{\infty} e^{-\lambda} \frac{\lambda^x}{x!}\binom{x}{k}p^k(1-p)^{x-k}\\ &&&= e^{-\lambda} \left ( \frac{p}{1-p} \right)^k \sum_{x=k}^{\infty} \frac{((1-p)\lambda)^x}{k!(x-k)!} \\ &&&= e^{-\lambda} \left ( \frac{p}{1-p} \right)^k \frac{((1-p)\lambda)^k}{k!} \sum_{x=0}^{\infty} \frac{((1-p)\lambda)^x}{x!} \\ &&&= e^{-\lambda} \left ( \frac{p}{1-p} \right)^k \frac{((1-p)\lambda)^k}{k!}e^{(1-p)\lambda} \\ &&&= e^{-p\lambda} \frac{(p\lambda)^k}{k!} \end{align*} which is precisely a Poisson with parameter \(p\lambda\). Alternatively, \(Y = B_1 + B_2 + \cdots + B_X\) where \(B_i \sim Bernoulli(p)\), so \(G_Y(t) = G_X(G_B(t)) = G_X(1-p+pt) = e^{-\lambda(1-(1-p+pt))} = e^{-p\lambda(1-t)}\) and hence \(Y \sim Po(p\lambda)\). As a third approach, let \(Z\) be the number of people not taking sand, so \begin{align*} && \mathbb{P}(Y = y, Z= z) &= \mathbb{P}(X=y+z) \cdot \binom{y+z}{y} p^y(1-p)^z \\ &&&= e^{-\lambda} \frac{\lambda^{y+z}}{(y+z)!} \frac{(y+z)!}{y!z!} p^y(1-p)^z \\ &&&=\left ( e^{-p\lambda} \frac{(p\lambda)^y}{y!} \right) \cdot \left ( e^{-(1-p)\lambda} \frac{((1-p)\lambda)^z}{z!}\right) \end{align*} So \(Y\) and \(Z\) are both (independent!) Poisson with parameters \(p\lambda\) and \((1-p)\lambda\).
  2. The amount taken is \(Sk + S(1-k)k + \cdots +Sk(1-k)^{Y-1} = Sk\cdot \frac{1-(1-k)^Y}{k} = S(1-(1-k)^Y)\) so \begin{align*} \E[\text{taken sand}] &= \E \left [ S(1-(1-k)^Y)\right] \\ &= S-S\E\left [(1-k)^Y \right] \\ &= S - SG_Y(1-k)\\ &=S - Se^{-p\lambda(1-(1-k))} \tag{pgf for Poisson} \\ &= S\left (1-e^{-kp\lambda} \right) \end{align*}
  3. The fraction of the sand the assistant takes home is \((1-k)^Yk\), which has expected value \(k\,\E[(1-k)^Y] = kG_Y(1-k) = ke^{-kp\lambda}\). Since the golden grain is equally likely to be any grain, this is also the probability that he takes home the golden grain. When \(k = 0\) the probability is \(0\), which makes sense (no-one takes home any sand, including the merchant's assistant). As \(k \to 1\) we get \(e^{-p\lambda}\), which is the probability that no customer takes any sand, in which case the assistant takes essentially all of it, golden grain included. \begin{align*} && \frac{\d }{\d k} \left ( ke^{-kp\lambda} \right) &= e^{-kp\lambda} - (p\lambda)ke^{-kp\lambda} \\ &&&= e^{-kp\lambda}(1 - (p\lambda)k) \end{align*} Therefore maximised at \(k = \frac{1}{p\lambda}\), which lies in \((0,1)\) precisely because \(p\lambda > 1\). (Clearly this is a maximum, just by sketching the function.)
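Parts 1 and 2 can be supported by a quick simulation: thin the Poisson arrivals customer by customer and average the mass taken. The parameter values below are illustrative choices, not from the question:

```python
import math
import random

# Thin X ~ Po(lam) arrivals with probability p; the mass taken is
# S(1 - (1-k)^Y) for Y takers, with expectation S(1 - e^{-k p lam}).
def poisson(lam, rng):
    limit = math.exp(-lam)
    count, prod = 0, 1.0
    while True:
        count += 1
        prod *= rng.random()
        if prod <= limit:
            return count - 1

rng = random.Random(1)
lam, p, k, S = 4.0, 0.25, 0.3, 1.0
trials = 200_000
takers_total, mass_total = 0, 0.0
for _ in range(trials):
    x = poisson(lam, rng)                          # customers arriving
    y = sum(rng.random() < p for _ in range(x))    # customers taking sand
    takers_total += y
    mass_total += S * (1 - (1 - k) ** y)

mean_takers = takers_total / trials    # should be near p*lam (thinning)
mean_mass = mass_total / trials        # should be near S(1 - e^{-k p lam})
```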

2018 Paper 3 Q13
D: 1700.0 B: 1484.0

The random variable \(X\) takes only non-negative integer values and has probability generating function \(\G(t)\). Show that \[ \P(X = 0 \text{ or } 2 \text{ or } 4 \text { or } 6 \ \ldots ) = \frac{1}{2}\big(\G\left(1\right)+\G\left(-1\right)\big). \] You are now given that \(X\) has a Poisson distribution with mean \(\lambda\). Show that \[ \G(t) = \e^{-\lambda(1-t)} \,. \]

  1. The random variable \(Y\) is defined by \[ \P(Y=r)= \begin{cases} k\P(X=r) & \text{if \(r=0, \ 2, \ 4, \ 6, \ \ldots\) \ }, \\[2mm] 0& \text{otherwise}, \end{cases} \] where \(k\) is an appropriate constant. Show that the probability generating function of \(Y\) is \(\dfrac{\cosh\lambda t}{\cosh\lambda}\,\). Deduce that \(\E(Y) < \lambda\) for \(\lambda > 0\,\).
  2. The random variable \(Z\) is defined by \[\P(Z=r)= \begin{cases} c \P(X=r) & \text{if \(r = 0, \ 4, \ 8, \ 12, \ \ldots \ \)}, \\[2mm] 0& \text{otherwise,} \end{cases} \] where \(c\) is an appropriate constant. Is \(\E(Z) < \lambda\) for all positive values of \(\lambda\,\)?


Solution: \begin{align*} &&G_X(t) &= \mathbb{E}(t^X) \\ &&&= \sum_{k=0}^{\infty} \mathbb{P}(X = k) t^k \\ \Rightarrow && G_X(1) &= \sum_{k=0}^{\infty} \mathbb{P}(X = k) \\ \Rightarrow && G_X(-1) &= \sum_{k=0}^{\infty} (-1)^k\mathbb{P}(X = k) \\ \Rightarrow && \frac12 \l G_X(1) + G_X(-1) \r &= \sum_{k=0}^{\infty} \frac12 (1 + (-1)^k) \mathbb{P}(X = k) \\ &&&= \sum_{k=0}^{\infty} \mathbb{P}(X =2k) \end{align*}

  1. \begin{align*} 1 &= \sum_r \mathbb{P}(Y = r) \\ &= k\sum_{j=0}^\infty \mathbb{P}(X = 2j) \\ &= k \cdot \frac12 \l G_X(1) + G_X(-1) \r \\ &= k \cdot \frac12 \l e^{-\lambda(1-1) } + e^{-\lambda(1+1) }\r \\ &= \frac{k}{2}(1+e^{-2\lambda}) \end{align*} Therefore \(k = \frac{2}{1+e^{-2\lambda}} = \frac{e^{\lambda}}{\cosh \lambda}\). \begin{align*} && G_X(t) + G_X(-t) &= \sum_{j=0}^\infty \mathbb{P}(X = j)t^j(1^j + (-1)^j) \\ &&&= 2\sum_{j=0}^\infty \mathbb{P}(X = 2j)t^{2j} \\ &&&= 2\sum_{j=0}^\infty \frac{1}{k}\mathbb{P}(Y = 2j)t^{2j} \\ &&&= \frac{2}{k}G_Y(t) \\ \Rightarrow && G_Y(t) &= k \cdot \frac{G_X(t) + G_X(-t)}{2} \\ &&&= k\frac{e^{-\lambda(1-t)} + e^{-\lambda(1+t)}}{2} \\ &&&= \frac{e^\lambda}{\cosh \lambda} \frac{e^{-\lambda} (e^{\lambda t}+e^{-\lambda t}) }{2} \\ &&&= \frac{\cosh \lambda t}{\cosh \lambda} \end{align*} Since \(\mathbb{E}(Y) = G_Y'(1)\) and \begin{align*} && G_Y'(t) &= \frac{\lambda \sinh \lambda t}{\cosh \lambda} \\ \Rightarrow && G_Y'(1) &= \lambda \tanh \lambda \\ &&&< \lambda \end{align*} since \(\tanh \lambda < 1\) for \(\lambda > 0\).
  2. \begin{align*} && \frac14 \l G_X(t) + G_X(it) +G_X(-t) + G_X(-it) \r &= \sum_{k=0}^\infty \mathbb{P}(X=k)t^k \cdot \frac{1 + i^k + (-1)^k + (-i)^k}{4} \\ &&&= \sum_{k=0}^\infty \mathbb{P}(X = 4k)t^{4k} \\ &&&= \frac{G_Z(t)}{c} \end{align*} Since \(G_Z(1) = 1\) we must have \(c = \frac1{\frac14 \l G_X(1) + G_X(i) +G_X(-1) + G_X(-i) \r}\) \begin{align*} && c &= \frac{4e^{\lambda}}{e^{\lambda} + e^{-\lambda} + e^{i\lambda} + e^{-i\lambda}} \\ &&&= \frac{2e^{\lambda}}{\cosh \lambda + \cos \lambda} \\ && G_Z(t) &= c \cdot \frac14 \l e^{-\lambda(1-t)}+e^{-\lambda(1-it)}+e^{-\lambda(1+t)}+e^{-\lambda(1+it)} \r \\ &&&= \frac{ce^{-\lambda}}{4} \l 2\cosh \lambda t + 2 \cos \lambda t\r \\ &&&= \frac{\cosh \lambda t + \cos \lambda t}{\cosh \lambda + \cos \lambda} \end{align*} We are interested in \(G_Z'(1)\), so: \begin{align*} && G_Z'(t) &= \frac{\lambda (\sinh \lambda t - \sin \lambda t)}{\cosh \lambda + \cos \lambda } \end{align*} Considering various values of \(\lambda\), it makes sense to look at \(\lambda = \pi\) (since \(\cos \lambda = -1\) and the denominator will be small). From this we can see: \begin{align*} G'_Z(1) &= \frac{\pi (\sinh \pi-0)}{\cosh \pi-1} \\ &= \frac{\pi}{\tanh \frac{\pi}{2}} > \pi \end{align*} using the identity \(\frac{\sinh \pi}{\cosh \pi - 1} = \coth \frac{\pi}{2}\). So \(\mathbb{E}(Z) > \lambda\) when \(\lambda = \pi\) (and, by continuity, for nearby values), so it is not true that \(\mathbb{E}(Z) < \lambda\) for all positive \(\lambda\).
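Both generating-function results can be cross-checked against sums over the (truncated) Poisson pmf. The \(\lambda\) values below are illustrative:

```python
import math

# E(Y) should equal lam*tanh(lam) (< lam), and at lam = pi,
# E(Z) = pi*sinh(pi)/(cosh(pi) - 1) = pi/tanh(pi/2) > pi.
def conditioned_mean(lam, step):
    """Mean of U ~ Po(lam) conditioned on U being a multiple of `step`."""
    total = weight = 0.0
    term = math.exp(-lam)            # P(U = 0)
    for r in range(400):
        if r % step == 0:
            weight += term
            total += r * term
        term *= lam / (r + 1)        # P(U = r+1) from P(U = r)
    return total / weight

lam = 1.3
ey = conditioned_mean(lam, 2)            # E(Y), even values only
ez_pi = conditioned_mean(math.pi, 4)     # E(Z) at lam = pi, multiples of 4
```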

2017 Paper 2 Q12
D: 1600.0 B: 1563.6

Adam and Eve are catching fish. The number of fish, \(X\), that Adam catches in any time interval is Poisson distributed with parameter \(\lambda t\), where \(\lambda\) is a constant and \(t\) is the length of the time interval. The number of fish, \(Y\), that Eve catches in any time interval is Poisson distributed with parameter \(\mu t\), where \(\mu\) is a constant and \(t\) is the length of the time interval. The two Poisson variables are independent. You may assume that the expected time between Adam catching a fish and Adam catching his next fish is \(\lambda^{-1}\), and similarly for Eve.

  1. By considering \(\P( X + Y = r)\), show that the total number of fish caught by Adam and Eve in time \(T\) also has a Poisson distribution.
  2. Given that Adam and Eve catch a total of \(k\) fish in time \(T\), where \(k\) is fixed, show that the number caught by Adam has a binomial distribution.
  3. Given that Adam and Eve start fishing at the same time, find the probability that the first fish is caught by Adam.
  4. Find the expected time from the moment Adam and Eve start fishing until they have each caught at least one fish.
[Note This question has been redrafted to make the meaning clearer.]


Solution:

  1. \(\,\) \begin{align*} && \mathbb{P}(X+Y=r) &= \sum_{k=0}^r \mathbb{P}(X = k, Y = r-k) \\ &&&= \sum_{k=0}^r \mathbb{P}(X = k)\mathbb{P}( Y = r-k) \\ &&&= \sum_{k=0}^r \frac{e^{-\lambda T} (\lambda T)^k}{k!}\frac{e^{-\mu T} (\mu T)^{r-k}}{(r-k)!}\\ &&&= \frac{e^{-(\mu+\lambda)T}}{r!}\sum_{k=0}^r \binom{r}{k}(\lambda T)^k (\mu T)^{r-k}\\ &&&= \frac{e^{-(\mu+\lambda)T}((\mu+\lambda)T)^r}{r!} \end{align*} Therefore \(X+Y \sim Po \left ( (\mu+\lambda)T \right)\)
  2. \(\,\) \begin{align*} && \mathbb{P}(X = r | X+Y = k) &= \frac{\mathbb{P}(X=r, Y = k-r)}{\mathbb{P}(X+Y=k)} \\ &&&= \frac{\frac{e^{-\lambda T} (\lambda T)^r}{r!}\frac{e^{-\mu T} (\mu T)^{k-r}}{(k-r)!}}{\frac{e^{-(\mu+\lambda)T}((\mu+\lambda)T)^k}{k!}} \\ &&&= \binom{k}{r} \left ( \frac{\lambda}{\lambda + \mu} \right)^r \left ( \frac{\mu}{\lambda + \mu} \right)^{k-r} \end{align*} Therefore \(X|X+Y=k \sim B(k, \frac{\lambda}{\lambda + \mu})\)
  3. By part 2, \(\mathbb{P}(X=1\,|\,X+Y = 1) = \frac{\lambda}{\lambda + \mu}\): conditional on exactly one fish having been caught, it was Adam's with this probability, so the first fish is caught by Adam with probability \(\frac{\lambda}{\lambda + \mu}\).
  4. Let \(X_1, Y_1\) be the times until the first fish is caught by Adam and by Eve respectively; these are exponential with rates \(\lambda\) and \(\mu\) (since \(\mathbb{P}(X_1 > t) = \mathbb{P}(\text{no fish in }[0,t]) = e^{-\lambda t}\)), and we want \(\mathbb{E}(\max(X_1, Y_1))\). \begin{align*} && \mathbb{P}(\max(X_1,Y_1) > t) &= 1 - \mathbb{P}(X_1 \leq t)\mathbb{P}(Y_1 \leq t) \\ &&&= 1 - \left (1-e^{-\lambda t}\right)\left (1-e^{-\mu t}\right) \\ &&&= e^{-\lambda t} + e^{-\mu t} - e^{-(\lambda+\mu)t} \\ \Rightarrow && \mathbb{E}(\max(X_1,Y_1)) &= \int_0^\infty \mathbb{P}(\max(X_1,Y_1) > t) \d t \\ &&&= \frac1\lambda + \frac1\mu - \frac1{\lambda+\mu} \end{align*} Therefore the expected time until they have each caught at least one fish is \(\frac1\lambda + \frac1\mu - \frac1{\lambda+\mu}\).
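The conditional-binomial result of part 2 can be verified exactly for sample parameters (\(\lambda = 2\), \(\mu = 3\), \(T = 1\) are my choices):

```python
import math

# X + Y should be Po((lam+mu)T), and X | X+Y = n should be B(n, lam/(lam+mu)).
lam, mu, T, n = 2.0, 3.0, 1.0, 5

def po(rate, r):
    return math.exp(-rate) * rate**r / math.factorial(r)

joint = [po(lam * T, r) * po(mu * T, n - r) for r in range(n + 1)]
total = sum(joint)                    # P(X + Y = n)
cond = [j / total for j in joint]     # P(X = r | X + Y = n)
q = lam / (lam + mu)
binom = [math.comb(n, r) * q**r * (1 - q) ** (n - r) for r in range(n + 1)]
```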

2016 Paper 2 Q13
D: 1600.0 B: 1516.0

  1. The random variable \(X\) has a binomial distribution with parameters \(n\) and \(p\), where \(n=16\) and \(p=\frac12\). Show, using an approximation in terms of the standard normal density function $\displaystyle \tfrac{1}{\sqrt{2\pi}} \, \e ^{-\frac12 x^2} $, that \[ \P(X=8) \approx \frac 1{2\sqrt{2\pi}} \,. \]
  2. By considering a binomial distribution with parameters \(2n\) and \(\frac12\), show that \[ (2n)! \approx \frac {2^{2n} (n!)^2}{\sqrt{n\pi}} \,. \]
  3. By considering a Poisson distribution with parameter \(n\), show that \[ n! \approx \sqrt{2\pi n\, } \, \e^{-n} \, n^n \,. \]


Solution:

  1. \(X \sim B(16, \tfrac12)\), then \(X \approx N(8, 2^2)\), in particular \begin{align*} && \mathbb{P}(X = 8) &\approx \mathbb{P} \left ( 8 - \frac12 \leq 2Z + 8 \leq 8 + \frac12 \right) \\ &&&= \mathbb{P} \left (-\frac14 \leq Z \leq \frac14 \right) \\ &&&= \int_{-\frac14}^{\frac14} \frac{1}{\sqrt{2 \pi}}e^{-\frac12 x^2} \d x \\ &&&\approx \frac{1}{\sqrt{2\pi}} \int_{-\frac14}^{\frac14} 1\d x\\ &&&= \frac{1}{2 \sqrt{2\pi}} \end{align*}
  2. Suppose \(X \sim B(2n, \frac12)\) then \(X \approx N(n, \frac{n}{2})\), and \begin{align*} && \mathbb{P}(X = n) &\approx \mathbb{P} \left ( n - \frac12 \leq \sqrt{\frac{n}{2}} Z + n \leq n + \frac12 \right) \\ &&&= \mathbb{P} \left ( - \frac1{\sqrt{2n}} \leq Z \leq \frac1{\sqrt{2n}}\right) \\ &&&= \int_{-\frac1{\sqrt{2n}}}^{\frac1{\sqrt{2n}}} \frac{1}{\sqrt{2 \pi}} e^{-\frac12 x^2} \d x \\ &&&\approx \frac{1}{\sqrt{n\pi}}\\ \Rightarrow && \binom{2n}{n}\frac1{2^n} \frac{1}{2^n} & \approx \frac{1}{\sqrt{n \pi}} \\ \Rightarrow && (2n)! &\approx \frac{2^{2n}(n!)^2}{\sqrt{n\pi}} \end{align*}
  3. \(X \sim Po(n)\), then \(X \approx N(n, (\sqrt{n})^2)\), therefore \begin{align*} && \mathbb{P}(X = n) &\approx \mathbb{P} \left (-\frac12 \leq \sqrt{n} Z \leq \frac12 \right) \\ &&&= \int_{-\frac{1}{2 \sqrt{n}}}^{\frac{1}{2 \sqrt{n}}} \frac{1}{\sqrt{2\pi}}e^{-\frac12 x^2} \d x \\ &&&\approx \frac{1}{\sqrt{2 \pi n}} \\ \Rightarrow && e^{-n} \frac{n^n}{n!} & \approx \frac{1}{\sqrt{2 \pi n}} \\ \Rightarrow && n! &\approx \sqrt{2 \pi n} e^{-n}n^n \end{align*}
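The accuracy of the two factorial approximations is easy to inspect numerically (the test values of \(n\) are illustrative):

```python
import math

# Part 2: (2n)! vs 2^(2n) (n!)^2 / sqrt(n pi).
# Part 3: n! vs sqrt(2 pi n) e^{-n} n^n (Stirling's approximation).
def rel_err(exact, approx):
    return abs(approx / exact - 1)

n = 10
err_part2 = rel_err(math.factorial(2 * n),
                    4**n * math.factorial(n) ** 2 / math.sqrt(n * math.pi))
err_part3 = rel_err(math.factorial(n),
                    math.sqrt(2 * math.pi * n) * math.exp(-n) * float(n) ** n)
# Stirling's relative error shrinks like 1/(12n): compare n = 100 with n = 10.
err_part3_big = rel_err(float(math.factorial(100)),
                        math.sqrt(200 * math.pi) * math.exp(-100) * 100.0 ** 100)
```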

2016 Paper 3 Q12
D: 1700.0 B: 1516.0

Let \(X\) be a random variable with mean \(\mu\) and standard deviation \(\sigma\). Chebyshev's inequality, which you may use without proof, is \[ \P\left(\vert X-\mu\vert > k\sigma\right) \le \frac 1 {k^2} \,, \] where \(k\) is any positive number.

  1. The probability of a biased coin landing heads up is \(0.2\). It is thrown \(100n\) times, where \(n\) is an integer greater than 1. Let \(\alpha \) be the probability that the coin lands heads up \(N\) times, where \(16n \le N \le 24n\). Use Chebyshev's inequality to show that \[ \alpha \ge 1-\frac 1n \,. \]
  2. Use Chebyshev's inequality to show that \[ 1+ n + \frac{n^2}{ 2!} + \cdots + \frac {n^{2n}}{(2n)!} \ge \left(1-\frac1n\right) \e^n \,. \]


Solution:

  1. Let \(N\) be the number of times the coin lands heads up, ie \(N \sim Binomial(100n, 0.2)\), then \(\mathbb{E}(N) = \mu = 20n, \mathrm{Var}(N) = \sigma^2 = 100n \cdot 0.2 \cdot 0.8 = 16n \Rightarrow \sigma = 4\sqrt{n}\). Taking \(k = \sqrt{n}\) in Chebyshev's inequality: \begin{align*} && \mathbb{P}(|N - \mu| > k\sigma) &\leq \frac{1}{k^2} \\ \Rightarrow && 1 - \mathbb{P}(|N - \mu| \leq k\sigma) &\leq \frac1{k^2} \\ \Rightarrow && 1 - \mathbb{P}(|N - 20n| \leq \sqrt{n} \cdot 4\sqrt{n}) &\leq \frac1{{\sqrt{n}}^2} \\ \Rightarrow && 1 - \mathbb{P}(16n \leq N \leq 24n) &\leq \frac{1}{n} \\ \Rightarrow && 1 - \frac1n &\leq \alpha \end{align*}
  2. Suppose \(X \sim Pois(n)\), then \(\mathbb{E}(X) = n, \mathrm{Var}(X) = n\). Taking \(k = \sqrt{n}\): \begin{align*} && \mathbb{P}(|X - \mu| > k\sigma) &\leq \frac{1}{k^2} \\ \Rightarrow && 1-\mathbb{P}(|X - n| \leq \sqrt{n} \cdot \sqrt{n}) &\leq \frac{1}{\sqrt{n}^2} \\ \Rightarrow && 1 - \sum_{i=0}^{2n} \mathbb{P}(X = i) & \leq \frac{1}{n} \\ \Rightarrow && \sum_{i=0}^{2n} e^{-n} \frac{n^i}{i!} &\geq 1 - \frac{1}{n} \\ \Rightarrow && \sum_{i=0}^{2n} \frac{n^i}{i!} &\geq \left ( 1 - \frac1n \right)e^n \end{align*}
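The bound in part 2 can be confirmed directly for a range of \(n\) (a check, not a proof):

```python
import math

# Verify sum_{i=0}^{2n} n^i/i!  >=  (1 - 1/n) e^n  for n = 2..40.
def partial_exp(n):
    """sum_{i=0}^{2n} n^i / i!, accumulated term by term."""
    term, total = 1.0, 1.0
    for i in range(1, 2 * n + 1):
        term *= n / i
        total += term
    return total

ok = all(partial_exp(n) >= (1 - 1 / n) * math.exp(n) for n in range(2, 41))
```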

2015 Paper 1 Q12
D: 1500.0 B: 1461.6

The number \(X\) of casualties arriving at a hospital each day follows a Poisson distribution with mean 8; that is, \[ \P(X=n) = \frac{ \e^{-8}8^n}{n!}\,, \ \ \ \ n=0, \ 1, \ 2, \ \ldots \ . \] Casualties require surgery with probability \(\frac14\). The number of casualties arriving on any given day is independent of the number arriving on any other day and the casualties require surgery independently of one another.

  1. What is the probability that, on a day when exactly \(n\) casualties arrive, exactly \(r\) of them require surgery?
  2. Prove (algebraically) that the number requiring surgery each day also follows a Poisson distribution, and state its mean.
  3. Given that in a particular randomly chosen week a total of 12 casualties require surgery on Monday and Tuesday, what is the probability that 8 casualties require surgery on Monday? You should give your answer as a fraction in its lowest terms.


Solution:

  1. \(\mathbb{P}(r \text{ need surgery}|n \text{ casualties}) = \binom{n}{r} \left ( \frac14\right)^r \left ( \frac34\right)^{n-r}\)
  2. \(\,\) \begin{align*} && \mathbb{P}(r \text{ need surgery}) &= \sum_{n=r}^{\infty} \mathbb{P}(r \text{ need surgery} |n \text{ casualties}) \mathbb{P}(n \text{ casualties}) \\ &&&= \sum_{n=r}^{\infty} \binom{n}{r}\left ( \frac14\right)^r \left ( \frac34\right)^{n-r} \frac{e^{-8} 8^n}{n!} \\ &&&= \sum_{n=r}^{\infty} \frac{n!}{(n-r)!r!}\left ( \frac14\right)^r \left ( \frac34\right)^{n-r} \frac{e^{-8} 8^n}{n!} \\ &&&= \frac{e^{-8}8^r}{r!}\left ( \frac14\right)^r \sum_{n=r}^{\infty} \frac{8^{n-r}}{(n-r)!} \left ( \frac34\right)^{n-r} \\ &&&= \frac{e^{-8}2^r}{r!} \sum_{n=r}^{\infty} \frac{6^{n-r}}{(n-r)!} \\ &&&= \frac{e^{-8}2^r}{r!} e^6 \\ &&&= \frac{e^{-2}2^r}{r!} \end{align*} Therefore the number requiring surgery is \(Po(2)\) with mean \(2\).
  3. \(\,\) \begin{align*} && \mathbb{P}(X_1 = 8| X_1 + X_2 =12) &= \frac{\mathbb{P}(X_1 = 8,X_2 =4)} {\mathbb{P}(X_1+X_2 = 12)}\\ &&&= \frac{\frac{e^{-2}2^8}{8!} \cdot \frac{e^{-2}2^4}{4!}}{\frac{e^{-4}4^{12}}{12!}} \\ &&&= \frac{12!}{8!4!} \frac{1}{2^{12}} \\ &&&= \binom{12}4 \left ( \frac12 \right)^4\left ( \frac12 \right)^8 \\ &&&= \frac{495}{4096} \end{align*}
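Both the thinning in part 2 and the final fraction can be checked by computation; the truncation bound of 120 terms is my choice:

```python
import math
from fractions import Fraction

# Part 3: the exponential factors cancel, leaving C(12,8) (1/2)^12 exactly.
ans = Fraction(math.comb(12, 8), 2**12)

# Part 2: the thinned distribution should match Po(2) term by term.
def surgery_prob(r, terms=120):
    """sum over n >= r of C(n,r) (1/4)^r (3/4)^{n-r} e^{-8} 8^n / n!, truncated."""
    s = 0.0
    for n in range(r, terms):
        s += (math.comb(n, r) * 0.25**r * 0.75 ** (n - r)
              * math.exp(-8) * 8.0**n / math.factorial(n))
    return s
```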

2013 Paper 2 Q12
D: 1600.0 B: 1484.0

The random variable \(U\) has a Poisson distribution with parameter \(\lambda\). The random variables \(X\) and \(Y\) are defined as follows. \begin{align*} X&= \begin{cases} U & \text{ if \(U\) is 1, 3, 5, 7, \(\ldots\,\)} \\ 0 & \text{ otherwise} \end{cases} \\ Y&= \begin{cases} U & \text{ if \(U\) is 2, 4, 6, 8, \(\ldots\,\) } \\ 0 & \text{ otherwise} \end{cases} \end{align*}

  1. Find \(\E(X)\) and \(\E(Y)\) in terms of \(\lambda\), \(\alpha\) and \(\beta\), where \[ \alpha = 1+\frac{\lambda^2}{2!}+\frac{\lambda^4}{4!} +\cdots\, \text{ \ \ and \ \ } \beta = \frac{\lambda}{1!} + \frac{\lambda^3}{3!} + \frac{\lambda^5}{5!} +\cdots\,. \]
  2. Show that \[ \var(X) = \frac{\lambda\alpha+\lambda^2\beta}{\alpha+\beta} - \frac{\lambda^2\alpha^2}{(\alpha+\beta)^2} \] and obtain the corresponding expression for \(\var(Y)\). Are there any non-zero values of \(\lambda\) for which \( \var(X) + \var(Y) = \var(X+Y)\,\)?


Solution:

  1. \begin{align*} \mathbb{E}(X) &= \sum_{r=1}^\infty r \mathbb{P}(X = r) \\ &= \sum_{j=1}^{\infty} (2j-1)\mathbb{P}(U=2j-1) \\ &= \sum_{j=1}^{\infty}(2j-1) \frac{e^{-\lambda} \lambda^{2j-1}}{(2j-1)!} \\ &= \sum_{j=1}^{\infty} e^{-\lambda} \frac{\lambda^{2j-1}}{(2j-2)!} \\ &= \lambda e^{-\lambda} \sum_{j=1}^{\infty} \frac{\lambda^{2j-2}}{(2j-2)!} \\ &= \lambda e^{-\lambda} \alpha \end{align*} Since \(\mathbb{E}(X+Y) = \mathbb{E}(U) = \lambda\) and \(\alpha + \beta = e^{\lambda}\), we get \(\mathbb{E}(Y) = \lambda - \lambda e^{-\lambda}\alpha = \lambda(e^{-\lambda}(\alpha+\beta) - e^{-\lambda}\alpha) = \lambda e^{-\lambda} \beta\). Equivalently, \(\mathbb{E}(X) = \frac{\lambda \alpha}{\alpha+\beta}\) and \(\mathbb{E}(Y) = \frac{\lambda \beta}{\alpha+\beta}\).
  2. \begin{align*} \textrm{Var}(X) &= \mathbb{E}(X^2) - [\mathbb{E}(X) ]^2 \\ &= \sum_{odd} r^2 \mathbb{P}(U = r) - \left [ \mathbb{E}(X) \right]^2 \\ &= \sum_{odd} (r(r-1)+r)\frac{e^{-\lambda}\lambda^r}{r!} - \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} \\ &= \sum_{odd} \frac{e^{-\lambda}\lambda^r}{(r-2)!}+\sum_{odd} \frac{e^{-\lambda}\lambda^r}{(r-1)!} - \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} \\ &= e^{-\lambda}\lambda^2 \beta + e^{-\lambda}\lambda \alpha - \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} \\ &= \frac{\lambda \alpha + \lambda^2 \beta}{\alpha+\beta}- \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} \end{align*} Similarly, \begin{align*} \textrm{Var}(Y) &= \mathbb{E}(Y^2) - [\mathbb{E}(Y) ]^2 \\ &= \sum_{even} r^2 \mathbb{P}(U = r) - \left [ \mathbb{E}(Y) \right]^2 \\ &= \sum_{even} (r(r-1)+r)\frac{e^{-\lambda}\lambda^r}{r!} - \frac{\lambda^2 \beta^2}{(\alpha+\beta)^2} \\ &= e^{-\lambda}\lambda^2\alpha + e^{-\lambda}\lambda \beta - \frac{\lambda^2 \beta^2}{(\alpha+\beta)^2} \\ &= \frac{\lambda \beta + \lambda^2 \alpha}{\alpha+\beta}- \frac{\lambda^2 \beta^2}{(\alpha+\beta)^2} \end{align*} Since \(\textrm{Var}(X+Y) = \textrm{Var}(U) = \lambda\), we are interested in solving: \begin{align*} \lambda &= \frac{\lambda \alpha + \lambda^2 \beta}{\alpha+\beta}- \frac{\lambda^2 \alpha^2}{(\alpha+\beta)^2} + \frac{\lambda \beta + \lambda^2 \alpha}{\alpha+\beta}- \frac{\lambda^2 \beta^2}{(\alpha+\beta)^2} \\ &= \frac{\lambda(\alpha+\beta) + \lambda^2(\alpha+\beta)}{\alpha+\beta} - \frac{\lambda^2(\alpha^2+\beta^2)}{(\alpha+\beta)^2} \\ &= \lambda + \lambda^2 \frac{(\alpha+\beta)^2 - (\alpha^2+\beta^2)}{(\alpha+\beta)^2} \\ &= \lambda + \lambda^2 \frac{2\alpha\beta}{(\alpha+\beta)^2} \end{align*} which is impossible for \(\lambda \neq 0\), since \(\alpha, \beta > 0\). So there are no non-zero values of \(\lambda\) for which \(\var(X) + \var(Y) = \var(X+Y)\).
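Since \(\alpha = \cosh\lambda\) and \(\beta = \sinh\lambda\), the final identity is easy to test numerically against the pmf (\(\lambda = 1.5\) is an illustrative value):

```python
import math

# Check Var(X) + Var(Y) = lam + 2 lam^2 alpha beta/(alpha+beta)^2 > lam = Var(X+Y)
# against the truncated pmf of U ~ Po(lam).
lam = 1.5
pmf = []
term = math.exp(-lam)
for r in range(200):
    pmf.append(term)
    term *= lam / (r + 1)

def variance(parity):
    """Variance of the variable equal to U when U has this parity, else 0."""
    m1 = sum(r * pmf[r] for r in range(parity, 200, 2))
    m2 = sum(r * r * pmf[r] for r in range(parity, 200, 2))
    return m2 - m1 * m1

var_sum = variance(1) + variance(0)   # Var(X) + Var(Y)
alpha, beta = math.cosh(lam), math.sinh(lam)
formula = lam + 2 * lam**2 * alpha * beta / (alpha + beta) ** 2
```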

2012 Paper 2 Q13
D: 1600.0 B: 1516.0

In this question, you may assume that \(\displaystyle \int_0^\infty \!\!\! \e^{-x^2/2} \d x = \sqrt{\tfrac12 \pi}\,\). The number of supermarkets situated in any given region can be modelled by a Poisson random variable, where the mean is \(k\) times the area of the given region. Find the probability that there are no supermarkets within a circle of radius \(y\). The random variable \(Y\) denotes the distance between a randomly chosen point in the region and the nearest supermarket. Write down \(\P(Y < y)\) and hence show that the probability density function of \(Y\) is \(\displaystyle 2\pi y k \e^{-\pi k y^2}\) for \(y\ge0\). Find \(\E(Y)\) and show that \(\var(Y) = \dfrac{4-\pi}{4\pi k}\).


Solution: A circle of radius \(y\) contains a number of supermarkets \(X\) where \(X \sim Po(k \pi y^2)\). \[ \mathbb{P}(X = 0) = e^{-k\pi y^2} \frac{1}{0!} = e^{-k\pi y^2} \] The distance to the nearest supermarket exceeds \(y\) exactly when this circle is empty, so \(\mathbb{P}(Y < y) = 1-\mathbb{P}(Y \geq y) = 1-e^{-k\pi y^2}\), and differentiating gives \(f_Y(y) = 2k\pi y e^{-k\pi y^2}\). \begin{align*} && \mathbb{E}(Y) &= \int_0^\infty yf_Y(y) \d y \\ &&&= \int_0^\infty 2\pi y^2 k e^{-\pi k y^2} \d y \\ \sigma^2 = \frac{1}{2k\pi}:&&&= \pi k \sqrt{2 \pi}\sigma \int_{-\infty}^\infty \frac{1}{\sqrt{2 \pi} \sigma }y^2 e^{-\frac12 \cdot 2\pi k y^2} \d y \\ &&&=\pi k \sqrt{2 \pi}\sigma \mathbb{E}\left (N(0, \sigma^2)^2 \right) \\ &&&= \pi k \sqrt{2 \pi}\sigma \cdot \sigma^2 \\ &&&= \pi k \sqrt{2 \pi} \frac{1}{(2k\pi)^{3/2}} \\ &&&= \frac{1}{2\sqrt{k}} \end{align*} \begin{align*} && \mathbb{E}(Y^2) &= \int_0^\infty y^2f_Y(y) \d y \\ &&&= \int_0^\infty 2\pi y^3 k e^{-\pi k y^2} \d y \\ &&&= \int_0^{\infty}y^2 \cdot 2y \pi k e^{-\pi k y^2} \d y \\ &&&= \left [-y^2 e^{-\pi k y^2}\right]_0^{\infty}+\int_0^\infty 2ye^{-\pi k y^2} \d y \\ &&&= \left [-\frac{1}{\pi k}e^{-\pi k y^2} \right]_0^{\infty} \\ &&&= \frac{1}{\pi k} \\ \Rightarrow && \textrm{Var}(Y) &= \mathbb{E}(Y^2) - \left [ \mathbb{E}(Y)\right]^2 \\ &&&= \frac{1}{\pi k} - \frac{1}{4k} \\ &&&= \frac{4 - \pi}{4\pi k} \end{align*}
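The moments can be confirmed by direct numerical integration of the density (\(k = 2\) is an illustrative value):

```python
import math

# Midpoint-rule integration of f(y) = 2 pi k y e^{-pi k y^2} on [0, 5];
# the density is negligible beyond y = 5 for this k.
k = 2.0
h = 1e-4
mids = [(i + 0.5) * h for i in range(int(5 / h))]

def f(y):
    return 2 * math.pi * k * y * math.exp(-math.pi * k * y * y)

mass = h * sum(f(y) for y in mids)            # should be 1
mean = h * sum(y * f(y) for y in mids)        # should be 1/(2 sqrt k)
second = h * sum(y * y * f(y) for y in mids)  # should be 1/(pi k)
var = second - mean * mean                    # should be (4 - pi)/(4 pi k)
```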

2010 Paper 1 Q13
D: 1484.0 B: 1516.0

The number of texts that George receives on his mobile phone can be modelled by a Poisson random variable with mean \(\lambda\) texts per hour. Given that the probability George waits between 1 and 2 hours in the morning before he receives his first text is \(p\), show that \[ p\e^{2\lambda}-\e^{\lambda}+1=0. \] Given that \(4p<1\), show that there are two positive values of \(\lambda\) that satisfy this equation. The number of texts that Mildred receives on each of her two mobile phones can be modelled by independent Poisson random variables with different means \(\lambda_{1}\) and \(\lambda_{2}\) texts per hour. Given that, for each phone, the probability that Mildred waits between 1 and 2 hours in the morning before she receives her first text is also \(p\), find an expression for \(\lambda_{1}+\lambda_{2}\) in terms of \(p\). Find the probability, in terms of \(p\), that she waits between 1 and 2 hours in the morning to receive her first text.


Solution: Let \(X_t\) be the number of texts he receives in the first \(t\) hours, so \(X_t \sim Po(\lambda t)\). \begin{align*} &&\mathbb{P}(X_1 = 0 \, \cap \, X_2 > 0) &= e^{-\lambda} \cdot \left ( 1-e^{-\lambda}\right) = p \\ \Rightarrow && e^{2\lambda}p &= e^{\lambda} - 1 \\ \Rightarrow && 0 &= pe^{2\lambda}-e^{\lambda} + 1 \\ \Rightarrow && e^{\lambda} &= \frac{1 \pm \sqrt{1-4p}}{2p} \end{align*} which gives two distinct positive values of \(e^{\lambda}\) when \(4p < 1\). For both to yield positive values of \(\lambda\) we need both roots to be \(>1\), so considering the smaller one: \begin{align*} && \frac{1-\sqrt{1-4p}}{2p} & > 1 \\ \Leftrightarrow && 1-\sqrt{1-4p} &> 2p \\ \Leftrightarrow && 1-2p&> \sqrt{1-4p} \\ \Leftrightarrow && (1-2p)^2&> 1-4p \\ \Leftrightarrow && 1-4p+4p^2&> 1-4p \\ \end{align*} (squaring is valid since \(1-2p > \frac12 > 0\) when \(p < \frac14\)), and the last line is true since \(4p^2 > 0\). For Mildred, \(e^{\lambda_1}\) and \(e^{\lambda_2}\) are the two roots of this quadratic, so by Vieta their product is \(e^{\lambda_1}\cdot e^{\lambda_2} = \frac{1}{p}\), giving \(\lambda_1 + \lambda_2 = -\ln p\). Therefore the probability she waits between 1 and 2 hours in the morning to receive her first text (on either phone) is \(e^{-(\lambda_1 + \lambda_2)} \cdot ( 1- e^{-(\lambda_1+\lambda_2)}) = p \cdot (1-p)\).
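For a concrete \(p < \frac14\) the two roots and the sum-of-roots shortcut check out numerically (\(p = 0.2\) is an illustrative value):

```python
import math

# Both roots of p e^{2L} - e^L + 1 = 0 give positive lambdas whose sum is -ln p,
# and the final answer p(1-p) follows.
p = 0.2
disc = math.sqrt(1 - 4 * p)
lam1 = math.log((1 + disc) / (2 * p))
lam2 = math.log((1 - disc) / (2 * p))
wait_prob = math.exp(-(lam1 + lam2)) * (1 - math.exp(-(lam1 + lam2)))
```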

2007 Paper 1 Q14
D: 1500.0 B: 1484.0

The discrete random variable \(X\) has a Poisson distribution with mean \(\lambda\).

  1. Sketch the graph \(y=\l x+1 \r \e^{-x}\), stating the coordinates of the turning point and the points of intersection with the axes. It is known that \(\P(X \ge 2) = 1-p\), where \(p\) is a given number in the range \(0 < p <1\). Show that this information determines a unique value (which you should not attempt to find) of \(\lambda\).
  2. It is known (instead) that \(\P \l X = 1 \r = q\), where \(q\) is a given number in the range \(0 < q <1\). Show that this information determines a unique value of \(\lambda\) (which you should find) for exactly one value of \(q\) (which you should also find).
  3. It is known (instead) that \(\P \l X = 1 \, \vert \, X \le 2 \r = r\), where \(r\) is a given number in the range \(0 < r < 1\). Show that this information determines a unique value of \(\lambda\) (which you should find) for exactly one value of \(r\) (which you should also find).


Solution: Let \(X \sim Po(\lambda)\), then

  1. \(\,\)
    TikZ diagram
    Suppose \(\mathbb{P}(X \geq 2) = 1-p\); then \(\mathbb{P}(X=0) + \mathbb{P}(X=1) = p\), ie \(e^{-\lambda} +\lambda e^{-\lambda} = p\). If \(f(x) = (1+x)e^{-x}\), we can see from the sketch that it is strictly decreasing for \(x > 0\), falling from \(1\) towards \(0\) and taking each value in \((0,1)\) exactly once. Therefore there is a unique value with \(f(\lambda) = p\), which is our desired \(\lambda\).
  2. Note that \(\mathbb{P}(X = 1) = \lambda e^{-\lambda}\)
    TikZ diagram
    Sketching \(y = xe^{-x}\) and finding its turning point, we can see that there is a unique value \(\lambda = 1\) when \(q = \frac{1}{e}\); otherwise there is either no solution (\(q > \frac1{e}\)) or two solutions (\(0 < q < \frac1{e}\)).
  3. Suppose \(\mathbb{P}(X = 1 | X \leq 2) = r\), ie \begin{align*} && r &= \frac{\lambda e^{-\lambda}}{e^{-\lambda} + \lambda e^{-\lambda} + \frac12 \lambda^2 e^{-\lambda}} \\ &&&= \frac{2\lambda}{2+2\lambda+\lambda^2} \\ \Rightarrow && 0 &= r\lambda^2 + 2(r-1) \lambda + 2r\\ \Rightarrow && \Delta &= 4(r-1)^2 - 4\cdot r \cdot 2 r \\ &&&= 4((r-1)^2-2r^2) \\ &&&= 4(r-1-\sqrt{2}r)(r-1+\sqrt{2}r) \\ &&&= -4((\sqrt{2}-1)r + 1)((1+\sqrt{2})r-1) \end{align*} Therefore the quadratic in \(\lambda\) has a repeated (hence unique) root exactly when \(\Delta = 0\), ie when \(r = \frac{1}{1+\sqrt{2}} = \sqrt{2}-1\), in which case \(\lambda = -\frac{2(r-1)}{2r} = \frac{1-r}{r} = \sqrt{2}\). When \(\Delta > 0\) (ie \(r < \sqrt{2}-1\)) both roots are positive, since their product is \(\frac{2r}{r} = 2\) and their sum is \(\frac{2(1-r)}{r} > 0\), so \(\lambda\) is not uniquely determined; when \(\Delta < 0\) there is no solution.
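The critical case in part 3 is easy to confirm numerically:

```python
import math

# The quadratic r lam^2 + 2(r-1) lam + 2r = 0 has a repeated root exactly
# when r = 1/(1 + sqrt 2) = sqrt(2) - 1, and then lam = sqrt 2.
r = 1 / (1 + math.sqrt(2))
disc = 4 * (r - 1) ** 2 - 8 * r**2       # discriminant of the quadratic
lam = (1 - r) / r                        # the repeated root -2(r-1)/(2r)

def cond_prob(lam):
    """P(X = 1 | X <= 2) for X ~ Po(lam)."""
    return 2 * lam / (2 + 2 * lam + lam * lam)
```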

2006 Paper 1 Q13
D: 1484.0 B: 1468.0

A very generous shop-owner is hiding small diamonds in chocolate bars. Each diamond is hidden independently of any other diamond, and on average there is one diamond per kilogram of chocolate.

  1. I go to the shop and roll a fair six-sided die once. I decide that if I roll a score of \(N\), I will buy \(100N\) grams of chocolate. Show that the probability that I will have no diamonds is \[ \frac{\e^{-0.1}}{ 6} \l \frac{1 - \e^{-0.6} }{ 1 - \e^{-0.1}} \r \] Show also that the expected number of diamonds I find is 0.35.
  2. Instead, I decide to roll a fair six-sided die repeatedly until I score a 6. If I roll my first 6 on my \(T\)th throw, I will buy \(100T\) grams of chocolate. Show that the probability that I will have no diamonds is \[ \frac{\e^{-0.1}}{ 6 - 5\e^{-0.1}} \] Calculate also the expected number of diamonds that I find. (You may find it useful to consider the binomial expansion of \(\l 1 - x \r^{-2}\).)


Solution: Note that the rate is one diamond per kilogram, so we model the number of diamonds \(X\) in a mass of \(M\) kilograms of chocolate as \(Po(M)\). In particular \(\E[X] = M\) and \(\mathbb{P}(X = 0) = e^{-M}\).

  1. \(\,\) \begin{align*} && \mathbb{P}(\text{no diamonds}) &= \sum_{n=1}^6\mathbb{P}(\text{no diamonds and roll }n) \\ &&&= \sum_{n=1}^6 \tfrac16 e^{-\frac{n}{10}} \\ &&&= \frac{e^{-0.1}}6 \left ( \frac{1-e^{-0.6}}{1-e^{-0.1}}\right) \\ && \mathbb{E}[\text{diamonds}] &= \sum_{n=1}^6 \mathbb{E}(\text{diamonds}|N = n)\mathbb{P}(N = n) \\ &&&= \sum_{n=1}^6 0.1n \cdot \frac16 \\ &&&= 0.1 \cdot \frac{7}{2} = 0.35 \end{align*}
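Both results are easy to confirm numerically; the following Python sketch (my addition, not part of the printed solution) checks the six-term sum against the closed form and the expectation against 0.35.

```python
import math

# Sanity check (mine, not from the printed solution): the direct sum over
# die scores matches e^{-0.1}/6 * (1 - e^{-0.6})/(1 - e^{-0.1}), and the
# expected number of diamonds is 0.35.
p_none = sum(math.exp(-n / 10) / 6 for n in range(1, 7))
closed = math.exp(-0.1) / 6 * (1 - math.exp(-0.6)) / (1 - math.exp(-0.1))
expect = sum(0.1 * n / 6 for n in range(1, 7))

print(abs(p_none - closed) < 1e-12)  # True
print(abs(expect - 0.35) < 1e-12)    # True
```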
  2. \(\mathbb{P}(T = n) = \left ( \frac56 \right)^{n-1} \frac16\), so \begin{align*} && \mathbb{P}(\text{no diamonds}) &= \sum_{n=1}^\infty\mathbb{P}(\text{no diamonds and }T=n) \\ &&&= \sum_{n=1}^\infty e^{-0.1n} \left ( \frac56 \right)^{n-1} \frac16 \\ &&&= \frac{e^{-0.1}}{6} \frac1{1- \frac56 e^{-0.1}} \\ &&&= \frac{e^{-0.1}}{6 - 5e^{-0.1}} \\ \\ && \mathbb{E}[\text{diamonds}] &= \sum_{n=1}^\infty \mathbb{E}(\text{diamonds}|T = n)\mathbb{P}(T = n) \\ &&&= \sum_{n=1}^\infty 0.1n \cdot \left ( \frac56 \right)^{n-1} \frac16 \\ &&&= \frac{0.1}{6} \sum_{n=1}^\infty n \cdot \left ( \frac56 \right)^{n-1} \\ &&&= \frac{1}{60} \frac{1}{(1- \tfrac56)^2} \\ &&&= \frac{6}{10} = \frac35 \end{align*}
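These infinite sums can also be checked numerically by truncating at a large index; a verification sketch in Python (my addition, not part of the printed solution):

```python
import math

# Numerical check (mine): truncating the geometric sums at a large N agrees
# with the closed forms e^{-0.1}/(6 - 5 e^{-0.1}) and 3/5.
N = 10_000
p_none = sum(math.exp(-0.1 * n) * (5 / 6)**(n - 1) / 6 for n in range(1, N))
expect = sum(0.1 * n * (5 / 6)**(n - 1) / 6 for n in range(1, N))

print(abs(p_none - math.exp(-0.1) / (6 - 5 * math.exp(-0.1))) < 1e-9)  # True
print(abs(expect - 0.6) < 1e-9)                                        # True
```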

2006 Paper 2 Q12
D: 1600.0 B: 1516.0

A cricket team has only three bowlers, Arthur, Betty and Cuba, each of whom bowls 30 balls in any match. Past performance reveals that, on average, Arthur takes one wicket for every 36 balls bowled, Betty takes one wicket for every 25 balls bowled, and Cuba takes one wicket for every 41 balls bowled.

  1. In one match, the team took exactly one wicket, but the name of the bowler was not recorded. Using a binomial model, find the probability that Arthur was the bowler.
  2. Show that the average number of wickets taken by the team in a match is approximately 3. Give with brief justification a suitable model for the number of wickets taken by the team in a match and show that the probability of the team taking at least five wickets in a given match is approximately \(\frac15\). [You may use the approximation \(e^3 = 20\).]


Solution:

  1. \(\,\) \begin{align*} && \mathbb{P}(\text{Arthur took wicket and exactly one wicket}) &= \binom{30}{1} \frac{1}{36} \left ( \frac{35}{36} \right)^{29} \binom{30}{0} \left ( \frac{24}{25} \right)^{30} \binom{30}{0} \left ( \frac{40}{41} \right)^{30}\\ &&&= \frac{30 \cdot 35^{29} \cdot 24^{30} \cdot 40^{30}}{36^{30} \cdot 25^{30} \cdot {41}^{30}}\\ &&&= \frac{1}{35} N\\ && \mathbb{P}(\text{B took wicket and exactly one wicket}) &= \binom{30}{0}\left ( \frac{35}{36} \right)^{30} \binom{30}{1} \frac{1}{25} \left ( \frac{24}{25} \right)^{29} \binom{30}{0} \left ( \frac{40}{41} \right)^{30}\\ &&&= \frac{1}{24} N \\ && \mathbb{P}(\text{C took wicket and exactly one wicket}) &= \binom{30}{0}\left ( \frac{35}{36} \right)^{30} \binom{30}{0}\left ( \frac{24}{25} \right)^{30} \binom{30}{1} \frac{1}{41} \left ( \frac{40}{41} \right)^{29}\\ &&&= \frac{1}{40} N \\ && \mathbb{P}(\text{Arthur took wicket} | \text{exactly one wicket}) &= \frac{ \mathbb{P}(\text{Arthur took wicket and exactly one wicket}) }{ \mathbb{P}(\text{exactly one wicket}) } \\ &&&= \frac{ \frac{1}{35} N}{\frac1{35} N + \frac{1}{24}N + \frac{1}{40} N} \\ &&&= \frac{3}{10} \end{align*} Here \(N = 30 \left ( \frac{35}{36} \right)^{30} \left ( \frac{24}{25} \right)^{30} \left ( \frac{40}{41} \right)^{30}\) is the common factor. Alternatively, we could look at: \begin{align*} && \mathbb{P}(X_A = 1 | X_A + X_B + X_C =1) &= \frac{\mathbb{P}(X_A = 1, X_B = 0,X_C = 0)}{\mathbb{P}(X_A = 1, X_B = 0,X_C = 0)+\mathbb{P}(X_A = 0, X_B = 1,X_C = 0)+\mathbb{P}(X_A = 0, X_B = 0,X_C = 1)} \\ &&&= \frac{\frac{\mathbb{P}(X_A = 1)}{\mathbb{P}(X_A=0)}}{\frac{\mathbb{P}(X_A = 1)}{\mathbb{P}(X_A=0)}+\frac{\mathbb{P}(X_B = 1)}{\mathbb{P}(X_B=0)}+\frac{\mathbb{P}(X_C = 1)}{\mathbb{P}(X_C=0)}} \end{align*} and we can calculate these relative likelihoods in a similar way to above.
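The conditional probability can be confirmed numerically (a check of my own, not part of the printed solution), computing the three "exactly one wicket" probabilities with explicit binomial terms:

```python
import math

# Check (mine, not in the original): compute the three "exactly one wicket"
# probabilities directly and confirm P(Arthur | one wicket) = 3/10.

def one_wicket(balls, p):
    """P(exactly one success in `balls` independent Bernoulli(p) trials)."""
    return balls * p * (1 - p)**(balls - 1)

def no_wicket(balls, p):
    """P(no successes in `balls` independent Bernoulli(p) trials)."""
    return (1 - p)**balls

pa = one_wicket(30, 1/36) * no_wicket(30, 1/25) * no_wicket(30, 1/41)
pb = no_wicket(30, 1/36) * one_wicket(30, 1/25) * no_wicket(30, 1/41)
pc = no_wicket(30, 1/36) * no_wicket(30, 1/25) * one_wicket(30, 1/41)

print(abs(pa / (pa + pb + pc) - 3 / 10) < 1e-12)  # True
```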
  2. \(\,\) \begin{align*} && \mathbb{E}(\text{number of wickets}) &= \mathbb{E} \left ( \sum_{i=1}^{90} \mathbb{1}_{i\text{th ball is a wicket}} \right) \\ &&&= \sum_{i=1}^{90} \mathbb{E} \left (\mathbb{1}_{i\text{th ball is a wicket}} \right) \\ &&&= 30 \cdot \frac{1}{36} + 30 \cdot \frac{1}{25} + 30 \cdot \frac{1}{41} \\ &&&\approx 1 + 1 + 1 = 3 \end{align*} We might model the number of wickets taken as \(Po(\lambda)\), where \(\lambda\) is the average number of wickets taken. We can think of this roughly as the Poisson approximation to the binomial, where the number of balls is large and the probability of a wicket on each ball is small. Assuming we use \(Po(3)\) we have \begin{align*} && \mathbb{P}(\text{at least 5 wickets}) &= 1-\mathbb{P}(\text{4 or fewer wickets}) \\ &&&= 1- e^{-3} \left (1 + \frac{3}{1} + \frac{3^2}{2} + \frac{3^3}{6} + \frac{3^4}{24} \right) \\ &&&= 1 - \frac{1}{20} \left ( 1 + 3 + \frac{9}{2} + \frac{9}{2} + \frac{27}{8} \right) \\ &&&= 1 - \frac{1}{20} \left (13 + 3\tfrac38 \right) \\ &&&\approx 1 - \frac{16}{20} = \frac15 \end{align*}
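A short numerical check (my addition, not part of the printed solution) confirms that the exact \(Po(3)\) tail and the \(e^3 \approx 20\) shortcut both land close to one fifth:

```python
import math

# Check (mine): with Po(3), the exact tail P(at least 5 wickets) and the
# e^3 ~ 20 shortcut used above both land close to one fifth.
partial = sum(3**k / math.factorial(k) for k in range(5))  # 1+3+9/2+9/2+27/8
tail_exact = 1 - math.exp(-3) * partial
tail_approx = 1 - partial / 20                             # using e^3 ~ 20

print(abs(partial - 16.375) < 1e-12)         # True: the bracketed sum
print(abs(tail_exact - tail_approx) < 0.01)  # True: shortcut is close
print(abs(tail_exact - 0.2) < 0.02)          # True: roughly one fifth
```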

2005 Paper 2 Q13
D: 1600.0 B: 1500.0

The number of printing errors on any page of a large book of \(N\) pages is modelled by a Poisson variate with parameter \(\lambda\) and is statistically independent of the number of printing errors on any other page. The number of pages in a random sample of \(n\) pages (where \(n\) is much smaller than \(N\) and \(n\ge2\)) which contain fewer than two errors is denoted by \(Y\). Show that \(\mathbb{P}(Y=k) = \binom n k p^kq^{n-k}\) where \(p=(1+\lambda)e^{-\lambda}\) and \(q=1-p\,\). Show also that, if \(\lambda\) is sufficiently small,

  1. \(q\approx \frac12 \lambda^2\,\);
  2. the largest value of \(n\) for which \(\mathbb{P}(Y=n)\ge 1-\lambda\) is approximately \(2/\lambda\,\);
  3. \(\mathbb{P}(Y>1 \;\vert\; Y>0) \approx 1-n(\lambda^2/2)^{n-1}\;.\)


Solution: First notice that the probability a page contains fewer than two errors is \(\mathbb{P}(X < 2)\) where \(X \sim Po(\lambda)\), i.e. \(\mathbb{P}(X<2) = e^{-\lambda} + \lambda e^{-\lambda} = (1+\lambda)e^{-\lambda}\). Therefore the number of pages \(Y\) with fewer than two errors out of our sample of \(n\) is \(Bin(n, p)\) with \(p\) as above, i.e. \(\mathbb{P}(Y = k) = \binom{n}{k} p^kq^{n-k}\).

  1. \(\,\) \begin{align*} && q &= 1- p = 1-(1+\lambda)e^{-\lambda} \\ &&&= 1 - (1+ \lambda)(1 - \lambda + \tfrac12 \lambda^2 + O(\lambda^3)) \\ &&&= 1 - 1+ \lambda - \lambda+\lambda^2 - \tfrac12 \lambda^2 + O(\lambda^3) \\ &&&= \tfrac12 \lambda^2 + O(\lambda^3) \end{align*}
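The approximation can be seen numerically (my own check, not part of the printed solution): the ratio \(q / (\lambda^2/2)\) tends to \(1\) as \(\lambda \to 0\).

```python
import math

# Numerical check (mine): q = 1 - (1+lam)e^{-lam} behaves like lam^2/2 for
# small lam, so the ratio q / (lam^2 / 2) tends to 1.
ratios = []
for lam in (0.1, 0.01, 0.001):
    q = 1 - (1 + lam) * math.exp(-lam)
    ratios.append(q / (lam**2 / 2))

print(all(abs(x - 1) < 0.1 for x in ratios))  # True: q ~ lam^2 / 2
```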
  2. \(\,\) \begin{align*} && \mathbb{P}(Y = n) &= p^n \\ &&&= (1+\lambda)^ne^{-\lambda n} \\ &&&= (1 + n \lambda + \frac{n(n-1)}{2} \lambda^2 + \cdots)(1 - \lambda n + \frac{\lambda^2 n^2}{2} + \cdots) \\ &&&= 1 + 0 \cdot \lambda + \left ( \frac{n(n-1)}{2} + \frac{n^2}{2} - n^2 \right) \lambda^2 + O(\lambda^3) \\ &&&= 1 - \frac{n}{2} \lambda^2 + O(\lambda^3) \end{align*} So \(\mathbb{P}(Y = n) \geq 1 - \lambda\) (approximately) if and only if \(\frac{n}{2} \lambda^2 \leq \lambda\), i.e. \(\frac{n}{2} \lambda \leq 1\), i.e. \(n \leq \frac{2}{\lambda}\); the largest such \(n\) is therefore approximately \(\frac{2}{\lambda}\).
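A numerical illustration (my addition, not part of the printed solution): for a small \(\lambda\), the largest \(n\) with \(p^n \geq 1 - \lambda\) sits close to \(2/\lambda\).

```python
import math

# Check (mine): for small lam, the largest n with P(Y = n) = p^n >= 1 - lam
# is close to 2/lam, matching the approximation derived above.
lam = 0.02
p = (1 + lam) * math.exp(-lam)

n = 1
while p**(n + 1) >= 1 - lam:
    n += 1

print(n)                                     # largest n with p^n >= 1 - lam
print(abs(n - 2 / lam) <= 0.05 * (2 / lam))  # True: within 5% of 2/lam = 100
```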
  3. \(\,\) \begin{align*} && \mathbb{P}(Y > 1 | Y > 0) &= \frac{1-(q^n + npq^{n-1})}{1-q^n} \\ &&&= 1 - \frac{npq^{n-1}}{1-q^n} \\ &&&= 1 -n \frac{(1+ \lambda)e^{-\lambda} (\tfrac12 \lambda^2 + O(\lambda^3))^{n-1}}{1-(\tfrac12 \lambda^2 + O(\lambda^3))^n} \\ &&&= 1 - n \left (\frac{\lambda^2}{2} \right)^{n-1} \frac{(1+ \lambda)e^{-\lambda}\left(1+O(\lambda)\right)^{n-1}}{1-O(\lambda^{2n})} \\ &&&= 1 - n \left (\frac{\lambda^2}{2} \right)^{n-1} (1 + O(\lambda)) \\ &&&\approx 1 - n \left (\frac{\lambda^2}{2} \right)^{n-1} \end{align*}
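Finally, a numerical comparison (my own check, not part of the printed solution) of the exact conditional probability with the stated approximation:

```python
import math

# Numerical check (mine): compare the exact P(Y > 1 | Y > 0) with the
# approximation 1 - n * (lam^2 / 2)**(n - 1) for a small lam.
lam, n = 0.05, 3
p = (1 + lam) * math.exp(-lam)
q = 1 - p

exact = (1 - (q**n + n * p * q**(n - 1))) / (1 - q**n)
approx = 1 - n * (lam**2 / 2)**(n - 1)

print(abs(exact - approx) < 1e-5)  # True: the approximation is very close
```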