Problems


1997 Paper 2 Q14
D: 1600.0 B: 1469.6

Traffic enters a tunnel which is 9600 metres long, and in which overtaking is impossible. The number of vehicles which enter in any given time is governed by the Poisson distribution with mean 6 cars per minute. All vehicles travel at a constant speed until forced to slow down on catching up with a slower vehicle ahead. I enter the tunnel travelling at 30 m\(\,\)s\(^{-1}\) and all the other traffic is travelling at 32 m\(\,\)s\(^{-1}\). What is the expected number of vehicles in the queue behind me when I leave the tunnel? Assuming again that I travel at 30 m\(\,\)s\(^{-1}\), but that all the other vehicles are independently equally likely to be travelling at 30 m\(\,\)s\(^{-1}\) or 32 m\(\,\)s\(^{-1}\), find the probability that exactly two vehicles enter the tunnel within 20 seconds of my doing so and catch me up before I leave it. Find also the probability that there are exactly two vehicles queuing behind me when I leave the tunnel. \noindent [Ignore the lengths of the vehicles.]

1996 Paper 1 Q12
D: 1484.0 B: 1485.4

An examiner has to assign a mark between 1 and \(m\) inclusive to each of \(n\) examination scripts (\(n\leqslant m\)). He does this randomly, but never assigns the same mark twice. If \(K\) is the highest mark that he assigns, explain why \[ \mathrm{P}(K=k)=\left.\binom{k-1}{n-1}\right/\binom{m}{n} \] for \(n\leqslant k\leqslant m,\) and deduce that \[ \sum_{k=n}^{m}\binom{k-1}{n-1}=\binom{m}{n}\,. \] Find the expected value of \(K\).


Solution: If the highest mark is \(k\), then there are \(n-1\) remaining marks to assign, and they have to be chosen from the numbers \(1, 2, \ldots, k-1\), i.e. in \(\binom{k-1}{n-1}\) ways. Since the \(n\) marks are chosen from \(1, 2, \ldots, m\) in \(\binom{m}{n}\) equally likely ways, \(\displaystyle \mathbb{P}(K=k) = \left.\binom{k-1}{n-1} \right/ \binom{m}{n}\). Since \(K\) can take any of the values \(n, \ldots, m\), we must have \begin{align*} && 1 &= \sum_{k=n}^m \mathbb{P}(K=k) \\ &&&= \sum_{k=n}^m \left.\binom{k-1}{n-1} \right/ \binom{m}{n} \\ \Rightarrow && \binom{m}{n} &= \sum_{k=n}^m \binom{k-1}{n-1} \\ \\ && \mathbb{E}(K) &= \sum_{k=n}^m k \cdot \mathbb{P}(K=k) \\ &&&= \sum_{k=n}^m k \cdot \left.\binom{k-1}{n-1} \right/ \binom{m}{n} \\ &&&= n\binom{m}{n}^{-1} \sum_{k=n}^m \frac{k}{n} \cdot \binom{k-1}{n-1} \\ &&&= n\binom{m}{n}^{-1} \sum_{k=n}^m \binom{k}{n} \\ &&&= n\binom{m}{n}^{-1} \sum_{k=n+1}^{m+1} \binom{k-1}{n+1-1} \\ &&&= n\binom{m}{n}^{-1} \binom{m+1}{n+1} \\ &&&= n \cdot \frac{m+1}{n+1} \end{align*}
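As a sanity check (not part of the exam answer), the distribution, the hockey-stick identity and the mean can all be verified by brute force for one small, arbitrarily chosen pair \((m, n)\):

```python
from itertools import combinations
from math import comb

# Brute-force check of the distribution of K, the highest of n distinct
# random marks from {1, ..., m}. The values m, n are illustrative choices.
m, n = 10, 3
subsets = list(combinations(range(1, m + 1), n))

# P(K = k) should equal C(k-1, n-1) / C(m, n) for n <= k <= m.
for k in range(n, m + 1):
    count = sum(1 for s in subsets if max(s) == k)
    assert count == comb(k - 1, n - 1)

# The hockey-stick identity, and the mean E(K) = n(m+1)/(n+1).
assert sum(comb(k - 1, n - 1) for k in range(n, m + 1)) == comb(m, n)
mean_K = sum(max(s) for s in subsets) / comb(m, n)
assert abs(mean_K - n * (m + 1) / (n + 1)) < 1e-12
print(mean_K)  # 8.25
```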

1996 Paper 2 Q12
D: 1600.0 B: 1500.0

  1. Let \(X_{1}, X_{2}, \dots, X_{n}\) be independent random variables each of which is uniformly distributed on \([0,1]\). Let \(Y\) be the largest of \(X_{1}, X_{2}, \dots, X_{n}\). By using the fact that \(Y<\lambda\) if and only if \(X_{j}<\lambda\) for \(1\leqslant j\leqslant n\), find the probability density function of \(Y\). Show that the variance of \(Y\) is \[\frac{n}{(n+2)(n+1)^{2}}.\]
  2. The probability that a neon light switched on at time \(0\) will have failed by a time \(t>0\) is \(1-\mathrm{e}^{-t/\lambda}\) where \(\lambda>0\). I switch on \(n\) independent neon lights at time zero. Show that the expected time until the first failure is \(\lambda/n\).


Solution:

  1. \(\,\) \begin{align*} && F_Y(\lambda) &= \mathbb{P}(Y < \lambda) \\ &&&= \prod_i \mathbb{P}(X_i < \lambda) \\ &&&= \lambda^n \\ \Rightarrow && f_Y(\lambda) &= \begin{cases} n \lambda^{n-1} & \text{if } 0 \leq \lambda \leq 1 \\ 0 & \text{otherwise} \end{cases} \\ \\ && \E[Y] &= \int_0^1 \lambda f_Y(\lambda) \d \lambda \\ &&&= \int_0^1 n \lambda^n \d \lambda \\ &&&= \frac{n}{n+1} \\ && \E[Y^2] &= \int_0^1 \lambda^2 f_Y(\lambda) \d \lambda \\ &&&= \int_0^1 n \lambda^{n+1} \d \lambda \\ &&&= \frac{n}{n+2} \\ \Rightarrow && \var[Y] &= \E[Y^2]-(\E[Y])^2 \\ &&&= \frac{n}{n+2} - \frac{n^2}{(n+1)^2} \\ &&&= \frac{(n+1)^2n-n^2(n+2)}{(n+2)(n+1)^2} \\ &&&= \frac{n[(n^2+2n+1)-(n^2+2n)]}{(n+2)(n+1)^2} \\ &&&= \frac{n}{(n+2)(n+1)^2} \end{align*}
  2. Using the same reasoning, if \(Z\) is the time of the first failure, then \begin{align*} && 1-F_Z(t) &= \mathbb{P}(\text{all lights still on after } t) \\ &&&= \prod_i e^{-t/\lambda} \\ &&&= e^{-nt/\lambda} \\ \\ \Rightarrow && F_Z(t) &= 1-e^{-nt/\lambda} \end{align*} Therefore \(Z \sim \text{Exp}(\frac{n}{\lambda})\) and the expected time until the first failure is \(\lambda/n\).
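A quick Monte Carlo sanity check of both parts (the sample size, \(n\) and \(\lambda\) below are illustrative choices, not from the question):

```python
import random

random.seed(1)
n, lam, trials = 5, 2.0, 200_000

# Part 1: variance of the maximum of n independent U[0,1] variables.
maxima = [max(random.random() for _ in range(n)) for _ in range(trials)]
mean = sum(maxima) / trials
var = sum((y - mean) ** 2 for y in maxima) / trials
exact_var = n / ((n + 2) * (n + 1) ** 2)  # = 5/252 for n = 5

# Part 2: mean time of the first failure among n lifetimes with
# P(failed by t) = 1 - exp(-t/lam), i.e. exponential with rate 1/lam.
first = [min(random.expovariate(1 / lam) for _ in range(n)) for _ in range(trials)]
mean_first = sum(first) / trials  # should be close to lam / n = 0.4

assert abs(var - exact_var) < 2e-3
assert abs(mean_first - lam / n) < 5e-3
```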

1996 Paper 3 Q14
D: 1700.0 B: 1484.0

Whenever I go cycling I start with my bike in good working order. However if all is well at time \(t\), the probability that I get a puncture in the small interval \((t,t+\delta t)\) is \(\alpha\,\delta t.\) How many punctures can I expect to get on a journey during which my total cycling time is \(T\)? When I get a puncture I stop immediately to repair it and the probability that, if I am repairing it at time \(t\), the repair will be completed in time \((t,t+\delta t)\) is \(\beta\,\delta t.\) If \(p(t)\) is the probability that I am repairing a puncture at time \(t\), write down an equation relating \(p(t)\) to \(p(t+\delta t)\), and derive from this a differential equation relating \(p'(t)\) and \(p(t).\) Show that \[ p(t)=\frac{\alpha}{\alpha+\beta}(1-\mathrm{e}^{-(\alpha+\beta)t}) \] satisfies this differential equation with the appropriate initial condition. Find an expression, involving \(\alpha,\beta\) and \(T\), for the time expected to be spent mending punctures during a journey of total time \(T\). Hence, or otherwise, show that the fraction of the journey expected to be spent mending punctures is given approximately by \[ \frac{\alpha T}{2}\quad\mbox{ if }(\alpha+\beta)T\text{ is small, } \] and by \[ \frac{\alpha}{\alpha+\beta}\quad\mbox{ if }(\alpha+\beta)T\text{ is large.} \]

1995 Paper 2 Q13
D: 1600.0 B: 1484.0

Fly By Night Airlines run jumbo jets which seat \(N\) passengers. From long experience they know that a very small proportion \(\epsilon\) of their passengers fail to turn up. They decide to sell \(N+k\) tickets for each flight. If \(k\) is very small compared with \(N\) explain why they might expect \[ \mathrm{P}(r\mbox{ passengers fail to turn up})=\frac{\lambda^{r}}{r!}\mathrm{e}^{-\lambda} \] approximately, with \(\lambda=N\epsilon.\) For the rest of the question you may assume that the formula holds exactly. Each ticket sold represents \(\pounds A\) profit, but the airline must pay each passenger that it cannot fly \(\pounds B\) where \(B>A>0.\) Explain why, if \(r\) passengers fail to turn up, its profit, in pounds, is \[ A(N+k)-B\max(0,k-r), \] where \(\max(0,k-r)\) is the larger of \(0\) and \(k-r.\) Write down the expected profit \(u_{k}\) when \(k=0,1,2\) and \(3.\) Find \(v_{k}=u_{k+1}-u_{k}\) for general \(k\) and show that \(v_{k}>v_{k+1}.\) Show also that \[ v_{k}\rightarrow A-B \] as \(k\rightarrow\infty.\) Advise Fly By Night on how to choose \(k\) to maximise its expected profit \(u_{k}.\)

1994 Paper 2 Q13
D: 1600.0 B: 1629.1

The makers of Cruncho (`The Cereal Which Cares') are giving away a series of cards depicting \(n\) great mathematicians. Each packet of Cruncho contains one picture chosen at random. Show that when I have collected \(r\) different cards the expected number of packets I must open to find a new card is \(n/(n-r)\) where \(0\leqslant r\leqslant n-1.\) Show by means of a diagram, or otherwise, that \[ \frac{1}{r+1}\leqslant\int_{r}^{r+1}\frac{1}{x}\,\mathrm{d}x\leqslant\frac{1}{r} \] and deduce that \[ \sum_{r=2}^{n}\frac{1}{r}\leqslant\ln n\leqslant\sum_{r=1}^{n-1}\frac{1}{r} \] for all \(n\geqslant2.\) My children will give me no peace until we have the complete set of cards, but I am the only person in our household prepared to eat Cruncho and my spouse will only buy the stuff if I eat it. If \(n\) is large, roughly how many packets must I expect to consume before we have the set?

1994 Paper 2 Q14
D: 1600.0 B: 1502.2

When Septimus Moneybags throws darts at a dart board they are certain to end on the board (a disc of radius \(a\)) but, it must be admitted, otherwise are uniformly randomly distributed over the board.

  1. Show that the distance \(R\) that his shot lands from the centre of the board is a random variable with variance \(a^{2}/18.\)
  2. At a charity fete he can buy \(m\) throws for \(\pounds(12+m)\), but he must choose \(m\) before he starts to throw. If at least one of his throws lands within \(a/\sqrt{10}\) of the centre he wins back \(\pounds 12\). In order to show what a good sport he is, he is determined to play but, being a careful man, he wishes to choose \(m\) so as to minimise his expected loss. What value of \(m\) should he choose?


Solution:

  1. \(\,\) \begin{align*} && \mathbb{P}(R < d) &= \frac{\pi d^2}{\pi a^2} \\ &&&= \frac{d^2}{a^2} \\ \Rightarrow && f_R(d) &= \frac{2d}{a^2}\\ \\ && \E[R] &= \int_0^a x \cdot f_R(x) \d x \\ &&&= \int_0^a \frac{2x^2}{a^2} \d x \\ &&&= \frac{2a}{3} \\ \\ && \E[R^2] &= \int_0^a x^2 \cdot f_R(x) \d x \\ &&&= \int_0^a \frac{2x^3}{a^2} \d x \\ &&&= \frac{a^2}{2} \\ \Rightarrow && \var[R] &= \frac{a^2}2 - \frac{4a^2}{9} \\ &&&= \frac{a^2}{18} \end{align*}
  2. Let \(p = \mathbb{P}(R < \frac{a}{\sqrt{10}}) = \frac{a^2}{10a^2} = \frac{1}{10}\) be the probability of landing within \(a/\sqrt{10}\) of the centre on each throw, so each throw misses with probability \(\frac{9}{10}\). He pays \(\pounds(12+m)\) and wins back \(\pounds 12\) unless all \(m\) throws miss, so his expected loss is \((12+m) - 12\left(1-\left(\tfrac{9}{10}\right)^m\right) = 12 \left(\frac{9}{10}\right)^m + m\). \begin{array}{c|c} m & \text{expected loss} \\ \hline 0 & 12 \\ 1 & \frac{108}{10} + 1 = 11.8 \\ 2 & \frac{972}{100} + 2 = 11.72 \\ 3 & \frac{8748}{1000} + 3 \approx 11.75 \\ \end{array} The increments \(1 - \frac{12}{10}\left(\frac{9}{10}\right)^m\) increase with \(m\) and are positive for \(m \geq 2\), so the expected loss increases from \(m = 2\) onwards; he should take exactly \(2\) throws.
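The minimisation can be confirmed numerically; a small sketch, assuming each throw independently misses the inner disc with probability \(9/10\):

```python
# With hit probability 1/10 per throw, all m throws miss with probability
# (9/10)^m, so the expected loss for buying m throws is
# (12 + m) - 12*(1 - 0.9**m) = m + 12*0.9**m.
losses = {m: m + 12 * 0.9 ** m for m in range(0, 21)}
best = min(losses, key=losses.get)
assert best == 2
print(best, round(losses[best], 3))  # 2 11.72
```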

1992 Paper 1 Q15
D: 1484.0 B: 1471.3

Trains leave Barchester Station for London at 12 minutes past the hour, taking 60 minutes to complete the journey and at 48 minutes past the hour taking 75 minutes to complete the journey. The arrival times of passengers for London at Barchester Station are uniformly distributed over the day and all passengers take the first available train. Show that their average journey time from arrival at Barchester Station to arrival in London is 84.6 minutes. Suppose that British Rail decide to retime the fast 60 minute train so that it leaves at \(x\) minutes past the hour. What choice of \(x\) will minimise the average journey time?


Solution: The fast train leaves at 12 past and the slow train at 48 past. If you arrive between 48 past and 12 past (a 24-minute window), you catch the fast train: your expected wait is 12 minutes, so your expected journey time is \(60 + 12 = 72\) minutes. If you arrive between 12 past and 48 past (a 36-minute window), you catch the slow train: your expected wait is 18 minutes, so your expected journey time is \(75 + 18 = 93\) minutes. The first case has probability \(\frac{24}{60} = \frac{4}{10}\) and the second \(\frac{6}{10}\), giving an average journey time of \[ \frac{4}{10} \cdot 72 + \frac{6}{10} \cdot 93 = \frac{846}{10} = 84.6 \text{ minutes.} \] Now suppose the fast train is retimed so that passengers wait for it during a fraction \(t\) of each hour (i.e. it departs \(60t\) minutes after the slow train) and for the slow train during the remaining fraction \(1-t\). The average journey time is \begin{align*} t \left( 30t + 60 \right) + (1-t) \left( 30(1-t) + 75 \right) &= 60t^2 -75t + 105 \\ &= 60 \left( t^2 - \frac{5}{4}t \right) + 105 \\ &= 60 \left( t - \frac{5}{8} \right)^2 - \frac{375}{16} + 105. \end{align*} Therefore we should choose \(x\) such that \(t = \frac58\), i.e. the fast train should leave \(60 \cdot \frac58 = 37.5\) minutes after the slow train, at \(x = 25.5\) minutes past the hour.
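A numerical cross-check of both the 84.6-minute average and the optimal departure time, averaging the journey time over a grid of arrival times (the grid resolution is chosen so the midpoint rule is exact on each linear piece):

```python
# Journey time for a passenger arriving t minutes past the hour: the fast
# train leaves at x (60-minute journey), the slow train at 48 past
# (75-minute journey), and passengers board whichever train departs first.
def avg_journey(x, steps=120):
    # Midpoint rule; with x a multiple of 0.5 and 0.5-minute cells, the
    # integrand is linear on every cell, so the average is exact.
    total = 0.0
    for i in range(steps):
        t = 60 * (i + 0.5) / steps
        wait_fast = (x - t) % 60
        wait_slow = (48 - t) % 60
        total += wait_fast + 60 if wait_fast < wait_slow else wait_slow + 75
    return total / steps

assert abs(avg_journey(12) - 84.6) < 1e-9
best = min((avg_journey(k / 2), k / 2) for k in range(120))
print(best)  # (81.5625, 25.5): minimised at x = 25.5
```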

1990 Paper 1 Q14
D: 1500.0 B: 1500.7

A bag contains 5 white balls, 3 red balls and 2 black balls. In the game of Blackball, a player draws a ball at random from the bag, looks at it and replaces it. If he has drawn a white ball, he scores one point, while for a red ball he scores two points, these scores being added to his total score before he drew the ball. If he has drawn a black ball, the game is over and his final score is zero. After drawing a red or white ball, he can either decide to stop, when his final score for the game is the total so far, or he may elect to draw another ball. The starting score is zero. Juggins' strategy is to continue drawing until either he draws a black ball (when of course he must stop, with final score zero), or until he has drawn three (non-black) balls, when he elects to stop. Find the probability that in any game he achieves a final score of zero by employing this strategy. Find also his expected final score. Muggins has so far scored \(N\) points, and is deciding whether to draw another ball. Find the expected score if another ball is drawn, and suggest a strategy to achieve the greatest possible average final score in each game.


Solution: The probability Juggins has a non-zero score is the probability he never draws a black ball in his three goes. This is \((1-\frac15)^3 = \frac{64}{125}\). Conditional on never drawing a black ball, each draw is white with probability \(\frac{5}{8}\) and red with probability \(\frac38\), so his expected score per draw is \(\frac58 \cdot 1 + \frac38 \cdot 2 = \frac{11}{8}\) and his expected score over three draws is \(\frac{33}8\). Hence his expected final score is \(\frac{33}{8} \cdot \frac{64}{125} = \frac{264}{125} = 2.112\). The expected score after drawing another ball is \(( N + 1)\frac{5}{10} + (N+2) \frac{3}{10} + 0 \cdot \frac{2}{10} = \frac{8}{10}N + \frac{11}{10}\). A sensible strategy would be to draw only if \(\frac{8}{10}N + \frac{11}{10} > N \Rightarrow N < \frac{11}{2}\), i.e. keep drawing until \(N \geq 6\) or we draw a black ball. [The expected score for this strategy is: \begin{array}{cccc} \text{score} & \text{route} & \text{count} & \text{prob} \\ \hline 6 & \text{6 1s} & 1 & \left ( \frac12 \right)^6 \\ 6 & \text{4 1s, 1 2} & 5 & 5 \cdot \left ( \frac12 \right)^4 \cdot \frac{3}{10} \\ 6 & \text{2 1s, 2 2s} & 6 & 6 \cdot \left ( \frac12 \right)^2 \cdot \left ( \frac{3}{10} \right)^2 \\ 6 & \text{3 2s} & 1 & 1 \cdot \left ( \frac{3}{10} \right)^3 \\ 7 & \text{5 1s, 1 2} & 1 &\left ( \frac12 \right)^5 \cdot \frac{3}{10} \\ 7 & \text{3 1s, 2 2s} & 4 & 4\cdot \left ( \frac12 \right)^3 \cdot \left ( \frac{3}{10} \right)^2 \\ 7 & \text{1 1, 3 2s} & 3 & 3\cdot \left ( \frac12 \right) \cdot \left ( \frac{3}{10} \right)^3 \\ \end{array} For an expected value of \(\frac{2171}{8000} \cdot 6 + \frac{759}{8000} \cdot 7 = \frac{18\,339}{8000} = 2.29 \quad (3\text{ s.f.})\)]
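Both expected values can be confirmed exactly with a short enumeration (a sketch, not part of the original solution):

```python
from fractions import Fraction
from itertools import product

# Probabilities 1/2, 3/10, 1/5 for white, red, black, straight from the bag.
W, R, B = Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)

# Juggins: enumerate all sequences of three draws; any black wipes the score.
ev_juggins = Fraction(0)
for seq in product([(1, W), (2, R), (0, B)], repeat=3):
    p = seq[0][1] * seq[1][1] * seq[2][1]
    if all(v != 0 for v, _ in seq):
        ev_juggins += p * sum(v for v, _ in seq)
assert ev_juggins == Fraction(264, 125)  # = 2.112

# The "draw until N >= 6" strategy: EV(N) = N once N >= 6, else recurse.
def ev(n):
    return Fraction(n) if n >= 6 else W * ev(n + 1) + R * ev(n + 2)

assert ev(0) == Fraction(18339, 8000)  # = 2.292375
```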

1990 Paper 3 Q15
D: 1700.0 B: 1482.6

An unbiased twelve-sided die has its faces marked \(A,A,A,B,B,B,B,B,B,B,B,B.\) In a series of throws of the die the first \(M\) throws show \(A,\) the next \(N\) throws show \(B\) and the \((M+N+1)\)th throw shows \(A\). Write down the probability that \(M=m\) and \(N=n\), where \(m\geqslant0\) and \(n\geqslant1.\) Find

  1. the marginal distributions of \(M\) and \(N\),
  2. the mean values of \(M\) and \(N\).
Investigate whether \(M\) and \(N\) are independent. Find the probability that \(N\) is greater than a given integer \(k\), where \(k\geqslant1,\) and find \(\mathrm{P}(N > M).\) Find also \(\mathrm{P}(N=M)\) and show that \(\mathrm{P}(N < M)=\frac{1}{52}.\)


Solution: \begin{align*} \mathbb{P}(M = m, N = n) &= \left ( \frac{3}{12} \right)^m \left ( \frac{9}{12} \right)^n \frac{3}{12} \\ &= \frac{3^n}{4^{m+n+1}} \end{align*}

  1. \begin{align*} \mathbb{P}(M = m) &= \sum_{n = 1}^{\infty} \mathbb{P}(M=m,N=n) \\ &= \sum_{n = 1}^{\infty} \frac{3^n}{4^{m+n+1}} \\ &= \frac{1}{4^{m+1}} \sum_{n = 1}^{\infty} \left ( \frac34\right)^n \\ &= \frac{1}{4^{m+1}} \frac{3/4}{1/4} \\ &= \frac{3}{4^{m+1}} \\ \\ \mathbb{P}(N = n) &= \sum_{m = 0}^{\infty} \mathbb{P}(M=m,N=n) \\ &= \sum_{m = 0}^{\infty} \frac{3^n}{4^{m+n+1}} \\ &= \frac{3^n}{4^{n+1}} \sum_{m = 0}^{\infty} \left ( \frac14\right)^m \\ &= \frac{3^n}{4^{n+1}} \frac{1}{3/4} \\ &= \frac{3^{n-1}}{4^{n}} \\ \end{align*}
  2. \(M+1 \sim \text{Geo}(\frac34) \Rightarrow \mathbb{E}(M) = \frac43 -1 = \frac13\) and \(N \sim \text{Geo}(\frac14) \Rightarrow \mathbb{E}(N) = 4\)
\(M,N\) are independent since \(\mathbb{P}(M = m, N =n ) = \mathbb{P}(M=m)\mathbb{P}(N=n)\) \begin{align*} \mathbb{P}(N > k) &= \sum_{n=k+1}^{\infty} \mathbb{P}(N = n) \\ &= \sum_{n=k+1}^{\infty} \frac{3^{n-1}}{4^{n}} \\ &= \frac{3^k}{4^{k+1}} \sum_{n = 0}^{\infty} \left ( \frac34\right)^n \\ &= \frac{3^k}{4^{k+1}} \frac{1}{1/4} \\ &= \frac{3^k}{4^k} \end{align*} \begin{align*} \mathbb{P}(N > M) &= \sum_{m=0}^{\infty} \mathbb{P}(N > m) \mathbb{P}(M = m) \\ &= \sum_{m=0}^{\infty} \left (\frac34 \right)^m \frac{3}{4^{m+1}}\\ &=\sum_{m=0}^{\infty} \frac{3^{m+1}}{4^{2m+1}}\\ &= \frac{3}{4} \frac{1}{13/16} \\ &= \frac{12}{13} \\ \\ \mathbb{P}(N=M) &= \sum_{m=1}^{\infty} \mathbb{P}(N=m, M=m) \\ &= \sum_{m=1}^{\infty} \frac{3^m}{4^{2m+1}} \\ &= \frac{3}{64} \sum_{m=0}^{\infty} \left ( \frac{3}{16} \right)^m \\ &= \frac{3}{64} \frac{1}{13/16} \\ &= \frac{3}{52}\\ \\ \mathbb{P}(N < M) &= 1 - \frac34 - \frac3{52} \\ &= 1 - \frac{48}{52} - \frac{3}{52} \\ &= 1 - \frac{51}{52} \\ &= \frac{1}{52} \end{align*}
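The three tail probabilities can be checked numerically by truncating the double sum over the joint distribution (the cutoff only loses a geometrically small tail):

```python
# P(M = m, N = n) = 3**n / 4**(m + n + 1), for m >= 0 and n >= 1.
def p(m, n):
    return 3 ** n / 4 ** (m + n + 1)

K = 200  # truncation point; the neglected tail is of order (3/4)**K
p_gt = sum(p(m, n) for m in range(K) for n in range(1, K) if n > m)
p_eq = sum(p(m, m) for m in range(1, K))
p_lt = sum(p(m, n) for m in range(K) for n in range(1, K) if n < m)

assert abs(p_gt - 12 / 13) < 1e-12
assert abs(p_eq - 3 / 52) < 1e-12
assert abs(p_lt - 1 / 52) < 1e-12
```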

1989 Paper 1 Q14
D: 1516.0 B: 1453.5

The prevailing winds blow in a constant southerly direction from an enchanted castle. Each year, according to an ancient tradition, a princess releases 96 magic seeds from the castle, which are carried south by the wind before falling to rest. South of the castle lies one league of grassy parkland, then one league of lake, then one league of farmland, and finally the sea. If a seed falls on land it will immediately grow into a fever tree. (Fever trees do not grow in water). Seeds are blown independently of each other. The random variable \(L\) is the distance in leagues south of the castle at which a seed falls to rest (either on land or water). It is known that the probability density function \(\mathrm{f}\) of \(L\) is given by \[ \mathrm{f}(x)=\begin{cases} \frac{1}{2}-\frac{1}{8}x & \mbox{ for }0\leqslant x\leqslant4,\\ 0 & \mbox{ otherwise.} \end{cases} \] What is the mean number of fever trees which begin to grow each year?

  1. The random variable \(Y\) is defined as the distance in leagues south of the castle at which a new fever tree grows from a seed carried by the wind. Sketch the probability density function of \(Y\), and find the mean of \(Y\).
  2. One year messengers bring the king the news that 23 new fever trees have grown in the farmland. The wind never varies, and so the king suspects that the ancient tradition has not been followed properly. Is he justified in his suspicions?


Solution: \begin{align*} \mathbb{P}(\text{fever tree grows}) &= \mathbb{P}(0 \leq L \leq 1) + \mathbb{P}(2 \leq L \leq 3) \\ &= \int_0^1 \frac12 -\frac18 x \d x + \int_2^3 \frac12 - \frac18 x \d x \\ &= \left [\frac12 x - \frac1{16}x^2 \right]_0^1+ \left [\frac12 x - \frac1{16}x^2 \right]_2^3 \\ &= \frac12 - \frac1{16}+\frac32-\frac9{16} - 1 + \frac{4}{16} \\ &= \frac58 \end{align*} The expected number of fever trees is just \(96 \cdot \frac58 = 60\).

  1. \(f_Y(t)\) is the density of \(L\) conditioned on the seed landing on land, i.e. \(\frac12 - \frac18 t\) restricted to \([0,1]\cup[2,3]\) and rescaled by \(\mathbb{P}(\text{fever tree grows}) = \frac58\): $f_Y(t) = \begin{cases} \frac45 - \frac15t & \text{if } t \in [0,1]\cup[2,3] \\ 0 & \text{otherwise} \end{cases}$
    TikZ diagram
    \begin{align*} \mathbb{E}(Y) &= \int_0^1 t \left( \frac45 - \frac15 t \right) \d t + \int_2^3 t \left( \frac45 - \frac15 t \right) \d t \\ &= \left[ \frac{2t^2}{5} - \frac{t^3}{15} \right]_0^1 + \left[ \frac{2t^2}{5} - \frac{t^3}{15} \right]_2^3 \\ &= \frac13 + \frac{11}{15} \\ &= \frac{16}{15} \end{align*}
  2. Given the seeds are blown independently and the wind hasn't changed, it is reasonable to model the number of fever trees in the farmland as \(B(96, \frac{3}{16})\), since each seed lands there with probability \(\int_2^3 \left( \frac12 - \frac18 x \right) \d x = \frac{3}{16}\). This has mean \(18\) and variance \(96 \cdot \frac{3}{16} \cdot \frac{13}{16} = 14.625\), so it may be approximated by \(N(18, 14.625)\). Then \(\mathbb{P}(X \geq 23) \approx \mathbb{P}\left(Z \geq \frac{22.5-18}{\sqrt{14.625}}\right) = \mathbb{P}(Z \geq 1.18) \approx 0.12\), which is not significant at the \(5\%\) level, so the king is not justified in his suspicions.
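The \(\frac58\) land probability and the expected count of 60 trees can be sanity-checked by Monte Carlo, sampling \(L\) by inverting its CDF (a sketch, not part of the exam answer):

```python
import random

# The CDF of L is F(x) = x/2 - x**2/16 on [0, 4]; solving F(x) = u for x
# gives the inverse-transform sample x = 4 - 4*sqrt(1 - u).
random.seed(2)
trials = 200_000
grew = 0
for _ in range(trials):
    u = random.random()
    x = 4 - 4 * (1 - u) ** 0.5
    if 0 <= x <= 1 or 2 <= x <= 3:  # parkland or farmland: a tree grows
        grew += 1
frac = grew / trials
assert abs(frac - 5 / 8) < 0.005  # P(tree grows) = 5/8
print(96 * frac)  # close to the expected 60 trees per year
```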

1989 Paper 1 Q16
D: 1516.0 B: 1470.2

A and B play a guessing game. Each simultaneously names one of the numbers \(1,2,3.\) If the numbers differ by 2, whoever guessed the smaller pays the opponent £\(2\). If the numbers differ by 1, whoever guessed the larger pays the opponent £\(1.\) Otherwise no money changes hands. Many rounds of the game are played.

  1. If A says he will always guess the same number \(N\), explain (for each value of \(N\)) how B can maximise his winnings.
  2. In an attempt to improve his play, A announces that he will guess each number at random with probability \(\frac{1}{3},\) guesses on different rounds being independent. To counter this, B secretly decides to guess \(j\) with probability \(b_{j}\) (\(j=1,2,3,\, b_{1}+b_{2}+b_{3}=1\)), guesses on different rounds being independent. Derive an expression for B's expected winnings on any round. How should the probabilities \(b_{j}\) be chosen so as to maximize this expression?
  3. A now announces that he will guess \(j\) with probability \(a_{j}\) (\(j=1,2,3,\, a_{1}+a_{2}+a_{3}=1\)). If B guesses \(j\) with probability \(b_{j}\) (\(j=1,2,3,\, b_{1}+b_{2}+b_{3}=1\)), obtain an expression for his expected winnings in the form \[ Xa_{1}+Ya_{2}+Za_{3}. \] Show that he can choose \(b_{1},b_{2}\) and \(b_{3}\) such that \(X,Y\) and \(Z\) are all non-negative. Deduce that, whatever values for \(a_{j}\) are chosen by A, B can ensure that in the long run he loses no money.


Solution:

  1. If A always guesses \(1\), B should always guess \(3\): the guesses differ by \(2\), so A, who guessed the smaller, pays B £\(2\) every round. If A always guesses \(2\), B should always guess \(1\): the guesses differ by \(1\), so A, who guessed the larger, pays B £\(1\) every round. If A always guesses \(3\), B should always guess \(2\) and, since A again guessed the larger, B wins £\(1\) every round.
  2. The table shows (B's winnings, probability) for each pair of guesses: \begin{array}{cccc} & b_1 & b_2 & b_3 \\ \frac13 & (0, \frac{b_1}{3}) & (-1, \frac{b_2}{3}) & (2, \frac{b_3}{3}) \\ \frac13 & (1, \frac{b_1}{3}) & (0, \frac{b_2}{3}) & (-1, \frac{b_3}{3}) \\ \frac13 & (-2, \frac{b_1}{3}) & (1, \frac{b_2}{3}) & (0, \frac{b_3}{3}) \\ \end{array} Therefore B's expected winnings are \(\frac{b_3}{3} - \frac{b_1}{3}\), and to maximise this he should always guess \(3\) (i.e. \(b_1 = 0, b_2 = 0, b_3 = 1\)).
  3. \begin{array}{cccc} & b_1 & b_2 & b_3 \\ a_1 & (0, a_1b_1) & (-1, a_1b_2) & (2, a_1b_3) \\ a_2 & (1, a_2b_1) & (0, a_2b_2) & (-1, a_2b_3) \\ a_3 & (-2, a_3b_1) & (1, a_3b_2) & (0, a_3b_3) \\ \end{array} Therefore B's expected winnings are \((2b_3-b_2)a_1 + (b_1-b_3)a_2 + (b_2-2b_1)a_3\). For \(X, Y, Z \geq 0\) we need \(2b_3 \geq b_2\), \(b_1 \geq b_3\) and \(b_2 \geq 2b_1\), so \(2b_1 \leq b_2 \leq 2b_3 \leq 2b_1\), forcing \(b_1 = b_3 = \frac12 b_2\), i.e. \(b_1 = b_3 = \frac14, b_2 = \frac12\), and then \(X = Y = Z = 0\). Therefore by choosing these values B can guarantee that his expected winnings are \(0\) whatever A does, and so in the long run he loses no money.
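The claimed strategy can be verified directly from the rules of the game; with \(b = (\frac14, \frac12, \frac14)\), B's expected winnings are exactly zero against every pure (hence every mixed) strategy of A:

```python
from fractions import Fraction

# B's winnings, encoded straight from the rules: if the guesses differ by 2
# the smaller guesser pays 2; if they differ by 1 the larger guesser pays 1.
def winnings(a, b):
    if abs(a - b) == 2:
        return 2 if b > a else -2
    if abs(a - b) == 1:
        return 1 if b < a else -1
    return 0

b = {1: Fraction(1, 4), 2: Fraction(1, 2), 3: Fraction(1, 4)}
for a in (1, 2, 3):
    assert sum(winnings(a, j) * b[j] for j in (1, 2, 3)) == 0
```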

1989 Paper 2 Q15
D: 1600.0 B: 1484.0

Two points are chosen independently at random on the perimeter (including the diameter) of a semicircle of unit radius. What is the probability that exactly one of them lies on the diameter? Let the area of the triangle formed by the two points and the midpoint of the diameter be denoted by the random variable \(A\).

  1. Given that exactly one point lies on the diameter, show that the expected value of \(A\) is \(\left(2\pi\right)^{-1}\).
  2. Given that neither point lies on the diameter, show that the expected value of \(A\) is \(\pi^{-1}\). [You may assume that if two points are chosen at random on a line of length \(\pi\) units, the probability density function for the distance \(X\) between the two points is \(2\left(\pi-x\right)/\pi^{2}\) for \(0\leqslant x\leqslant\pi.\)]
Using these results, or otherwise, show that the expected value of \(A\) is \(\left(2+\pi\right)^{-1}\).


Solution:

  1. TikZ diagram
    \begin{align*} \mathbb{E}(A \mid \text{exactly one point on diameter}) &= \int_{-1}^1\int_0^\pi \frac12 \, |x| \cdot 1 \cdot \sin \theta \cdot \frac{1}{\pi} \d \theta \, \frac{1}{2} \d x \\ &= \frac12 \int_{-1}^1\frac{|x|}{2} \d x \cdot \frac1\pi \left [ -\cos \theta \right]_0^\pi \\ &= \frac12 \cdot \frac12 \cdot \frac2\pi \\ &= \frac{1}{2\pi} \end{align*}
  2. TikZ diagram
    \begin{align*} \mathbb{E}(A \mid \text{no point on diameter}) &= \int_0^{\pi} \frac12 \cdot 1 \cdot 1 \cdot \sin x \cdot 2(\pi - x)/\pi^2 \d x \\ &= \frac1{\pi^2} \int_0^\pi \sin x (\pi - x) \d x \\ &= \frac1{\pi^2} \int_0^\pi x\sin x \d x \qquad \text{(substituting } x \mapsto \pi - x\text{)} \\ &= \frac1{\pi^2} \left [ \sin x - x \cos x \right]_0^{\pi} \\ &= \frac{1}{\pi} \end{align*}
If both points lie on the diameter the area of the triangle is \(0\). Therefore: \begin{align*} \mathbb{E}(A) &= \frac{1}{2\pi} \mathbb{P}(\text{exactly one point on diameter}) + \frac{1}{\pi}\mathbb{P}(\text{no points on diameter}) \\ &= \frac1{2\pi} \cdot \left (2 \cdot \frac{2}{2+\pi} \cdot \frac{\pi}{2+\pi} \right) + \frac{1}{\pi} \cdot \left ( \frac{\pi}{2+\pi} \cdot \frac{\pi}{2+\pi}\right) \\ &= \frac{1}{\pi} \frac{2\pi + \pi^2}{(2+\pi)^2} \\ &= \frac{1}{2+\pi} \end{align*}
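A Monte Carlo sanity check of the final answer, sampling each point uniformly over the perimeter of length \(2+\pi\):

```python
import math
import random

random.seed(3)

# A uniform point on the perimeter lands on the diameter with probability
# 2/(2+pi), otherwise at a uniform angle on the arc.
def random_point():
    if random.random() < 2 / (2 + math.pi):
        return (random.uniform(-1, 1), 0.0)    # on the diameter
    theta = random.uniform(0, math.pi)
    return (math.cos(theta), math.sin(theta))  # on the arc

trials = 400_000
total = 0.0
for _ in range(trials):
    (x1, y1), (x2, y2) = random_point(), random_point()
    # Triangle area from the cross product, with the centre at the origin.
    total += abs(x1 * y2 - x2 * y1) / 2
est = total / trials
assert abs(est - 1 / (2 + math.pi)) < 0.003  # 1/(2+pi) ≈ 0.1945
```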

1989 Paper 2 Q16
D: 1600.0 B: 1484.0

Widgets are manufactured in batches of size \((n+N)\). Any widget has a probability \(p\) of being faulty, independent of faults in other widgets. The batches go through a quality control procedure in which a sample of size \(n\), where \(n\geqslant2\), is taken from each batch and tested. If two or more widgets in the sample are found to be faulty, all widgets in the batch are tested and all faults corrected. If fewer than two widgets in the sample are found to be faulty, the sample is replaced in the batch and no faults are corrected. Show that the probability that the batch contains exactly \(k\), where \(k\leqslant N\), faulty widgets after quality control is \[ \frac{\left[N+1+k\left(n-1\right)\right]N!}{\left(N-k+1\right)!k!}p^{k}\left(1-p\right)^{N+n-k}, \] and verify that this formula also gives the correct answer for \(k=N+1\). Show that the expected number of faulty widgets in a batch after quality control is \[ \left[N+n+pN(n-1)\right]p(1-p)^{n-1}. \]


Solution: \begin{align*} \mathbb{P}(\text{exactly }k\text{ faults after test}) &= \mathbb{P}(k\text{ faults in non-tested, 0 in sample})+\mathbb{P}(k-1\text{ faults in non-tested, 1 in sample}) \\ &=\binom{N}{k}(1-p)^{N-k}p^k\binom{n}{0}(1-p)^n+\binom{N}{k-1}(1-p)^{N-k+1}p^{k-1}\binom{n}{1}(1-p)^{n-1}p \\ &= (1-p)^{N-k+n}p^k \cdot \left ( \binom{N}{k}+n\binom{N}{k-1} \right) \\ &= (1-p)^{N-k+n}p^k \cdot \left (\frac{N!}{k!(N-k)!}+\frac{N!n}{(k-1)!(N-k+1)!}\right) \\ &= (1-p)^{N-k+n}p^k \frac{N!}{k!(N-k+1)!} \cdot \left ((N-k+1)+nk \right) \\ &= \frac{\left[N+1+k\left(n-1\right)\right]N!}{\left(N-k+1\right)!k!}p^{k}\left(1-p\right)^{N+n-k} \end{align*} When \(k = N+1\) the formula gives \begin{align*} \frac{(N+1)n N!}{(N+1)!} p^{N+1}(1-p)^{n-1} &= np^{N+1}(1-p)^{n-1} \end{align*} and the probability is \begin{align*} \mathbb{P}(\text{exactly }N+1\text{ faults after test}) &= \mathbb{P}(N\text{ faults in non-tested, 1 in sample}) \\ &= \binom{N}{N}p^N \cdot \binom{n}{1}p(1-p)^{n-1} \\ &= np^{N+1}(1-p)^{n-1} \end{align*} So the formula does work for \(k = N+1\). \begin{align*} \mathbb{E}(\text{faults}) &= \sum_{k=0}^{N+1} k \cdot \mathbb{P}(\text{exactly }k\text{ faults after test}) \\ &= \sum_{k=0}^{N+1} k \cdot \frac{\left[N+1+k\left(n-1\right)\right]N!}{\left(N-k+1\right)!k!}p^{k}\left(1-p\right)^{N+n-k} \\ &= \sum_{k=1}^{N+1} \frac{\left[N+1+k\left(n-1\right)\right]N!}{\left(N-k+1\right)!(k-1)!}p^{k}\left(1-p\right)^{N+n-k} \\ &= \sum_{k=1}^{N+1} \left[N+1+k\left(n-1\right)\right] p(1-p)^{n-1}\binom{N}{k-1}p^{k-1}\left(1-p\right)^{N-k+1} \\ &= p(1-p)^{n-1} \cdot \left ( (N+1+n-1)\sum_{k=1}^{N+1} \binom{N}{k-1}p^{k-1}\left(1-p\right)^{N-k+1}+ (n-1)\sum_{k=1}^{N+1} (k-1)\binom{N}{k-1}p^{k-1}\left(1-p\right)^{N-k+1} \right) \\ &= p(1-p)^{n-1} \left ((N+1+n-1) + (n-1)pN \right) \\ &= \left[N+n+pN(n-1)\right]p(1-p)^{n-1} \end{align*}
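Both formulas can be verified exactly for small parameter values (the choices of \(N\), \(n\) and \(p\) below are arbitrary illustrations):

```python
from fractions import Fraction
from math import comb, factorial

N, n, p = 4, 3, Fraction(1, 5)
q = 1 - p

# Direct distribution of faults remaining after quality control: s faults
# in the sample, r in the rest; s >= 2 means the whole batch is corrected.
after = {}
for s in range(n + 1):
    for r in range(N + 1):
        prob = comb(n, s) * p**s * q**(n - s) * comb(N, r) * p**r * q**(N - r)
        k = 0 if s >= 2 else s + r
        after[k] = after.get(k, Fraction(0)) + prob

# The claimed probability [N+1+k(n-1)] N! / ((N-k+1)! k!) p^k q^(N+n-k).
def formula(k):
    return (Fraction((N + 1 + k * (n - 1)) * factorial(N),
                     factorial(N - k + 1) * factorial(k))
            * p**k * q**(N + n - k))

# It matches the direct computation for every k from 1 to N+1 ...
for k in range(1, N + 2):
    assert after.get(k, Fraction(0)) == formula(k)

# ... and so does the expected number of faults after quality control.
expected = sum(k * prob for k, prob in after.items())
assert expected == (N + n + p * N * (n - 1)) * p * q**(n - 1)
```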

1988 Paper 1 Q15
D: 1500.0 B: 1484.0

In Fridge football, each team scores two points for a goal and one point for a foul committed by the opposing team. In each game, for each team, the probability that the team scores \(n\) goals is \(\left(3-\left|2-n\right|\right)/9\) for \(0\leqslant n\leqslant4\) and zero otherwise, while the number of fouls committed against it will with equal probability be one of the numbers from \(0\) to \(9\) inclusive. The numbers of goals and fouls of each team are mutually independent. What is the probability that in some game a particular team gains more than half its points from fouls? In response to criticisms that the game is boring and violent, the ruling body increases the number of penalty points awarded for a foul, in the hope that this will cause large numbers of fouls to be less probable. During the season following the rule change, 150 games are played and on 12 occasions (out of 300) a team committed 9 fouls. Is this good evidence of a change in the probability distribution of the number of fouls? Justify your answer.


Solution: \begin{array}{c|c|c|c} k & \P(k \text{ goals}) & \P(\geq 2k+1 \text{ fouls}) & \P(k \text{ goals and } \geq 2k+1 \text{ fouls}) \\ \hline 0 & \frac{3-|2|}{9} = \frac19 & \frac{9}{10} & \frac{9}{90}\\ 1 & \frac{3-|2-1|}{9} = \frac29 & \frac{7}{10} & \frac{14}{90} \\ 2 & \frac{3-|2-2|}{9} = \frac39 & \frac{5}{10} & \frac{15}{90} \\ 3 & \frac{3-|2-3|}{9} = \frac29 & \frac{3}{10} & \frac{6}{90} \\ 4 & \frac{3-|2-4|}{9} = \frac19 & \frac{1}{10} & \frac{1}{90} \\ \hline &&& \frac{9+14+15+6+1}{90} = \frac12 \end{array} The probability a team gains more than half its points from fouls is \(\frac12\). Letting \(X\) be the number of times (out of 300) that a team committed \(9\) fouls, then \(X \sim B(300, p)\). Consider two hypotheses: \(H_0: p = \frac1{10}\) and \(H_1: p < \frac1{10}\). Under \(H_0\), we are interested in \(\P(X \leq 12)\). Since \(300 \cdot \frac{1}{10} > 5\) it is appropriate to use a normal approximation, \(N(30, 27)\). Therefore, \begin{align*} && \P(X \leq 12) &\approx \P(3\sqrt{3}Z + 30 \leq 12.5) \\ &&&= \P\left( Z \leq \frac{12.5-30}{3\sqrt{3}}\right) \\ &&&= \P\left(Z \leq \frac{-17.5}{3\sqrt{3}}\right) \\ &&&< \P\left(Z \leq -3\right) \end{align*} which is less than \(0.0014\). Therefore there is good evidence to believe there has been a change in the probability distribution of the number of fouls.
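The table's total, and the size of the tail, can be checked with a few lines (a sketch, not part of the exam answer):

```python
from fractions import Fraction

# P(a team gains more than half its points from fouls): points are 2 per
# goal plus 1 per foul against, so this requires fouls > 2 * goals.
p_goals = {g: Fraction(3 - abs(2 - g), 9) for g in range(5)}
p_foul = Fraction(1, 10)  # number of fouls is uniform on 0, ..., 9

total = sum(p_goals[g] * p_foul
            for g in range(5) for f in range(10) if f > 2 * g)
assert total == Fraction(1, 2)

# z-value for 12 occurrences under B(300, 1/10) approximated by N(30, 27),
# with a continuity correction: well below -3, far in the lower tail.
z = (12.5 - 30) / 27 ** 0.5
assert z < -3
```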