Problems


2001 Paper 2 Q13
D: 1600.0 B: 1517.3

The lifetimes of a large batch of electric light bulbs are independently and identically distributed. The probability that the lifetime, \(T\) hours, of a given light bulb is greater than \(t\) hours is given by \[ \P(T>t) \; = \; \frac{1}{(1+kt)^\alpha}\;, \] where \(\alpha\) and \(k\) are constants, and \(\alpha >1\). Find the median \(M\) and the mean \(m\) of \(T\) in terms of \(\alpha\) and \(k\). Nine randomly selected bulbs are switched on simultaneously and are left until all have failed. The fifth failure occurs at 1000 hours and the mean lifetime of all the bulbs is found to be 2400 hours. Show that \(\alpha\approx2\) and find the approximate value of \(k\). Hence estimate the probability that, if a randomly selected bulb is found to last \(M\) hours, it will last a further \(m-M\) hours.


Solution: The median \(M\) is the value such that \begin{align*} && \frac12 &= \mathbb{P}(T > M) \\ &&&= \frac1{(1+kM)^\alpha} \\ \Rightarrow && 2 &= (1+kM)^{\alpha} \\ \Rightarrow && M &= \frac{2^{1/\alpha}-1}{k} \end{align*} The density of \(T\) is \(f_T(t) = \frac{k \alpha}{(1+kt)^{\alpha+1}}\) and so \begin{align*} && m &= \int_0^\infty t f_T(t) \d t \\ &&&= \int_0^\infty \frac{tk \alpha}{(1+kt)^{\alpha+1}} \d t \\ &&&= \int_0^\infty \frac{\alpha+tk \alpha-\alpha}{(1+kt)^{\alpha+1}} \d t \\ &&&= \alpha \int_0^\infty (1+kt)^{-\alpha} \d t - \alpha \int_0^\infty (1+kt)^{-(\alpha+1)} \d t \\ &&&= \alpha \left [ -\frac1{k(\alpha-1)}(1+kt)^{-\alpha+1}\right]_0^\infty- \alpha \left [ -\frac1{k\alpha}(1+kt)^{-\alpha}\right]_0^\infty \\ &&&= \frac{\alpha}{k(\alpha-1)} - \frac{1}{k} \\ &&&= \frac{1}{k(\alpha-1)} \end{align*} The fifth failure out of nine is the sample median, so we estimate \(M \approx 1000\); the sample mean gives \(m \approx 2400\). Dividing the second equation by the first, \begin{align*} && \frac{2^{1/\alpha}-1}{k} &= 1000 \\ && \frac{1}{k(\alpha-1)} &= 2400 \\ \Rightarrow && \frac{1}{(\alpha-1)(2^{1/\alpha}-1)} &\approx 2.4 \\ && \frac{1}{(2-1)(\sqrt2-1)} &= \sqrt{2}+1 \approx 2.4 \\ \Rightarrow && \alpha &\approx 2 \\ && k &= \frac{1}{2400} \end{align*} Since \(M + (m-M) = m\), the required probability is \begin{align*} && \mathbb{P}(T > m \mid T > M) &= \frac{\mathbb{P}(T > m)}{\mathbb{P}(T > M)} \\ &&&= \frac{2}{(1+km)^{\alpha}} \\ &&&= \frac{2}{(1 + \frac{1}{\alpha-1})^\alpha} \\ &&&\approx \frac{2}{4} =\frac12 \end{align*}
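
A numerical sanity check of the solution (not part of the original paper), using the estimated \(\alpha = 2\) and \(k = 1/2400\):

```python
# Check: with alpha = 2 and k = 1/2400, the median should be close to the
# observed 1000 hours, the mean should be 2400, and the conditional
# probability P(T > m | T > M) should be 1/2.
alpha, k = 2, 1 / 2400

def survival(t):
    """P(T > t) = (1 + k t)^(-alpha)."""
    return (1 + k * t) ** (-alpha)

M = (2 ** (1 / alpha) - 1) / k      # median: survival(M) = 1/2
m = 1 / (k * (alpha - 1))           # mean
print(M, survival(M))               # ~994.1, 0.5 (close to the observed 1000)
print(m)                            # 2400.0
print(survival(m) / survival(M))    # exactly 0.5
```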

2001 Paper 3 Q13
D: 1700.0 B: 1500.0

In a game for two players, a fair coin is tossed repeatedly. Each player is assigned a sequence of heads and tails and the player whose sequence appears first wins. Four players, \(A\), \(B\), \(C\) and \(D\) take turns to play the game. Each time they play, \(A\) is assigned the sequence TTH (i.e.~Tail then Tail then Head), \(B\) is assigned THH, \(C\) is assigned HHT and \(D\) is assigned~HTT.

  1. \(A\) and \(B\) play the game. Let \(p_{\mathstrut\mbox{\tiny HH}}\), \(p_{\mathstrut\mbox{\tiny HT}}\), \(p_{\mathstrut\mbox{\tiny TH}}\) and \(p_{\mathstrut\mbox{\tiny TT}}\) be the probabilities of \(A\) winning the game given that the first two tosses of the coin show HH, HT, TH and TT, respectively. Explain why \(p_{\mathstrut\mbox{\tiny TT}} = 1\,\), and why $p_{\mathstrut\mbox{\tiny HT}} = {1 \over 2} \, p_{\mathstrut\mbox{\tiny TH}} + {1\over 2} \, p_{\mathstrut\mbox{\tiny TT}}\,$. Show that $p_{\mathstrut\mbox{\tiny HH}} = p_{\mathstrut\mbox{\tiny HT}} = {2 \over 3}$ and that \(p_{\mathstrut\mbox{\tiny TH}} = {1\over 3}\,\). Deduce that the probability that A wins the game is \({2\over 3}\,\).
  2. \(B\) and \(C\) play the game. Find the probability that \(B\) wins.
  3. Show that if \(C\) plays \(D\), then \(C\) is more likely to win than \(D\), but that if \(D\) plays \(A\), then \(D\) is more likely to win than \(A\).
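
Not part of the paper: a Monte Carlo sketch of the part 1 game (A's TTH against B's THH), checking the stated answer of \(2/3\).

```python
import random

def winner(seq_a, seq_b):
    """Toss a fair coin until seq_a or seq_b appears; return 'A' or 'B'."""
    history = ""
    while True:
        history += random.choice("HT")
        if history.endswith(seq_a):
            return "A"
        if history.endswith(seq_b):
            return "B"

trials = 100_000
wins = sum(winner("TTH", "THH") == "A" for _ in range(trials))
print(wins / trials)  # ~ 2/3
```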

2000 Paper 1 Q12
D: 1500.0 B: 1480.9

I have \(k\) different keys on my key ring. When I come home at night I try one key after another until I find the key that fits my front door. What is the probability that I find the correct key in exactly \(n\) attempts in each of the following three cases?

  1. At each attempt, I choose a key that I have not tried before but otherwise each choice is equally likely.
  2. At each attempt, I choose a key from all my keys and each of the \(k\) choices is equally likely.
  3. At the first attempt, I choose from all my keys and each of the \(k\) choices is equally likely. Thereafter, I choose from the keys that I did not try the previous time but otherwise each choice is equally likely.
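
Not part of the paper: a Monte Carlo sketch of the three schemes (the values of \(k\) and \(n\) are illustrative); it estimates each probability directly without presupposing a closed form.

```python
import random

def attempts(k, mode):
    """Return the attempt number on which the correct key (key 0) is found."""
    untried = list(range(k))
    previous = None
    n = 0
    while True:
        n += 1
        if mode == 1:                      # never repeat any earlier key
            choice = untried.pop(random.randrange(len(untried)))
        elif mode == 2:                    # every attempt from all k keys
            choice = random.randrange(k)
        else:                              # mode 3: avoid only the previous key
            options = [key for key in range(k) if key != previous]
            choice = random.choice(options)
        if choice == 0:
            return n
        previous = choice

k, n, trials = 5, 3, 200_000
for mode in (1, 2, 3):
    hits = sum(attempts(k, mode) == n for _ in range(trials))
    print(mode, hits / trials)
# Case 1 should be close to 1/k = 0.2 for every n <= k.
```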

2000 Paper 1 Q13
D: 1484.0 B: 1484.7

Every person carries two genes which can each be either of type \(A\) or of type \(B\). It is known that \(81\%\) of the population are \(AA\) (i.e. both genes are of type \(A\)), \(18\%\) are \(AB\) (i.e. there is one gene of type \(A\) and one of type \(B\)) and \(1\%\) are \(BB\). A child inherits one gene from each of its parents. If one parent is \(AA\), the child inherits a gene of type \(A\) from that parent; if the parent is \(BB\), the child inherits a gene of type \(B\) from that parent; if the parent is \(AB\), the inherited gene is equally likely to be \(A\) or \(B\).

  1. Given that two \(AB\) parents have four children, show that the probability that two of them are \(AA\) and two of them are \(BB\) is \(3/128\).
  2. My mother is \(AB\) and I am \(AA\). Find the probability that my father is \(AB\).
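
Part 1 is a multinomial count: a child of two \(AB\) parents is \(AA\) with probability \(1/4\), \(BB\) with probability \(1/4\) and \(AB\) with probability \(1/2\). A short exact check (not part of the original):

```python
from math import comb
from fractions import Fraction

# Each child of two AB parents: AA w.p. 1/4, AB w.p. 1/2, BB w.p. 1/4.
p_aa = p_bb = Fraction(1, 4)
# Multinomial: choose which 2 of the 4 children are AA, the other 2 are BB.
prob = comb(4, 2) * p_aa**2 * p_bb**2
print(prob)  # 3/128
```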

2000 Paper 2 Q12
D: 1600.0 B: 1487.4

Tabulated values of \({\Phi}(\cdot)\), the cumulative distribution function of a standard normal variable, should not be used in this question. Henry the commuter lives in Cambridge and his working day starts at his office in London at 0900. He catches the 0715 train to King's Cross with probability \(p\), or the 0720 to Liverpool Street with probability \(1-p\). Measured in minutes, journey times for the first train are \(N(55,25)\) and for the second are \(N(65,16)\). Journey times from King's Cross and Liverpool Street to his office are \(N(30,144)\) and \(N(25,9)\), respectively. Show that Henry is more likely to be late for work if he catches the first train. Henry makes \(M\) journeys, where \(M\) is large. Writing \(A\) for \(1-{\Phi}(20/13)\) and \(B\) for \(1-{\Phi}(2)\), find, in terms of \(A\), \(B\), \(M\) and \(p\), the expected number, \(L\), of times that Henry will be late and show that for all possible values of \(p\), $$BM \le L \le AM.$$ Henry noted that in 3/5 of the occasions when he was late, he had caught the King's Cross train. Obtain an estimate of \(p\) in terms of \(A\) and \(B\). [A random variable is said to be \(N\left({{\mu}, {\sigma}^2}\right)\) if it has a normal distribution with mean \({\mu}\) and variance \({\sigma}^2\).]


Solution: If Henry catches the first train, his journey time is \(N(55+30,25+144) = N(85,13^2)\). He has \(105\) minutes between 0715 and 0900, so he is on time if the journey takes less than \(105\) minutes, i.e.\ \(\frac{20}{13}\) standard deviations above the mean. If he catches the second train, his journey time is \(N(65+25, 16+9) = N(90, 5^2)\), and he has \(100\) minutes, so he is on time if the journey takes less than \(100\) minutes, i.e.\ \(\frac{10}{5} = 2\) standard deviations above the mean. Since \(\frac{20}{13} < 2\), being on time is more likely from the second train; that is, Henry is more likely to be late if he catches the first train. \(A = 1 - \Phi(20/13)\) is the probability he is late from the first train and \(B = 1 - \Phi(2)\) is the probability he is late from the second. The expected number of late arrivals is \(L = MpA + M(1-p)B\), a weighted average of \(AM\) and \(BM\); since \(B \leq A\), it follows that \(BM \leq L \leq AM\) for every \(p \in [0,1]\). The proportion of late occasions on which he caught the King's Cross train gives \begin{align*} && \frac35 &= \frac{pA}{pA + (1-p)B} \\ \Rightarrow && 3(1-p)B &= 2pA \\ \Rightarrow && p(2A+3B) &= 3B \\ \Rightarrow && p &= \frac{3B}{2A+3B} \end{align*}
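
As a numeric check after the fact (outside the question's no-tables rule, and not part of the original), \(\Phi\) can be evaluated with the error function:

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

A = 1 - Phi(20 / 13)   # P(late | first train),  ~0.062
B = 1 - Phi(2)         # P(late | second train), ~0.023
print(A, B, A > B)     # the first train is indeed the riskier one
p = 3 * B / (2 * A + 3 * B)
print(p)               # estimate of p, ~0.36
```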

2000 Paper 3 Q12
D: 1700.0 B: 1553.7

In a lottery, any one of \(N\) numbers, where \(N\) is large, is chosen at random and independently for each player by machine. Each week there are \(2N\) players and one winning number is drawn. Write down an exact expression for the probability that there are three or fewer winners in a week, given that you hold a winning ticket that week. Using the fact that $$ {\biggl( 1 - {a \over n} \biggr) ^n \approx \e^{-a}}$$ for \(n\) much larger than \(a\), or otherwise, show that this probability is approximately \({2 \over 3}\) . Discuss briefly whether this probability would increase or decrease if the numbers were chosen by the players. Show that the expected number of winners in a week, given that you hold a winning ticket that week, is \( 3-N^{-1}\).
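
Not part of the paper: given your winning ticket, the number of other winners is Binomial\((2N-1, 1/N)\), approximately Poisson with mean \(2\); a sketch comparing the exact probability with \(5e^{-2} \approx 2/3\) for an illustrative \(N\):

```python
from math import comb, exp

N = 10_000                 # illustrative large N
p = 1 / N
# "Three or fewer winners in total, given my winning ticket" means at most
# two of the other 2N - 1 players win.
exact = sum(comb(2 * N - 1, j) * p**j * (1 - p) ** (2 * N - 1 - j)
            for j in range(3))
poisson = exp(-2) * (1 + 2 + 2)          # P(Poisson(2) <= 2) = 5 e^{-2}
print(exact, poisson)                    # both ~0.677, close to 2/3
expected_winners = 1 + (2 * N - 1) / N   # = 3 - 1/N
print(expected_winners)
```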

1999 Paper 1 Q14
D: 1500.0 B: 1516.0

When I throw a dart at a target, the distance \(X\) from the centre at which it lands is a random variable with density function \[ \mathrm{f}(x)=\begin{cases} 2x & \text{ if }0\leqslant x\leqslant1;\\ 0 & \text{ otherwise.} \end{cases} \] I score points according to the position of the dart as follows:

if \(0\le X< \frac14\), my score is 4;
if \(\frac14\le X< \frac12\), my score is 3;
if \(\frac12\le X< \frac34\), my score is 2;
if \(\frac34\le X\le 1\), my score is 1.
  1. Show that my expected score from one dart is 15/8.
  2. I play a game with the following rules. I start off with a total score 0, and each time I throw a dart my score on that throw is added to my total. Then:
     if my new total is greater than 3, I have lost and the game ends;
     if my new total is 3, I have won and the game ends;
     if my new total is less than 3, I throw again.
     Show that, if I have won such a game, the probability that I threw the dart three times is 343/2231.
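
Not part of the paper: the band probabilities from \(F(x) = x^2\) are \(1/16, 3/16, 5/16, 7/16\) for scores 4, 3, 2, 1; a sketch verifying both stated answers by exact enumeration:

```python
from fractions import Fraction

# Band probabilities from F(x) = x^2: scores 4, 3, 2, 1 respectively.
p = {4: Fraction(1, 16), 3: Fraction(3, 16), 2: Fraction(5, 16), 1: Fraction(7, 16)}
print(sum(s * q for s, q in p.items()))   # expected score = 15/8

def win_paths(total=0, throws=0):
    """Yield (probability, number of throws) for every winning sequence."""
    for score, prob in p.items():
        new = total + score
        if new == 3:
            yield prob, throws + 1
        elif new < 3:
            for q, t in win_paths(new, throws + 1):
                yield prob * q, t

wins = list(win_paths())
p_win = sum(q for q, _ in wins)              # 2231/4096
p_three = sum(q for q, t in wins if t == 3)  # 343/4096
print(p_three / p_win)                       # 343/2231
```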

1999 Paper 2 Q12
D: 1600.0 B: 1484.0

It is known that there are three manufacturers \(A, B, C,\) who can produce micro chip MB666. The probability that a randomly selected MB666 is produced by \(A\) is \(2p\), and the corresponding probabilities for \(B\) and \(C\) are \(p\) and \(1 - 3p\), respectively, where \({{0} \le p \le {1 \over 3}}.\) It is also known that \(70\%\) of MB666 micro chips from \(A\) are sound and that the corresponding percentages for \(B\) and \(C\) are \(80\%\) and \(90\%\), respectively. Find, in terms of \(p\), the conditional probability, \(\P(A {\vert} S)\), that if a randomly selected MB666 chip is found to be sound then it came from \(A\), and also the conditional probability, \(\P(C {\vert} S)\), that if it is sound then it came from \(C\). A quality inspector took a random sample of one MB666 micro chip and found it to be sound. She then traced its place of manufacture to be \(A\), and so estimated \(p\) by calculating the value of \(p\) that corresponds to the greatest value of \(\P(A {\vert} S)\). A second quality inspector also took a random sample of one MB666 chip and found it to be sound. Later he traced its place of manufacture to be \(C\) and so estimated \(p\) by applying the procedure of his colleague to \(\P(C {\vert} S)\). Determine the values of the two estimates and comment briefly on the results obtained.
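
Not part of the paper: by Bayes' theorem \(\P(S) = 0.7\times2p + 0.8\times p + 0.9(1-3p) = 0.9 - 0.5p\), so \(\P(A|S) = 1.4p/(0.9-0.5p)\) and \(\P(C|S) = 0.9(1-3p)/(0.9-0.5p)\). A sketch locating the two maximizing estimates numerically (the grid is illustrative):

```python
# P(S) = 0.7*2p + 0.8*p + 0.9*(1 - 3p) = 0.9 - 0.5p by the law of total
# probability; the two conditionals follow from Bayes' theorem.
def p_A_given_S(p): return 1.4 * p / (0.9 - 0.5 * p)
def p_C_given_S(p): return 0.9 * (1 - 3 * p) / (0.9 - 0.5 * p)

grid = [i / 300 for i in range(101)]   # 0 <= p <= 1/3
best_A = max(grid, key=p_A_given_S)
best_C = max(grid, key=p_C_given_S)
print(best_A, best_C)                  # 1/3 and 0: the two estimates
```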

1998 Paper 1 Q13
D: 1484.0 B: 1532.0

I have a bag initially containing \(r\) red fruit pastilles (my favourites) and \(b\) fruit pastilles of other colours. From time to time I shake the bag thoroughly and remove a pastille at random. (It may be assumed that all pastilles have an equal chance of being selected.) If the pastille is red I eat it but otherwise I replace it in the bag. After \(n\) such drawings, I find that I have only eaten one pastille. Show that the probability that I ate it on my last drawing is \[\frac{(r+b-1)^{n-1}}{(r+b)^{n}-(r+b-1)^{n}}.\]
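
Not part of the paper: a Monte Carlo sketch (the values of \(r\), \(b\), \(n\) are illustrative) comparing the simulated conditional probability with the stated formula:

```python
import random

def check(r, b, n, trials=200_000):
    """Estimate P(red eaten on the last draw | exactly one eaten in n draws)."""
    last = total = 0
    for _ in range(trials):
        red, eaten_at = r, []
        for draw in range(1, n + 1):
            if random.randrange(red + b) < red:   # drew a red pastille: eat it
                red -= 1
                eaten_at.append(draw)
        if len(eaten_at) == 1:                    # exactly one eaten ...
            total += 1
            last += eaten_at[0] == n              # ... on the final draw
    formula = (r + b - 1) ** (n - 1) / ((r + b) ** n - (r + b - 1) ** n)
    return last / total, formula

print(check(3, 4, 5))   # the two numbers should agree closely
```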

1998 Paper 2 Q13
D: 1600.0 B: 1516.0

A random variable \(X\) has the probability density function \[ \mathrm{f}(x)=\begin{cases} \lambda\mathrm{e}^{-\lambda x} & x\geqslant0,\\ 0 & x<0. \end{cases} \] Show that $${\rm P}(X>s+t\,\vert X>t) = {\rm P}(X>s).$$ The time it takes an assistant to serve a customer in a certain shop is a random variable with the above distribution and the times for different customers are independent. If, when I enter the shop, the only two assistants are serving one customer each, what is the probability that these customers are both still being served at time \(t\) after I arrive? One of the assistants finishes serving his customer and immediately starts serving me. What is the probability that I am still being served when the other customer has finished being served?


Solution: \begin{align*} && \mathbb{P}(X > t) &= \int_t^{\infty} \lambda e^{-\lambda x} \d x\\ &&&= \left[ -e^{-\lambda x} \right]_t^\infty \\ &&&= e^{-\lambda t}\\ \\ && \mathbb{P}(X > s + t | X > t) &= \frac{\mathbb{P}(X > s + t)}{\mathbb{P}(X > t)} \\ &&&= \frac{e^{-(s+t)\lambda}}{e^{-t\lambda}} \\ &&&= e^{-s\lambda} = \mathbb{P}(X > s) \end{align*} Since the two customers are served independently, the probability that both are still being served at time \(t\) is \(\mathbb{P}(X > t)^2 = e^{-2\lambda t}\). The final probability is exactly \(\frac12\): the property proved in the first part of the question shows that the distribution is memoryless, so at the moment the assistant starts serving me, the other customer's remaining service time has the same distribution as my service time, and by symmetry each of us is equally likely to finish first.
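
A simulation sketch of the final claim (not part of the original; \(\lambda\) is an arbitrary illustrative rate):

```python
import random

lam, trials = 0.7, 400_000
wins = 0
for _ in range(trials):
    s1, s2 = random.expovariate(lam), random.expovariate(lam)
    start = min(s1, s2)       # one assistant frees up and starts serving me
    other_done = max(s1, s2)  # when the other customer finishes
    me_done = start + random.expovariate(lam)
    wins += other_done < me_done   # I am still being served at that moment
print(wins / trials)               # ~ 0.5
```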

1998 Paper 3 Q13
D: 1700.0 B: 1500.0

Write down the probability of obtaining \(k\) heads in \(n\) tosses of a fair coin. Now suppose that \(k\) is known but \(n\) is unknown. A maximum likelihood estimator (MLE) of \(n\) is defined to be a value (which must be an integer) of \(n\) which maximizes the probability of \(k\) heads. A friend has thrown a fair coin a number of times. She tells you that she has observed one head. Show that in this case there are two MLEs of the number of tosses she has made. She now tells you that in a repeat of the exercise she has observed \(k\) heads. Find the two MLEs of the number of tosses she has made. She next uses a coin biased with probability \(p\) (known) of showing a head, and again tells you that she has observed \(k\) heads. Find the MLEs of the number of tosses made. What is the condition for the MLE to be unique?


Solution: \begin{align*} && \mathbb{P}(k \text{ heads} | n\text{ tosses}) &= \binom{n}k 2^{-n} \\ && \mathbb{P}(1 \text{ head} | n\text{ tosses}) &= n2^{-n} \\ \Rightarrow && \frac{ \mathbb{P}(1 \text{ head} | n+1\text{ tosses}) }{ \mathbb{P}(1 \text{ head} | n\text{ tosses}) } &= \frac{n+1}{2n} \end{align*} This ratio equals \(1\) when \(n = 1\) and is strictly less than \(1\) for all \(n > 1\), so the likelihood is equal at \(n=1\) and \(n=2\) and strictly decreasing thereafter: the MLEs are \(n = 1\) and \(n= 2\). \begin{align*} \frac{ \mathbb{P}(k \text{ heads} | n+1\text{ tosses}) }{ \mathbb{P}(k \text{ heads} | n\text{ tosses}) } &= \frac{\binom{n+1}{k}}{2 \binom{n}{k}} \\ &= \frac{(n+1)!(n-k)!}{2n!(n+1-k)!} \\ &= \frac{n+1}{2(n+1-k)} \end{align*} This equals \(1\) exactly when \(n+1 = 2(n+1-k) \Leftrightarrow n= 2k-1\), exceeds \(1\) for \(n < 2k-1\) and is less than \(1\) for \(n > 2k-1\); therefore the MLEs are \(2k-1\) and \(2k\). If the coin is biased, we have \begin{align*} && \frac{ \mathbb{P}(k \text{ heads} | n+1\text{ tosses}) }{ \mathbb{P}(k \text{ heads} | n\text{ tosses}) } &= \frac{\binom{n+1}{k}p^kq^{n+1-k}}{\binom{n}{k}p^kq^{n-k}} \\ &&&= \frac{n+1}{(n+1-k)}q \\ \\ && 1 & \geq \frac{n+1}{(n+1-k)}q \\ \Leftrightarrow && (n+1)(1-q) &\geq k \\ \Leftrightarrow && n+1 & \geq \frac{k}{p} \end{align*} Therefore the likelihood is increasing while \(n+1 < \frac{k}{p}\) and decreasing thereafter. If \(\frac{k}p\) is an integer the MLEs are \(\frac{k}{p}-1\) and \(\frac{k}p\); otherwise the MLE is unique, namely \(\lfloor \frac{k}{p} \rfloor\).
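
A brute-force sketch checking the MLEs by scanning the likelihood (exact rational arithmetic; the test values of \(k\) and \(p\) are illustrative):

```python
from math import comb
from fractions import Fraction

def likelihood(n, k, p):
    """P(k heads | n tosses) for a coin with head probability p."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def mles(k, p, n_max=200):
    """All n in [k, n_max) attaining the maximum likelihood."""
    values = {n: likelihood(n, k, p) for n in range(k, n_max)}
    best = max(values.values())
    return [n for n, v in values.items() if v == best]

print(mles(1, Fraction(1, 2)))   # [1, 2]
print(mles(3, Fraction(1, 2)))   # [5, 6]   (2k-1 and 2k)
print(mles(3, Fraction(2, 5)))   # [7]      (k/p = 7.5, unique MLE floor(7.5))
print(mles(2, Fraction(2, 5)))   # [4, 5]   (k/p = 5 is an integer: two MLEs)
```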

1997 Paper 2 Q13
D: 1600.0 B: 1516.0

A needle of length two cm is dropped at random onto a large piece of paper ruled with parallel lines two cm apart.

  1. By considering the angle which the needle makes with the lines, find the probability that the needle crosses the nearest line given that its centre is \(x\) cm from it, where \(0 < x < 1\).
  2. Given that the centre of the needle is \(x\) cm from the nearest line and that the needle crosses that line, find the cumulative distribution function for the length of the shorter segment of the needle cut off by the line.
  3. Find the probability that the needle misses all the lines.


Solution:

  1. Suppose the needle's centre is \(x\) cm from the nearest line and that the needle makes an acute angle \(\theta\) with the lines. The needle then extends \(\sin \theta\) cm perpendicular to the lines on either side of its centre, so it crosses the nearest line if \(\sin \theta > x\) and misses otherwise. Since \(\theta \sim U(0, \frac{\pi}{2})\), we can see that \begin{align*} && \mathbb{P}(\text{needle crosses}) &= \mathbb{P}(\sin \theta > x) \\ &&&= \mathbb{P}(\theta > \sin^{-1} x) \\ &&&= 1-\frac{2\sin^{-1} x}{\pi} \end{align*}
  2. Conditional on the crossing, \(\theta \sim U(\sin^{-1} x, \frac{\pi}{2})\), and the length of the shorter segment is \(L = 1 - \frac{x}{\sin \theta}\). So, for \(0 \leq l \leq 1-x\), \begin{align*} && F_L(l) &= \mathbb{P}(L < l) \\ &&&= \mathbb{P}\left (1 - \frac{x} {\sin \theta} < l\right) \\ &&&= \mathbb{P}\left ( \sin \theta < \frac{x}{1-l}\right) \\ &&&= \mathbb{P}\left (\theta < \sin^{-1} \frac{x}{1-l}\right) \\ &&&= \frac{ \sin^{-1} \frac{x}{1-l} - \sin^{-1} x }{\frac{\pi}{2} - \sin^{-1}x} \end{align*}
  3. The needle (with probability \(1\)) cannot cross two lines, so we need only consider the line nearest to its centre. The distance to this line is uniform on \([0,1]\), so we compute \begin{align*} && \mathbb{P}(\text{needle crosses}) &= \int_0^1 \left (1 - \frac{2\sin^{-1}x}{\pi} \right) \d x \\ &&&= 1 - \frac{2}{\pi} \int_0^1 \sin^{-1} x \d x\\ &&&= 1 - \frac{2}{\pi} \left ( \frac{\pi}{2} - 1 \right) \\ &&&= \frac{2}{\pi} \end{align*} Therefore the probability that the needle misses all the lines is \(1 - \frac{2}{\pi}\).
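
A Monte Carlo sketch of part 3 (not part of the original): the crossing frequency should approach \(2/\pi \approx 0.6366\), so a miss has probability \(1 - 2/\pi\).

```python
import random
from math import sin, pi

trials, crossings = 1_000_000, 0
for _ in range(trials):
    x = random.random()                # centre's distance to the nearest line
    theta = random.uniform(0, pi / 2)  # acute angle with the lines
    crossings += sin(theta) > x
print(crossings / trials, 2 / pi)      # both ~0.6366
print(1 - crossings / trials)          # miss probability ~ 1 - 2/pi
```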

1997 Paper 2 Q14
D: 1600.0 B: 1469.6

Traffic enters a tunnel which is 9600 metres long, and in which overtaking is impossible. The number of vehicles which enter in any given time is governed by the Poisson distribution with mean 6 cars per minute. All vehicles travel at a constant speed until forced to slow down on catching up with a slower vehicle ahead. I enter the tunnel travelling at 30 m\(\,\)s\(^{-1}\) and all the other traffic is travelling at 32 m\(\,\)s\(^{-1}\). What is the expected number of vehicles in the queue behind me when I leave the tunnel? Assuming again that I travel at 30 m\(\,\)s\(^{-1}\), but that all the other vehicles are independently equally likely to be travelling at 30 m\(\,\)s\(^{-1}\) or 32 m\(\,\)s\(^{-1}\), find the probability that exactly two vehicles enter the tunnel within 20 seconds of my doing so and catch me up before I leave it. Find also the probability that there are exactly two vehicles queuing behind me when I leave the tunnel. \noindent [Ignore the lengths of the vehicles.]
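
Not part of the paper: for the first part, I spend \(9600/30 = 320\) s in the tunnel, and a vehicle entering \(t\) seconds after me at 32 m s\(^{-1}\) draws level at time \(16t\), which is inside the tunnel precisely when \(t \le 20\) s (the 20-second window quoted later in the question); arrivals in those 20 s are Poisson with mean \(6 \times \frac{20}{60} = 2\). A simulation sketch of this expectation only:

```python
import random

# Poisson arrivals at rate 6 per minute = 0.1 per second, generated by
# exponential gaps; count those entering within the 20 s catch-up window.
rate, window, trials = 0.1, 20.0, 100_000
total = 0
for _ in range(trials):
    t = random.expovariate(rate)
    while t <= window:
        total += 1
        t += random.expovariate(rate)
print(total / trials)   # ~ 2 vehicles expected in the queue behind me
```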

1996 Paper 1 Q14
D: 1484.0 B: 1484.0

A biased coin, with a probability \(p\) of coming up heads and a probability \(q=1-p\) of coming up tails, is tossed repeatedly. Let \(A\) be the event that the first run of \(r\) successive heads occurs before the first run of \(s\) successive tails. If \(H\) is the event that on the first toss the coin comes up heads and \(T\) is the event that it comes up tails, show that \begin{alignat*}{1} \mathrm{P}(A|H) & =p^{\alpha}+(1-p^{\alpha})\mathrm{P}(A|T),\\ \mathrm{P}(A|T) & =(1-q^{\beta})\mathrm{P}(A|H), \end{alignat*} where \(\alpha\) and \(\beta\) are to be determined. Use these two equations to find \(\mathrm{P}(A|H),\) \(\mathrm{P}(A|T),\) and hence \(\mathrm{P}(A).\)


Solution: \begin{align*} && \P(A|H) &= \P(\text{achieve }r\text{ heads immediately}) + \P(\text{fail and then win from having just flipped a tail}) \\ &&&= p^{r-1} + (1-p^{r-1}) \cdot \P(A|T) \\ && \P(A|T) &= (1-q^{s-1})\P(A|H) \end{align*} so \(\alpha = r-1\) and \(\beta = s-1\). \begin{align*} &&\P(A|H) &= p^{r-1}+(1-p^{r-1})(1-q^{s-1})\P(A|H) \\ \Rightarrow && \P(A|H) &= \frac{p^{r-1}}{1-(1-p^{r-1})(1-q^{s-1})} \\ && \P(A|T) &= \frac{(1-q^{s-1})p^{r-1}}{1-(1-p^{r-1})(1-q^{s-1})} \\ && \P(A) &= p\,\P(A|H) + q\,\P(A|T) = \frac{(1-q^{s})p^{r-1}}{1-(1-p^{r-1})(1-q^{s-1})} \end{align*} using \(p + q(1-q^{s-1}) = 1 - q^s\).
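
A Monte Carlo sketch (illustrative \(p\), \(r\), \(s\)) comparing a simulated race with the closed form for \(\P(A)\) above:

```python
import random

def run_race(p, r, s):
    """Toss until a run of r heads (A wins) or s tails (A loses)."""
    run_h = run_t = 0
    while True:
        if random.random() < p:
            run_h, run_t = run_h + 1, 0
            if run_h == r:
                return True
        else:
            run_h, run_t = 0, run_t + 1
            if run_t == s:
                return False

p, r, s, trials = 0.6, 3, 2, 200_000
q = 1 - p
D = 1 - (1 - p ** (r - 1)) * (1 - q ** (s - 1))
closed_form = (1 - q ** s) * p ** (r - 1) / D
sim = sum(run_race(p, r, s) for _ in range(trials)) / trials
print(sim, closed_form)   # both ~0.491 for these values
```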

1996 Paper 3 Q12
D: 1700.0 B: 1554.3

It has been observed that Professor Ecks proves three types of theorems: 1, those that are correct and new; 2, those that are correct, but already known; 3, those that are false. It has also been observed that, if a given theorem of hers is of type \(i\), then her next theorem is of type \(j\) with probability \(p\low_{ij},\) where \(p\low_{ij}\) is the entry in the \(i\)th row and \(j\)th column of the following array: \[ \begin{pmatrix}0.3 & 0.3 & 0.4\\ 0.2 & 0.4 & 0.4\\ 0.1 & 0.3 & 0.6 \end{pmatrix}\,. \] Let \(a_{i},\) \(i=1,2,3\), be the probability that a given theorem is of type \(i\), and let \(b_{j}\) be the consequent probability that the next theorem is of type \(j\).

  1. Explain why \(b_{j}=a\low_{1}p\low_{1j}+a\low_{2}p\low_{2j}+a\low_{3}p\low_{3j}\,.\)
  2. Find values of \(a\low_{1},a\low_{2}\) and \(a\low_{3}\) such that \(b_{i}=a_{i}\) for \(i=1,2,3.\)
  3. For these values of the \(a_{i}\) find the probabilities \(q\low_{ij}\) that, if a particular theorem is of type \(j\), then the \textit{preceding }theorem was of type \(i\).
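
Not part of the paper: a plain-Python sketch for parts 2 and 3, finding the stationary \(a_i\) by power iteration and then the reversed-chain probabilities \(q_{ij} = a_i p_{ij}/a_j\) given by Bayes' theorem:

```python
P = [[0.3, 0.3, 0.4],
     [0.2, 0.4, 0.4],
     [0.1, 0.3, 0.6]]

# Part 2: find a with a = aP by repeatedly applying the transition matrix.
a = [1 / 3, 1 / 3, 1 / 3]
for _ in range(200):
    a = [sum(a[i] * P[i][j] for i in range(3)) for j in range(3)]
print(a)   # converges to (1/6, 1/3, 1/2)

# Part 3: q_ij = P(previous type i | current type j) = a_i p_ij / a_j,
# since b_j = a_j for the stationary a.
q = [[a[i] * P[i][j] / a[j] for j in range(3)] for i in range(3)]
for row in q:
    print([round(v, 4) for v in row])
```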