Problems


1996 Paper 2 Q12
D: 1600.0 B: 1500.0

  1. Let \(X_{1}, X_{2}, \dots, X_{n}\) be independent random variables each of which is uniformly distributed on \([0,1]\). Let \(Y\) be the largest of \(X_{1}, X_{2}, \dots, X_{n}\). By using the fact that \(Y<\lambda\) if and only if \(X_{j}<\lambda\) for \(1\leqslant j\leqslant n\), find the probability density function of \(Y\). Show that the variance of \(Y\) is \[\frac{n}{(n+2)(n+1)^{2}}.\]
  2. The probability that a neon light switched on at time \(0\) will have failed by a time \(t>0\) is \(1-\mathrm{e}^{-t/\lambda}\) where \(\lambda>0\). I switch on \(n\) independent neon lights at time zero. Show that the expected time until the first failure is \(\lambda/n\).


Solution:

  1. \(\,\) \begin{align*} && F_Y(\lambda) &= \mathbb{P}(Y < \lambda) \\ &&&= \prod_i \mathbb{P}(X_i < \lambda) \\ &&&= \lambda^n \\ \Rightarrow && f_Y(\lambda) &= \begin{cases} n \lambda^{n-1} & \text{if } 0 \leq \lambda \leq 1 \\ 0 & \text{otherwise} \end{cases} \\ \\ && \E[Y] &= \int_0^1 \lambda f_Y(\lambda) \d \lambda \\ &&&= \int_0^1 n \lambda^n \d \lambda \\ &&&= \frac{n}{n+1} \\ && \E[Y^2] &= \int_0^1 \lambda^2 f_Y(\lambda) \d \lambda \\ &&&= \int_0^1 n \lambda^{n+1} \d \lambda \\ &&&= \frac{n}{n+2} \\ \Rightarrow && \var[Y] &= \E[Y^2]-(\E[Y])^2 \\ &&&= \frac{n}{n+2} - \frac{n^2}{(n+1)^2} \\ &&&= \frac{(n+1)^2n-n^2(n+2)}{(n+2)(n+1)^2} \\ &&&= \frac{n[(n^2+2n+1)-(n^2+2n)]}{(n+2)(n+1)^2} \\ &&&= \frac{n}{(n+2)(n+1)^2} \end{align*}
  2. Let \(Z\) be the time of the first failure, ie the minimum of the \(n\) failure times. Using the same reasoning, we can see that \begin{align*} && 1-F_Z(t) &= \mathbb{P}(\text{all lights still on after time } t) \\ &&&= \prod_i e^{-t/\lambda} \\ &&&= e^{-nt/\lambda} \\ \\ \Rightarrow && F_Z(t) &= 1-e^{-nt/\lambda} \end{align*} Therefore \(Z \sim Exp(\frac{n}{\lambda})\), so the expected time until the first failure is \(\lambda/n\).
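As a sanity check (not part of the original solution), the variance formula in part 1 can be verified by simulating the maximum of \(n\) uniforms; the helper below is illustrative only.

```python
# Monte Carlo check (illustrative, not from the solution) that
# Var(Y) = n / ((n+2)(n+1)^2) for Y = max of n uniforms on [0,1].
import random

def var_of_max(n, trials=200_000, seed=0):
    rng = random.Random(seed)
    samples = [max(rng.random() for _ in range(n)) for _ in range(trials)]
    mean = sum(samples) / trials
    return sum((s - mean) ** 2 for s in samples) / trials

n = 5
exact = n / ((n + 2) * (n + 1) ** 2)
assert abs(var_of_max(n) - exact) < 1e-3
```

The fixed seed makes the check deterministic; the tolerance is several standard errors wide.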

1995 Paper 2 Q14
D: 1600.0 B: 1500.0

Suppose \(X\) is a random variable with probability density \[ \mathrm{f}(x)=Ax^{2}\exp(-x^{2}/2) \] for \(-\infty < x < \infty.\) Find \(A\). You belong to a group of scientists who believe that the outcome of a certain experiment is a random variable with the probability density just given, while other scientists believe that the probability density is the same except with different mean (i.e. the probability density is \(\mathrm{f}(x-\mu)\) with \(\mu\neq0\)). In each of the following two cases decide whether the result given would shake your faith in your hypothesis, and justify your answer.

  1. A single trial produces the result 87.3.
  2. 1000 independent trials produce results having a mean value \(0.23.\)
[Great weight will be placed on clear statements of your reasons and none on the mere repetition of standard tests, however sophisticated, if unsupported by argument. There are several possible approaches to this question. For some of them it is useful to know that if \(Z\) is normal with mean 0 and variance 1 then \(\mathrm{E}(Z^{4})=3.\)]


Solution: Let \(Z \sim N(0,1)\), with a pdf of \(f(x) = \frac{1}{\sqrt{2\pi}} \exp(-x^2/2)\) \begin{align*} && 1 &= \int_{-\infty}^\infty Ax^2 \exp(-x^2/2) \d x \\ &&&= A\sqrt{2\pi} \int_{-\infty}^\infty x^2 \frac{1}{\sqrt{2\pi}} \exp(-x^2/2) \d x \\ &&&= A\sqrt{2\pi} \E[Z^2] = A\sqrt{2\pi} \\ \Rightarrow && A &= \frac{1}{\sqrt{2\pi}} \end{align*}

  1. The probability of seeing a result as extreme as \(87.3\) is \begin{align*} \mathbb{P}(X > 87.3) &= \frac{1}{\sqrt{2\pi}}\int_{87.3}^{\infty} x^2 \exp(-x^2/2) \d x \\ &= \left [ -\frac{1}{\sqrt{2\pi}}x \exp(-x^2/2)\right]_{87.3}^{\infty}+\int_{87.3}^{\infty}\frac{1}{\sqrt{2\pi}} \exp(-x^2/2) \d x \\ &\approx 0 +(1- \Phi(87.3)) \\ &\approx 0 \end{align*} A result this extreme is far more plausible under a distribution with a much larger mean, so our faith should be severely shaken.
  2. With 1000 independent trials, the sample mean is approximately normal by the Central Limit Theorem. Each sample has mean \(0\) and variance \(\E[X^2] = \int_{-\infty}^\infty x^4 \frac{1}{\sqrt{2\pi}} \exp(-x^2/2) \d x = \E[Z^4] = 3\), so the sample mean \(S\) is approximately \(N(0, 3/1000)\). The probability of a sample mean at least \(0.23\) is \begin{align*} && \mathbb{P}(S > 0.23) &= \mathbb{P}\left (Z > \frac{0.23}{\sqrt{3/1000}} \right) \\ &&&= \mathbb{P}\left (Z > \frac{0.23}{\sqrt{30}/100} \right) \\ &&&\approx \mathbb{P}\left (Z > \frac{0.23}{0.055} \right) \approx \mathbb{P}(Z > 4.2) \\ &&& \approx 0 \end{align*} Again, our faith should be shaken.
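Neither integral above is hard to cross-check numerically; the Simpson-rule helper below is an illustrative sketch, not part of the original solution.

```python
# Numeric check (illustrative only) that \int x^2 e^{-x^2/2} dx = sqrt(2*pi),
# so A = 1/sqrt(2*pi), and that E[X^2] = E[Z^4] = 3.
import math

def simpson(f, a, b, n=2000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

norm_const = simpson(lambda x: x**2 * math.exp(-x**2 / 2), -12, 12)
second_moment = simpson(lambda x: x**4 * math.exp(-x**2 / 2), -12, 12) / norm_const
assert abs(norm_const - math.sqrt(2 * math.pi)) < 1e-5
assert abs(second_moment - 3) < 1e-5
```

Truncating the integrals at \(\pm 12\) loses only a negligible tail.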

1995 Paper 3 Q14
D: 1700.0 B: 1516.0

A candidate finishes examination questions in time \(T\), where \(T\) has probability density function \[ \mathrm{f}(t)=t\mathrm{e}^{-t}\qquad t\geqslant0, \] the probabilities for the various questions being independent. Find the moment generating function of \(T\) and hence find the moment generating function for the total time \(U\) taken to finish two such questions. Show that the probability density function for \(U\) is \[ \mathrm{g}(u)=\frac{1}{6}u^{3}\mathrm{e}^{-u}\qquad u\geqslant0. \] Find the probability density function for the total time taken to answer \(n\) such questions.


Solution: \begin{align*} && M_T(x) &= \mathbb{E}[e^{xT}] \\ &&&= \int_0^{\infty} e^{xt}te^{-t} \d t \\ &&&= \int_0^{\infty}te^{(x-1)t} \d t \\ &&&= \left [ \frac{t}{x-1} e^{(x-1)t} \right]_0^{\infty} - \int_0^\infty \frac{e^{(x-1)t}}{x-1} \d t \\ &&&= -\left [ \frac{e^{(x-1)t}}{(x-1)^2} \right]_0^{\infty} \\ &&&= \frac{1}{(x-1)^2} \qquad (x < 1) \\ \\ && M_U(x) &= M_{T_1+T_2}(x) = M_T(x)^2 \\ &&&= \frac1{(x-1)^4} \\ \\ && I_n &= \int_0^{\infty} t^ne^{(x-1)t} \d t \\ &&&= \left[ \frac{1}{(x-1)}t^ne^{(x-1)t} \right]_0^{\infty} - \frac{n}{(x-1)} \int_0^{\infty}t^{n-1}e^{(x-1)t} \d t \\ &&&= -\frac{n}{(x-1)}I_{n-1} \\ \Rightarrow && I_n &= \frac{n!}{(1-x)^{n+1}} \\ \\ \Rightarrow && \int_0^{\infty} e^{xu} \frac16u^3e^{-u} \d u &= \int_0^{\infty} \frac16u^3e^{(x-1)u} \d u \\ &&&= \frac{I_3}{6} = \frac{1}{(1-x)^4} \\ \Rightarrow && f_U(u) &= \frac16u^3e^{-u} \qquad \text{(by uniqueness of MGFs)} \\ \\ && M_{T_1+\cdots+T_n}(x) &= \frac{1}{(x-1)^{2n}} \\ \Rightarrow && f_{T_1+\cdots+T_n}(t) &= \frac1{(2n-1)!} t^{2n-1}e^{-t} \end{align*} (NB: This is the gamma distribution.)
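The MGF identity for \(\mathrm{g}\) can also be checked numerically; the quadrature routine below is an illustrative sketch, not part of the solution.

```python
# Check numerically (illustrative only) that g(u) = u^3 e^{-u} / 6 has
# MGF 1/(1-x)^4 for x < 1.
import math

def mgf_g(x, upper=200.0, n=20_000):
    # Composite Simpson's rule for \int_0^upper u^3 e^{(x-1)u} / 6 du
    f = lambda u: u**3 * math.exp((x - 1) * u) / 6
    h = upper / n
    total = f(0.0) + f(upper)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(i * h)
    return total * h / 3

for x in (0.0, 0.25, 0.5):
    assert abs(mgf_g(x) - 1 / (1 - x) ** 4) < 1e-6
```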

1994 Paper 3 Q13
D: 1700.0 B: 1484.0

During his performance a trapeze artist is supported by two identical ropes, either of which can bear his weight. Each rope is such that the time, in hours of performance, before it fails is exponentially distributed, independently of the other, with probability density function \(\lambda\exp(-\lambda t)\) for \(t\geqslant0\) (and 0 for \(t < 0\)), for some \(\lambda > 0.\) A particular rope has already been in use for \(t_{0}\) hours of performance. Find the distribution for the length of time the artist can continue to use it before it fails. Interpret and comment upon your result. Before going on tour the artist insists that the management purchase two new ropes of the above type. Show that the probability density function of the time until both ropes fail is \[ \mathrm{f}(t)=\begin{cases} 2\lambda\mathrm{e}^{-\lambda t}(1-\mathrm{e}^{-\lambda t}) & \text{ if }t\geqslant0,\\ 0 & \text{ otherwise.} \end{cases} \] If each performance lasts for \(h\) hours, find the probability that both ropes fail during the \(n\)th performance. Show that the probability that both ropes fail during the same performance is \(\tanh(\lambda h/2)\).


Solution: This is the memoryless property of the exponential distribution, so the remaining lifetime has the same distribution as when \(t = 0\). Let \(T\) be the time at which the rope fails; then \begin{align*} && \mathbb{P}(T > t | T > t_0) &= \frac{\mathbb{P}(T > t)}{\mathbb{P}(T > t_0)} \\ &&&= \frac{e^{-\lambda t}}{e^{-\lambda t_0}} \\ &&&= e^{-\lambda(t-t_0)} \end{align*} This means that each rope (as long as it hasn't broken) can be considered "as good as new". Suppose \(T_1, T_2 \sim Exp(\lambda)\) are the times to failure of the two ropes; then \begin{align*} && \mathbb{P}(\max(T_1, T_2) < t) &= \mathbb{P}(T_1 < t, T_2 < t) \\ &&&= (1-e^{-\lambda t})^2 \\ \Rightarrow && f(t) &= 2(1-e^{-\lambda t}) \cdot (\lambda e^{-\lambda t}) \\ &&&= 2\lambda e^{-\lambda t}(1-e^{-\lambda t}) \end{align*} which is the stated probability density function. \begin{align*} && \mathbb{P}(\text{both fail during the }n\text{th}) &= \left ( \int_{(n-1)h}^{nh} \lambda e^{-\lambda t} \d t \right)^2 \\ &&&=\left (\left [ -e^{-\lambda t}\right]_{(n-1)h}^{nh} \right)^2 \\ &&&= \left ( e^{-\lambda (n-1)h}( 1-e^{-\lambda h}) \right)^2 \\ &&&= e^{-2(n-1)h\lambda}(1-e^{-\lambda h})^2 \\ \\ && \mathbb{P}(\text{both fail in same performance}) &= \sum_{n=1}^{\infty} \mathbb{P}(\text{both fail during the }n\text{th}) \\ &&&= \sum_{n=1}^{\infty}e^{-2(n-1)h\lambda}(1-e^{-\lambda h})^2 \\ &&&= (1-e^{-\lambda h})^2 \frac{1}{1-e^{-2h\lambda}} \\ &&&= \frac{1-e^{-\lambda h}}{1+e^{-h\lambda}} \\ &&&= \tanh(\lambda h/2) \end{align*}
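A quick numeric check (not part of the original solution) that the geometric series really sums to \(\tanh(\lambda h/2)\); the parameter values below are arbitrary.

```python
# Sum P(both ropes fail during the n-th performance) over n and compare
# with tanh(lambda * h / 2). Illustrative only; parameters are arbitrary.
import math

def p_same_performance(lam, h, terms=200):
    a = (1 - math.exp(-lam * h)) ** 2   # the n = 1 term
    r = math.exp(-2 * lam * h)          # common ratio of the series
    return sum(a * r ** (n - 1) for n in range(1, terms + 1))

lam, h = 0.7, 1.5
assert abs(p_same_performance(lam, h) - math.tanh(lam * h / 2)) < 1e-12
```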

1993 Paper 1 Q14
D: 1500.0 B: 1505.6

When he sets out on a drive Mr Toad selects a speed \(V\) kilometres per minute where \(V\) is a random variable with probability density \[ \alpha v^{-2}\mathrm{e}^{-\alpha v^{-1}} \] and \(\alpha\) is a strictly positive constant. He then drives at constant speed, regardless of other drivers, road conditions and the Highway Code. The traffic lights at the Wild Wood cross-roads change from red to green when Mr Toad is exactly 1 kilometre away in his journey towards them. If the traffic light is green for \(g\) minutes, then red for \(r\) minutes, then green for \(g\) minutes, and so on, show that the probability that he passes them after \(n(g+r)\) minutes but before \(n(g+r)+g\) minutes, where \(n\) is a positive integer, is \[ \mathrm{e}^{-\alpha n(g+r)}-\mathrm{e}^{-\alpha\left(n(g+r)+g\right)}. \] Find the probability \(\mathrm{P}(\alpha)\) that he passes the traffic lights when they are green. Show that \(\mathrm{P}(\alpha)\rightarrow1\) as \(\alpha\rightarrow\infty\) and, by noting that \((\mathrm{e}^{x}-1)/x\rightarrow1\) as \(x\rightarrow0\), or otherwise, show that \[ \mathrm{P}(\alpha)\rightarrow\frac{g}{r+g}\quad\mbox{ as }\alpha\rightarrow0. \] [NB: the traffic lights show only green and red, not amber.]

1993 Paper 3 Q16
D: 1700.0 B: 1484.9

The time taken for me to set an acceptable examination question is \(T\) hours. The distribution of \(T\) is a truncated normal distribution with probability density \(\mathrm{f}\) where \[ \mathrm{f}(t)=\begin{cases} \dfrac{1}{k\sigma\sqrt{2\pi}}\exp\left(-\dfrac{1}{2}\left(\dfrac{t-\sigma}{\sigma}\right)^{2}\right) & \mbox{ for }t\geqslant0\\ 0 & \mbox{ for }t<0. \end{cases} \] Sketch the graph of \(\mathrm{f}(t)\). Show that \(k\) is approximately \(0.841\) and obtain the mean of \(T\) as a multiple of \(\sigma\). Over a period of years, I find that the mean setting time is 3 hours.

  1. Find the approximate probability that none of the 16 questions on next year's paper will take more than 4 hours to set.
  2. Given that a particular question is unsatisfactory after 2 hours work, find the probability that it will still be unacceptable after a further 2 hours work.

1992 Paper 3 Q15
D: 1700.0 B: 1500.0

A goat \(G\) lies in a square field \(OABC\) of side \(a\). It wanders randomly round its field, so that at any time the probability of its being in any given region is proportional to the area of this region. Write down the probability that its distance, \(R\), from \(O\) is less than \(r\) if \(0 < r\leqslant a,\) and show that if \(a\leqslant r\leqslant a\sqrt{2}\) the probability is \[ \left(\frac{r^{2}}{a^{2}}-1\right)^{\frac{1}{2}}+\frac{\pi r^{2}}{4a^{2}}-\frac{r^{2}}{a^{2}}\cos^{-1}\left(\frac{a}{r}\right). \] Find the median of \(R\) and the probability density function of \(R\). The goat is then tethered to the corner \(O\) by a chain of length \(a\). Find the conditional probability that its distance from the fence \(OC\) is more than \(a/2\).

1990 Paper 1 Q16
D: 1500.0 B: 1486.1

A bus is supposed to stop outside my house every hour on the hour. From long observation I know that a bus will always arrive some time between 10 minutes before and 10 minutes after the hour. The probability it arrives at a given instant increases linearly (from zero at 10 minutes before the hour) up to a maximum value at the hour, and then decreases linearly at the same rate after the hour. Obtain the probability density function of \(T\), the time in minutes after the scheduled time at which a bus arrives. If I get up when my alarm clock goes off, I arrive at the bus stop at 7.55am. However, with probability 0.5, I doze for 3 minutes before it rings again. In that case with probability 0.8 I get up then and reach the bus stop at 7.58am, or, with probability 0.2, I sleep a little longer, not reaching the stop until 8.02am. What is the probability that I catch a bus by 8.10am? I buy a louder alarm clock which ensures that I reach the stop at exactly the same time each morning. This clock keeps perfect time, but may be set to an incorrect time. If it is correct, the alarm goes off so that I should reach the stop at 7.55am. After 100 mornings I find that I have had to wait for a bus until after 9am (according to the new clock) on 5 occasions. Is this evidence that the new clock is incorrectly set? [The times of arrival of different buses are independent of each other.]


Solution: The probability density function is a triangle with base \(20\) minutes and therefore height \(\frac{1}{10}\) per minute, ie: \begin{align*} f_T(t) &= \begin{cases} \frac{1}{100}(t+10) & \text{if } -10 \leq t \leq 0 \\ \frac{1}{100}(10-t) & \text{if } 0 \leq t \leq 10 \\ 0 & \text{otherwise} \end{cases} \end{align*} \begin{align*} \mathbb{P}(\text{catch bus}) &=0.5 \mathbb{P}(\text{bus arrives after 7:55})+0.4 \mathbb{P}(\text{bus arrives after 7:58}) + 0.1 \mathbb{P}(\text{bus arrives after 8:02}) \\ &= \frac12 \cdot \left (1 - \frac18 \right) + \frac{2}{5} \cdot \left ( 1 - \frac{4^2}{5^2} \cdot \frac{1}{2} \right) + \frac{1}{10} \cdot \frac{4^2}{5^2} \cdot \frac12 \\ &= \frac{1\,483}{2\,000} \\ &\approx 74\% \end{align*} With the new clock (assuming it is set correctly), I catch the 8:00 bus unless it arrives before 7:55, in which case I wait for the 9:00 bus, which arrives by 9:00 with probability \(\frac12\): \begin{align*} \mathbb{P}(\text{catch a bus by 9:00}) &= \mathbb{P}(\text{bus arrives after 7:55}) + \mathbb{P}(\text{bus arrives before 7:55}) \cdot \frac12 \\ &= \frac78 + \frac18 \cdot \frac12 \\ &= \frac{15}{16} \end{align*} Over 100 mornings we should expect to wait past 9:00 about \(100 \cdot \frac{1}{16} = 6.25\) times, so waiting \(5\) times seems about right. (Using a binomial \(B(100, \frac{1}{16})\) calculation, waiting \(5\) or fewer times has probability \(\approx 40\%\), which isn't suspicious.)
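The fractions above can be reproduced exactly (a check that is not part of the original solution) using the triangular CDF:

```python
# Exact recomputation (illustrative only) of P(catch the bus) = 1483/2000,
# where T is the triangular arrival time in minutes relative to the hour.
from fractions import Fraction as F

def p_after(t):
    """P(T > t) for integer t with -10 <= t <= 10."""
    if t <= 0:
        return 1 - F((t + 10) ** 2, 200)   # CDF of the rising linear part
    return F((10 - t) ** 2, 200)           # survival on the falling part

p = F(1, 2) * p_after(-5) + F(2, 5) * p_after(-2) + F(1, 10) * p_after(2)
assert p == F(1483, 2000)
```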

1990 Paper 2 Q15
D: 1600.0 B: 1500.0

A target consists of a disc of unit radius and centre \(O\). A certain marksman never misses the target, and the probability of any given shot hitting the target within a distance \(t\) from \(O\) is \(t^{2}\), where \(0\leqslant t\leqslant1\). The marksman fires \(n\) shots independently. The random variable \(Y\) is the radius of the smallest circle, with centre \(O\), which encloses all the shots. Show that the probability density function of \(Y\) is \(2ny^{2n-1}\) and find the expected area of the circle. The shot which is furthest from \(O\) is rejected. Show that the expected area of the smallest circle, with centre \(O\), which encloses the remaining \((n-1)\) shots is \[ \left(\frac{n-1}{n+1}\right)\pi. \]


Solution: Another way to describe \(Y\) is as the maximum distance of any shot from \(O\). Let \(X_i\), \(1 \leq i \leq n\), be the distances of the \(n\) shots; then \begin{align*} F_Y(y) &= \mathbb{P}(Y \leq y) \\ &= \mathbb{P}(X_i \leq y \text{ for all } i) \\ &= \prod_{i=1}^n \mathbb{P}(X_i \leq y) \tag{each shot independent}\\ &= \prod_{i=1}^n y^2\\ &= y^{2n} \end{align*} Therefore \(f_Y(y) = \frac{\d}{\d y} (y^{2n}) = 2n y^{2n-1}\). \begin{align*} \mathbb{E}(\pi Y^2) &= \int_0^1\pi y^2 f_Y(y) \d y \\ &=\pi \int_0^1 2n y^{2n+1} \d y \\ &=\left ( \frac{n}{n+1} \right )\pi \end{align*} Let \(Z\) be the distance of the second-furthest shot, then: \begin{align*} && F_Z(z) &= \mathbb{P}(Z \leq z) \\ &&&= \mathbb{P}(X_i \leq z \text{ for at least } n - 1\text{ different } i) \\ &&&= n\mathbb{P}(X_i \leq z \text{ for all but 1}) + \mathbb{P}(X_i \leq z \text{ for all } i) \\ &&&= n \left ( \prod_{i=1}^{n-1} \mathbb{P}(X_i \leq z) \right) \mathbb{P}(X_n > z) + z^{2n} \\ &&&= nz^{2n-2}(1-z^2) + z^{2n} \\ &&&= nz^{2n-2} -(n-1)z^{2n} \\ \Rightarrow && f_Z(z) &= n(2n-2)z^{2n-3}-2n(n-1)z^{2n-1} \\ \Rightarrow && \mathbb{E}(\pi Z^2) &= \int_0^1 \pi z^2 \left (n(2n-2)z^{2n-3}-2n(n-1)z^{2n-1} \right) \d z \\ &&&= \pi \left ( \frac{n(2n-2)}{2n} - \frac{2n(n-1)}{2n+2}\right) \\ &&&= \left ( \frac{n-1}{n+1} \right) \pi \end{align*}
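Both expectations are easy to simulate; the Monte Carlo sketch below (not part of the original solution) samples radii as \(\sqrt{U}\) so that \(\mathbb{P}(R \leq t) = t^2\).

```python
# Monte Carlo check (illustrative only) of E[pi Y^2] = pi n/(n+1) and,
# with the furthest shot rejected, E[pi Z^2] = pi (n-1)/(n+1).
import math, random

rng = random.Random(1)
n, trials = 6, 100_000
ey2 = ez2 = 0.0
for _ in range(trials):
    radii = sorted(math.sqrt(rng.random()) for _ in range(n))
    ey2 += radii[-1] ** 2   # furthest shot
    ez2 += radii[-2] ** 2   # second-furthest shot
assert abs(math.pi * ey2 / trials - math.pi * n / (n + 1)) < 0.02
assert abs(math.pi * ez2 / trials - math.pi * (n - 1) / (n + 1)) < 0.02
```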

1989 Paper 1 Q14
D: 1516.0 B: 1453.5

The prevailing winds blow in a constant southerly direction from an enchanted castle. Each year, according to an ancient tradition, a princess releases 96 magic seeds from the castle, which are carried south by the wind before falling to rest. South of the castle lies one league of grassy parkland, then one league of lake, then one league of farmland, and finally the sea. If a seed falls on land it will immediately grow into a fever tree. (Fever trees do not grow in water). Seeds are blown independently of each other. The random variable \(L\) is the distance in leagues south of the castle at which a seed falls to rest (either on land or water). It is known that the probability density function \(\mathrm{f}\) of \(L\) is given by \[ \mathrm{f}(x)=\begin{cases} \frac{1}{2}-\frac{1}{8}x & \mbox{ for }0\leqslant x\leqslant4,\\ 0 & \mbox{ otherwise.} \end{cases} \] What is the mean number of fever trees which begin to grow each year?

  1. The random variable \(Y\) is defined as the distance in leagues south of the castle at which a new fever tree grows from a seed carried by the wind. Sketch the probability density function of \(Y\), and find the mean of \(Y\).
  2. One year messengers bring the king the news that 23 new fever trees have grown in the farmland. The wind never varies, and so the king suspects that the ancient tradition have not been followed properly. Is he justified in his suspicions?


Solution: \begin{align*} \mathbb{P}(\text{fever tree grows}) &= \mathbb{P}(0 \leq L \leq 1) + \mathbb{P}(2 \leq L \leq 3) \\ &= \int_0^1 \frac12 -\frac18 x \d x + \int_2^3 \frac12 - \frac18 x \d x \\ &= \left [\frac12 x - \frac1{16}x^2 \right]_0^1+ \left [\frac12 x - \frac1{16}x^2 \right]_2^3 \\ &= \frac12 - \frac1{16}+\frac32-\frac9{16} - 1 + \frac{4}{16} \\ &= \frac58 \end{align*} The expected number of fever trees is just \(96 \cdot \frac58 = 60\).

  1. \(f_Y(t)\) is the density of \(L\) conditioned on a tree growing, ie the density of \(L\) restricted to \([0,1]\cup[2,3]\) and renormalised by dividing by \(\frac58\): \[f_Y(t) = \begin{cases} \frac45 - \frac15t & \text{if } t \in [0,1]\cup[2,3] \\ 0 & \text{otherwise} \end{cases}\]
    TikZ diagram
    \begin{align*} \mathbb{E}(Y) &= \int_0^1 t \left(\frac45 - \frac15 t\right) \d t + \int_2^3 t \left(\frac45 - \frac15 t\right) \d t \\ &= \left(\frac25 - \frac1{15}\right) + \left(2 - \frac{19}{15}\right) \\ &= \frac13 + \frac{11}{15} \\ &= \frac{16}{15} \end{align*}
  2. Given the seeds are blown independently and the wind hasn't changed, it is reasonable to model the number of fever trees as \(B(96, \frac{5}{8})\), which can be approximated by the Normal distribution \(N(60, 22.5)\) (since \(96 \cdot \frac58 \cdot \frac38 = 22.5\)). The observed count of \(23\) corresponds to \(z = \frac{23-60}{\sqrt{22.5}} \approx -7.8\), an extremely unlikely deviation, so the king is justified in his suspicions.
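The probability \(\frac58\) and the z-score can be checked exactly; the helper below is illustrative and not part of the original solution.

```python
# Exact check (illustrative only) of P(tree grows) = 5/8 and the z-score
# for 23 trees under the normal approximation N(60, 22.5).
import math
from fractions import Fraction as F

def p_interval(a, b):
    """P(a <= L <= b) for the density 1/2 - x/8 on [0, 4]."""
    G = lambda x: F(x, 2) - F(x * x, 16)   # antiderivative x/2 - x^2/16
    return G(b) - G(a)

p = p_interval(0, 1) + p_interval(2, 3)
mean = 96 * p
var = 96 * p * (1 - p)
z = (23 - float(mean)) / math.sqrt(float(var))
assert p == F(5, 8) and mean == 60
assert abs(z + 7.8) < 0.01
```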

1989 Paper 2 Q15
D: 1600.0 B: 1484.0

Two points are chosen independently at random on the perimeter (including the diameter) of a semicircle of unit radius. What is the probability that exactly one of them lies on the diameter? Let the area of the triangle formed by the two points and the midpoint of the diameter be denoted by the random variable \(A\).

  1. Given that exactly one point lies on the diameter, show that the expected value of \(A\) is \(\left(2\pi\right)^{-1}\).
  2. Given that neither point lies on the diameter, show that the expected value of \(A\) is \(\pi^{-1}\). [You may assume that if two points are chosen at random on a line of length \(\pi\) units, the probability density function for the distance \(X\) between the two points is \(2\left(\pi-x\right)/\pi^{2}\) for \(0\leqslant x\leqslant\pi.\)]
Using these results, or otherwise, show that the expected value of \(A\) is \(\left(2+\pi\right)^{-1}\).


Solution:

  1. TikZ diagram
    The triangle has base \(|x|\), the distance of the diameter point from the midpoint, and height \(\sin\theta\): \begin{align*} \mathbb{E}(A \mid \text{exactly one point on diameter}) &= \int_{-1}^1\int_0^\pi \frac12 |x| \sin\theta \cdot \frac{1}{\pi} \d \theta \cdot \frac{1}{2} \d x \\ &= \frac{1}{4\pi} \int_{-1}^1 |x| \d x \int_0^\pi \sin \theta \d \theta \\ &= \frac{1}{4\pi} \cdot 1 \cdot 2 \\ &= \frac{1}{2\pi} \end{align*}
  2. TikZ diagram
    \begin{align*} \mathbb{E}(A \mid \text{no point on diameter}) &= \int_0^{\pi} \frac12 \cdot 1 \cdot 1 \cdot \sin x \cdot 2(\pi - x)/\pi^2 \d x \\ &= \frac1{\pi^2} \int_0^\pi \sin x (\pi - x) \d x \\ &= \frac1{\pi^2} \int_0^\pi x\sin x \d x \tag{substituting \(x \mapsto \pi - x\)} \\ &= \frac1{\pi^2} \left [ \sin x - x \cos x \right]_0^{\pi} \\ &= \frac{1}{\pi} \end{align*}
If both points lie on the diameter the area of the triangle is \(0\). Therefore: \begin{align*} \mathbb{E}(A) &= \frac{1}{2\pi} \mathbb{P}(\text{exactly one point on diameter}) + \frac{1}{\pi}\mathbb{P}(\text{no points on diameter}) \\ &= \frac1{2\pi} \cdot \left (2 \cdot \frac{2}{2+\pi} \cdot \frac{\pi}{2+\pi} \right) + \frac{1}{\pi} \cdot \left ( \frac{\pi}{2+\pi} \cdot \frac{\pi}{2+\pi}\right) \\ &= \frac{1}{\pi} \frac{2\pi + \pi^2}{(2+\pi)^2} \\ &= \frac{1}{2+\pi} \end{align*}
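A Monte Carlo sketch (not part of the original solution) confirms \(\mathbb{E}(A) = (2+\pi)^{-1} \approx 0.1945\); a point lands on the diameter with probability \(2/(2+\pi)\) and otherwise at a uniform angle on the arc.

```python
# Monte Carlo check (illustrative only) of E(A) for two random points
# on the perimeter of a unit semicircle.
import math, random

rng = random.Random(2)

def random_point():
    if rng.random() < 2 / (2 + math.pi):
        return (rng.uniform(-1.0, 1.0), 0.0)    # on the diameter
    theta = rng.uniform(0.0, math.pi)
    return (math.cos(theta), math.sin(theta))   # on the arc

trials = 200_000
total = 0.0
for _ in range(trials):
    (x1, y1), (x2, y2) = random_point(), random_point()
    total += abs(x1 * y2 - x2 * y1) / 2         # triangle area with vertex O
est = total / trials
assert abs(est - 1 / (2 + math.pi)) < 0.005
```

The cross-product formula gives the area of the triangle with the third vertex at the midpoint \(O\) of the diameter.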

1989 Paper 3 Q15
D: 1700.0 B: 1503.8

The continuous random variable \(X\) is uniformly distributed over the interval \([-c,c].\) Write down expressions for the probabilities that:

  1. \(n\) independently selected values of \(X\) are all greater than \(k\),
  2. \(n\) independently selected values of \(X\) are all less than \(k\),
where \(k\) lies in \([-c,c]\). A sample of \(2n+1\) values of \(X\) is selected at random and \(Z\) is the median of the sample. Show that \(Z\) is distributed over \([-c,c]\) with probability density function \[ \frac{(2n+1)!}{(n!)^{2}(2c)^{2n+1}}(c^{2}-z^{2})^{n}. \] Deduce the value of \({\displaystyle \int_{-c}^{c}(c^{2}-z^{2})^{n}\,\mathrm{d}z.}\) Evaluate \(\mathrm{E}(Z)\) and \(\mathrm{var}(Z).\)


Solution:

  1. \begin{align*} \mathbb{P}(n\text{ independent values of }X > k) &= \prod_{i=1}^n \mathbb{P}(X > k) \\ &= \left ( \frac{c-k}{2c}\right)^n \end{align*}
  2. \begin{align*} \mathbb{P}(n\text{ independent values of }X < k) &= \prod_{i=1}^n \mathbb{P}(X < k) \\ &= \left ( \frac{k+c}{2c}\right)^n \end{align*}
\begin{align*} &&\mathbb{P}(z - \delta < \text{median} < z+\delta) &= \mathbb{P}(n\text{ values } < z - \delta,\ n \text{ values} > z + \delta,\text{ one value in between}) \\ &&&= \binom{2n+1}{n,n,1} \left ( \frac{c-(z+\delta)}{2c}\right)^n\left ( \frac{(z-\delta)+c}{2c}\right)^n \frac{2 \delta}{2 c} \\ &&&= \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}}\left((c-(z+\delta))(c+(z-\delta))\right)^n 2\delta \\ \Rightarrow && \lim_{\delta \to 0} \frac{\mathbb{P}(z - \delta < \text{median} < z+\delta)}{2 \delta} &= \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}}((c-z)(c+z))^n \\ &&&= \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}}(c^2-z^2)^n \end{align*} \begin{align*} && 1 &= \int_{-c}^c \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}}(c^2-z^2)^n \d z \\ \Rightarrow && \frac{(n!)^2 (2c)^{2n+1}}{(2n+1)!} &= \int_{-c}^c (c^2-z^2)^n \d z \end{align*} \begin{align*} \mathbb{E}(Z) &= \int_{-c}^c z \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}}(c^2-z^2)^n \d z \\ &=\frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}} \int_{-c}^c z (c^2-z^2)^n \d z \\ &= 0 \tag{odd integrand} \end{align*} \begin{align*} \mathrm{Var}(Z) &= \mathbb{E}(Z^2) - \mathbb{E}(Z)^2 \\ &= \mathbb{E}(Z^2) \\ &= \int_{-c}^c z^2 \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}}(c^2-z^2)^n \d z \\ &=\frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}} \int_{-c}^c z^2 (c^2-z^2)^n \d z \\ &=\frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}} \left ( \left [ -\frac{1}{2(n+1)}z(c^2-z^2)^{n+1} \right]_{-c}^c + \frac{1}{2(n+1)}\int_{-c}^c (c^2-z^2)^{n+1} \d z \right) \\ &= \frac{(2n+1)!}{(n!)^2 (2c)^{2n+1}} \cdot \frac{1}{2(n+1)} \cdot \frac{((n+1)!)^2 (2c)^{2n+3}}{(2n+3)!} \\ &= \frac{(n+1)^2(2c)^2}{2(n+1)(2n+2)(2n+3)} \\ &= \frac{c^2}{2n+3} \end{align*}
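A simulation (not part of the original solution) of the sample median: the median of \(2n+1\) uniforms on \([0,1]\) is \(\mathrm{Beta}(n+1,n+1)\), and rescaling to \([-c,c]\) gives \(\mathrm{E}(Z)=0\) and \(\mathrm{var}(Z)=c^{2}/(2n+3)\), which the sample moments reproduce.

```python
# Simulation check (illustrative only; parameters are arbitrary) of the
# moments of the sample median Z of 2n+1 uniforms on [-c, c].
import random, statistics

rng = random.Random(3)
c, n, trials = 2.0, 3, 100_000
medians = [sorted(rng.uniform(-c, c) for _ in range(2 * n + 1))[n]
           for _ in range(trials)]
assert abs(statistics.mean(medians)) < 0.02
assert abs(statistics.pvariance(medians) - c * c / (2 * n + 3)) < 0.02
```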

1988 Paper 1 Q16
D: 1500.0 B: 1498.6

Wondergoo is applied to all new cars. It protects them completely against rust for three years, but thereafter the probability density of the time of onset of rust is proportional to \(t^{2}/(1+t^{2})^{2}\) for a car of age \(3+t\) years \((t\geqslant0)\). Find the probability that a car becomes rusty before it is \(3+t\) years old. Every car is tested for rust annually on the anniversary of its manufacture. If a car is not rusty, it will certainly pass; if it is rusty, it will pass with probability \(\frac{1}{2}.\) Cars which do not pass are immediately taken off the road and destroyed. What is the probability that a randomly selected new car subsequently fails a test taken on the fifth anniversary of its manufacture? Find also the probability that a car which was destroyed immediately after its fifth anniversary test was rusty when it passed its fourth anniversary test.


Solution: Given the probability density after \(3\) years is proportional to \(\frac{t^2}{(1+t^2)^2}\), we must have: \begin{align*} && 1 &= A \int_0^{\infty} \frac{t^2}{(1+t^2)^2} \, \d t \\ &&&= A \left [ -\frac12 \frac{t}{1+t^2} \right]_0^{\infty} + \frac{A}2 \int_0^{\infty} \frac{1}{1+t^2} \d t \\ &&&= \frac{A}{2} \frac{\pi}{2} \\ \Rightarrow && A &= \frac{4}{\pi} \end{align*} To fail a test on the fifth anniversary, there are two possibilities for when the car became rusty. It could have become rusty before its fourth anniversary, passed that test, and then failed the fifth; or become rusty in the following year and then failed the fifth test. \begin{align*} \P(\text{rusty before } 4 \text{ years}) &=\frac{4}{\pi} \int_0^1 \frac{t^2}{(1+t^2)^2} \d t \\ &= \frac{4}{\pi} \left [ -\frac12 \frac{t}{1+t^2} \right]_0^{1} + \frac{2}{\pi} \int_0^{1} \frac{1}{1+t^2} \d t \\ &= -\frac{1}{\pi} + \frac{2}{\pi} \frac{\pi}{4} \\ &= \frac12 - \frac{1}{\pi} \\ &\approx 0.181690\cdots \\ \\ \P(\text{rusty before } 5 \text{ years}) &=\frac{4}{\pi} \int_0^2 \frac{t^2}{(1+t^2)^2} \d t \\ &= \frac{4}{\pi} \left [ -\frac12 \frac{t}{1+t^2} \right]_0^{2} + \frac{2}{\pi} \int_0^{2} \frac{1}{1+t^2} \d t \\ &= -\frac{4}{5\pi} + \frac{2}{\pi} \tan^{-1} 2 \\ &\approx 0.450184\cdots \end{align*} Therefore: \begin{align*} \P(\text{fails 5th anniversary}) &= \P(\text{rusty before } 4 \text{ years}) \P(\text{pass 4th test, fail 5th}) + {} \\ & \quad \quad + \P(\text{rusty between 4 and 5 years}) \P(\text{fail}) \\ &= 0.181690\cdots \cdot \frac{1}{4} + \frac{1}{2} ( 0.450184\cdots- 0.181690\cdots) \\ &= \frac{1}{2} \cdot 0.450184\cdots - \frac{1}{4} \cdot 0.181690\cdots \\ &= 0.179670\cdots \\ &= 18.0\%\,\, (3\text{ s.f.}) \end{align*} We also have: \begin{align*} \P(\text{rusty at 4 years}\mid\text{destroyed at 5}) &= \frac{\P(\text{rusty at 4 years and destroyed at 5})}{\P(\text{destroyed at 5})} \\ &= \frac{0.181690\cdots \cdot \frac{1}{4}}{\frac{1}{2} \cdot 0.450184\cdots - \frac{1}{4} \cdot 0.181690\cdots} \\ &= 0.252811\cdots \\ &= 25.3\%\,\,(3\text{ s.f.}) \end{align*}
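The decimal values follow from the closed-form CDF; a direct evaluation (not part of the original solution):

```python
# Evaluate F(t) = P(rusty before age 3+t) = (2/pi)(atan t - t/(1+t^2))
# and the two test probabilities. Illustrative only.
import math

def rusty_by(t):
    return (2 / math.pi) * (math.atan(t) - t / (1 + t * t))

p4 = rusty_by(1)                      # rusty before the 4th anniversary
p5 = rusty_by(2)                      # rusty before the 5th anniversary
fail5 = p4 * 0.25 + 0.5 * (p5 - p4)   # fails the 5th-anniversary test
cond = p4 * 0.25 / fail5              # rusty at 4, given destroyed after 5
assert abs(fail5 - 0.17967) < 1e-4
assert abs(cond - 0.25281) < 1e-4
```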