Problems


2025 Paper 3 Q11

  1. Let \(\lambda > 0\). The independent random variables \(X_1, X_2, \ldots, X_n\) all have probability density function $$f(t) = \begin{cases} \lambda e^{-\lambda t} & t \geq 0 \\ 0 & t < 0 \end{cases}$$ and cumulative distribution function \(F(x)\). The value of random variable \(Y\) is the largest of the values \(X_1, X_2, \ldots, X_n\). Show that the cumulative distribution function of \(Y\) is given, for \(y \geq 0\), by $$G(y) = (1 - e^{-\lambda y})^n$$
  2. The values \(L(\alpha)\) and \(U(\alpha)\), where \(0 < \alpha \leq \frac{1}{2}\), are such that $$P(Y < L(\alpha)) = \alpha \text{ and } P(Y > U(\alpha)) = \alpha$$ Show that $$L(\alpha) = -\frac{1}{\lambda}\ln(1 - \alpha^{1/n})$$ and write down a similar expression for \(U(\alpha)\).
  3. Use the approximation \(e^t \approx 1 + t\), for \(|t|\) small, to show that, for sufficiently large \(n\), $$\lambda L(\alpha) \approx \ln(n) - \ln\left(\ln\left(\frac{1}{\alpha}\right)\right)$$
  4. Hence show that the median of \(Y\) tends to infinity as \(n\) increases, but that the width of the interval \(U(\alpha) - L(\alpha)\) tends to a value which is independent of \(n\).
  5. You are given that, for \(|t|\) small, \(\ln(1 + t) \approx t\) and that \(e^3 \approx 20\). Show that, for sufficiently large \(n\), there is an interval of width approximately \(4\lambda^{-1}\) in which \(Y\) lies with probability \(0.9\).


Solution:

  1. Note that \(\displaystyle F(y) = \mathbb{P}(X_i < y) = \int_0^y \lambda e^{-\lambda t} \d t = 1-e^{-\lambda y}\). Notice also that \begin{align*} G(y) &= \mathbb{P}(Y < y) \\ &= \mathbb{P}(\max_i(X_i) < y) \\ &= \mathbb{P}(X_i < y \text{ for all }i) \\ &= \prod_{i=1}^n \mathbb{P}(X_i < y) \\ &= \prod_{i=1}^n (1-e^{-\lambda y})\\ &= (1-e^{-\lambda y})^n \end{align*} as required.
  2. \begin{align*} && \mathbb{P}(Y < L(\alpha)) &= \alpha \\ \Rightarrow && (1-e^{-\lambda L(\alpha)})^n &= \alpha \\ \Rightarrow && 1-e^{-\lambda L(\alpha)} &= \alpha^{\tfrac1n} \\ \Rightarrow && L(\alpha) &= -\frac{1}{\lambda}\ln \left (1-\alpha^{\tfrac1n} \right) \end{align*} Notice also: \begin{align*} && \mathbb{P}(Y > U(\alpha)) &= \alpha \\ \Rightarrow && 1 - (1-e^{-\lambda U(\alpha)})^n &= \alpha \\ \Rightarrow && U(\alpha) &= -\frac{1}{\lambda}\ln \left ( 1-(1-\alpha)^{\tfrac1n} \right) \end{align*}
  3. \begin{align*} \lambda L(\alpha) &= -\ln \left (1-\alpha^{\tfrac1n} \right) \\ &= -\ln \left (1-e^{\tfrac1n \ln \alpha} \right) \\ &\approx - \ln \left ( 1 - 1 - \frac1n \ln \alpha\right) \tag{\(e^t \approx 1 + t\)} \\ &= -\ln \left ( \frac{1}{n} \ln \frac{1}\alpha \right) \\ &= - \ln \frac{1}{n} - \ln \left ( \ln \frac{1}{\alpha} \right )\\ &= \ln n - \ln \left ( \ln \left ( \frac{1}{\alpha} \right ) \right) \end{align*} since if \(n\) is large, \(\frac{\ln \alpha}{n}\) is small.
  4. The median is the value where \(\mathbb{P}(Y < M) = \frac12\), or in other words \(L(\frac12)\), but this is \(\approx \frac{\ln n - \ln (\ln 2)}{\lambda} \to \infty\). \begin{align*} && \lambda U(\alpha) &\approx \ln n - \ln \left ( \ln \left ( \frac{1}{1-\alpha} \right ) \right) \\ \Rightarrow && \lambda(U(\alpha) - L(\alpha)) &\approx -\ln \left ( \ln \left ( \frac{1}{1-\alpha} \right ) \right)+ \ln \left ( \ln \left ( \frac{1}{\alpha} \right ) \right) \\ \Rightarrow && U(\alpha) - L(\alpha) &\to \frac{1}{\lambda} \left ( \ln \left ( \ln \left ( \frac{1}{\alpha} \right ) \right)-\ln \left ( \ln \left ( \frac{1}{1-\alpha} \right ) \right ) \right) \end{align*} which doesn't depend on \(n\).
  5. Suppose \(\alpha = \frac{1}{20}\); then \(Y\) lies in \((L(\alpha), U(\alpha))\) with probability \(1 - 2\alpha = 0.9\), and \begin{align*} U(\alpha) - L(\alpha) &\approx \frac{1}{\lambda} \left (\ln \ln 20 - \ln \ln \frac{20}{19} \right) \\ &= \lambda^{-1} \left (\ln \ln 20 - \ln \ln \left (1 + \frac{1}{19} \right) \right) \\ &\approx \lambda^{-1} \left (\ln 3 - \ln \frac{1}{19} \right) \tag{\(e^3 \approx 20\), \(\ln(1+t) \approx t\)} \\ &= \lambda^{-1} \ln (3 \cdot 19) \\ &\approx \lambda^{-1} \left (\ln 3 + \ln 20 \right) \\ &\approx \lambda^{-1} (1 + 3) \tag{\(\ln 3 \approx 1\)} \\ &= 4\lambda^{-1} \end{align*} [Note that \(\ln \ln 20 - \ln \ln \frac{20}{19} = 4.0673\ldots\)]
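A quick Monte Carlo check of parts 1 and 5 (a minimal sketch assuming numpy; the values of lam, n and y are illustrative, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n, trials = 2.0, 200, 50_000

# Y is the largest of n independent Exp(lam) samples
Y = rng.exponential(1 / lam, size=(trials, n)).max(axis=1)

y = 3.0
print(np.mean(Y < y), (1 - np.exp(-lam * y)) ** n)  # part 1: both close to G(y)

alpha = 0.05  # Y lies in (L(alpha), U(alpha)) with probability 0.9
L, U = np.quantile(Y, [alpha, 1 - alpha])
print(U - L, 4 / lam)  # part 5: the width is close to 4/lam
```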

2021 Paper 3 Q11

The continuous random variable \(X\) has probability density function \[ f(x) = \begin{cases} \lambda e^{-\lambda x} & \text{for } x \geqslant 0, \\ 0 & \text{otherwise,} \end{cases} \] where \(\lambda\) is a positive constant. The random variable \(Y\) is the greatest integer less than or equal to \(X\), and \(Z = X - Y\).

  1. Show that, for any non-negative integer \(n\), \[ \mathrm{P}(Y = n) = (1 - e^{-\lambda})\,e^{-n\lambda}. \]
  2. Show that \[ \mathrm{P}(Z < z) = \frac{1 - e^{-\lambda z}}{1 - e^{-\lambda}} \qquad \text{for } 0 \leqslant z \leqslant 1. \]
  3. Evaluate \(\mathrm{E}(Z)\).
  4. Obtain an expression for \[ \mathrm{P}(Y = n \text{ and } z_1 < Z < z_2), \] where \(0 \leqslant z_1 < z_2 \leqslant 1\) and \(n\) is a non-negative integer. Determine whether \(Y\) and \(Z\) are independent.


Solution:

  1. \(\,\) \begin{align*} && \mathbb{P}(Y = n) &= \mathbb{P}(X \in [n, n+1)) \\ &&&= \int_n^{n+1} \lambda e^{-\lambda x} \d x \\ &&&= \left [-e^{-\lambda x} \right]_n^{n+1} \\ &&&= e^{-\lambda n} - e^{-\lambda(n+1)} \\ &&&= e^{-\lambda n}(1- e^{-\lambda}) \end{align*}
  2. \(\,\) \begin{align*} && \mathbb{P}(Z < z) &= \sum_{n=0}^{\infty} \mathbb{P}(X \in [n, n+z)) \\ &&&= \sum_{n=0}^{\infty} \int_{n}^{n+z} \lambda e^{-\lambda x} \d x \\ &&&= \sum_{n=0}^{\infty} [-e^{-\lambda x}]_{n}^{n+z} \\ &&&= \sum_{n=0}^{\infty} (1-e^{-\lambda z})e^{-\lambda n} \\ &&&= \frac{1-e^{-\lambda z}}{1-e^{-\lambda}} \end{align*}
  3. Given the cdf of \(Z\), we see that \(f_Z(z) = \frac{\lambda e^{-\lambda z}}{1-e^{-\lambda}}\) so \begin{align*} && \E[Z] &= \int_0^1 z \frac{\lambda e^{-\lambda z}}{1-e^{-\lambda}} \d z \\ &&&= \frac{\lambda}{1-e^{-\lambda}} \int_0^1 ze^{-\lambda z} \d z \\ &&&= \frac{\lambda}{1-e^{-\lambda}} \left ( \left [-\frac{1}{\lambda} ze^{-\lambda z} \right]_0^1+\int_0^1 \frac{1}{\lambda} e^{-\lambda z} \d z \right) \\ &&&= \frac{\lambda}{1-e^{-\lambda}} \left ( -\frac{e^{-\lambda}}{\lambda} + \frac{1-e^{-\lambda}}{\lambda^2} \right) \\ &&&= \frac{1-e^{-\lambda}(1+\lambda)}{\lambda (1-e^{-\lambda})} \end{align*}
  4. \(\,\) \begin{align*} && \mathbb{P}(Y = n \text{ and }z_1 < Z < z_2)&= \mathbb{P}(X \in (n+z_1, n+z_2) ) \\ &&&= \int_{n+z_1}^{n+z_2} \lambda e^{-\lambda x} \d x \\ &&&= e^{-n\lambda}(e^{-\lambda z_1} - e^{-\lambda z_2}) \end{align*} Note that \(\mathbb{P}(z_1 < Z < z_2) = \mathbb{P}( Z < z_2) -\mathbb{P}(Z< z_1) =\frac{e^{-\lambda z_1} - e^{-\lambda z_2}}{1-e^{-\lambda}}\) Therefore \begin{align*} && \mathbb{P}(Y = n \text{ and }z_1 < Z < z_2) &= e^{-n\lambda}(e^{-\lambda z_1} - e^{-\lambda z_2}) \\ &&&= e^{-\lambda n}(1-e^{-\lambda}) \frac{e^{-\lambda z_1} - e^{-\lambda z_2}}{1-e^{-\lambda}} \\ &&&= \mathbb{P}(Y=n) \mathbb{P}(z_1 < Z < z_2) \end{align*} So they are independent, which is to be expected from the memorylessness property of the exponential distribution.
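These results can be checked by simulation (a minimal sketch assuming numpy; lam, n, z1, z2 are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, trials = 1.5, 500_000
X = rng.exponential(1 / lam, size=trials)
Y = np.floor(X)  # greatest integer <= X
Z = X - Y        # fractional part

n = 2
print(np.mean(Y == n), (1 - np.exp(-lam)) * np.exp(-n * lam))  # part 1
print(Z.mean(), (1 - np.exp(-lam) * (1 + lam)) / (lam * (1 - np.exp(-lam))))  # part 3

z1, z2 = 0.2, 0.7  # part 4: the joint probability factorises
joint = np.mean((Y == n) & (z1 < Z) & (Z < z2))
print(joint, np.mean(Y == n) * np.mean((z1 < Z) & (Z < z2)))
```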

2017 Paper 2 Q12

Adam and Eve are catching fish. The number of fish, \(X\), that Adam catches in any time interval is Poisson distributed with parameter \(\lambda t\), where \(\lambda\) is a constant and \(t\) is the length of the time interval. The number of fish, \(Y\), that Eve catches in any time interval is Poisson distributed with parameter \(\mu t\), where \(\mu\) is a constant and \(t\) is the length of the time interval. The two Poisson variables are independent. You may assume that the expected time between Adam catching a fish and Adam catching his next fish is \(\lambda^{-1}\), and similarly for Eve.

  1. By considering \(\P( X + Y = r)\), show that the total number of fish caught by Adam and Eve in time \(T\) also has a Poisson distribution.
  2. Given that Adam and Eve catch a total of \(k\) fish in time \(T\), where \(k\) is fixed, show that the number caught by Adam has a binomial distribution.
  3. Given that Adam and Eve start fishing at the same time, find the probability that the first fish is caught by Adam.
  4. Find the expected time from the moment Adam and Eve start fishing until they have each caught at least one fish.
[Note: This question has been redrafted to make the meaning clearer.]


Solution:

  1. \(\,\) \begin{align*} && \mathbb{P}(X+Y=r) &= \sum_{k=0}^r \mathbb{P}(X = k, Y = r-k) \\ &&&= \sum_{k=0}^r \mathbb{P}(X = k)\mathbb{P}( Y = r-k) \\ &&&= \sum_{k=0}^r \frac{e^{-\lambda T} (\lambda T)^k}{k!}\frac{e^{-\mu T} (\mu T)^{r-k}}{(r-k)!}\\ &&&= \frac{e^{-(\mu+\lambda)T}}{r!}\sum_{k=0}^r \binom{r}{k}(\lambda T)^k (\mu T)^{r-k}\\ &&&= \frac{e^{-(\mu+\lambda)T}((\mu+\lambda)T)^r}{r!} \end{align*} Therefore \(X+Y \sim Po \left ( (\mu+\lambda)T \right)\)
  2. \(\,\) \begin{align*} && \mathbb{P}(X = r | X+Y = k) &= \frac{\mathbb{P}(X=r, Y = k-r)}{\mathbb{P}(X+Y=k)} \\ &&&= \frac{\frac{e^{-\lambda T} (\lambda T)^r}{r!}\frac{e^{-\mu T} (\mu T)^{k-r}}{(k-r)!}}{\frac{e^{-(\mu+\lambda)T}((\mu+\lambda)T)^k}{k!}} \\ &&&= \binom{k}{r} \left ( \frac{\lambda}{\lambda + \mu} \right)^r \left ( \frac{\mu}{\lambda + \mu} \right)^{k-r} \end{align*} Therefore \(X|X+Y=k \sim B(k, \frac{\lambda}{\lambda + \mu})\)
  3. Using part 2 with \(k = 1\): given that exactly one fish has been caught, the probability that it is Adam's is \(\mathbb{P}(X=1 \mid X+Y = 1) = \frac{\lambda}{\lambda + \mu}\), and this is the probability that the first fish is caught by Adam.
  4. Let \(X_1, Y_1\) be the times until the first fish is caught by Adam and by Eve respectively, so \(X_1 \sim \textrm{Exp}(\lambda)\) and \(Y_1 \sim \textrm{Exp}(\mu)\); we want \(\E[\max(X_1, Y_1)]\). First, \begin{align*} && \mathbb{P}(\min(X_1, Y_1) > t) &= \mathbb{P}(X_1> t) \mathbb{P}( Y_1 > t) \\ &&&= e^{-\lambda t}e^{-\mu t} \\ &&&= e^{-(\lambda+\mu)t} \end{align*} so \(\min(X_1, Y_1) \sim \textrm{Exp}(\lambda+\mu)\) with expectation \(\frac{1}{\lambda+\mu}\). Since \(\min(X_1, Y_1) + \max(X_1, Y_1) = X_1 + Y_1\), the expected time until they have each caught at least one fish is \[\E[\max(X_1, Y_1)] = \frac{1}{\lambda} + \frac{1}{\mu} - \frac{1}{\lambda+\mu}\]
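A simulation check of parts 3 and 4 (a minimal sketch assuming numpy; the rates lam and mu are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
lam, mu, trials = 0.8, 0.3, 500_000
X1 = rng.exponential(1 / lam, size=trials)  # Adam's first catch time
Y1 = rng.exponential(1 / mu, size=trials)   # Eve's first catch time

print(np.mean(X1 < Y1), lam / (lam + mu))  # part 3: first fish is Adam's
print(np.maximum(X1, Y1).mean(), 1 / lam + 1 / mu - 1 / (lam + mu))  # part 4
```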

2016 Paper 1 Q13

An internet tester sends \(n\) e-mails simultaneously at time \(t=0\). Their arrival times at their destinations are independent random variables each having probability density function \(\lambda \e^{-\lambda t}\) (\(0\le t<\infty\), \( \lambda >0\)).

  1. The random variable \(T\) is the time of arrival of the e-mail that arrives first at its destination. Show that the probability density function of \(T\) is \[ n \lambda \e^{-n\lambda t}\,,\] and find the expected value of \(T\).
  2. Write down the probability that the second e-mail to arrive at its destination arrives later than time \(t\) and hence derive the density function for the time of arrival of the second e-mail. Show that the expected time of arrival of the second e-mail is \[ \frac{1}{\lambda} \left( \frac1{n-1} + \frac 1 n \right) \]


Solution:

  1. \(\,\) \begin{align*} && \mathbb{P}(T > t) &= \mathbb{P}(\text{all emails slower than }t) \\ &&&= \left ( \int_t^{\infty} \lambda e^{-\lambda x} \d x \right)^n \\ &&&= \left ( [- e^{-\lambda x}]_t^\infty\right)^n\\ &&&= e^{-n\lambda t} \\ \Rightarrow && f_T(t) &= n \lambda e^{-n\lambda t} \\ \end{align*} Therefore \(T \sim \text{Exp}(n \lambda)\) and \(\E[T] = \frac{1}{n \lambda}\)
  2. Let \(T_2\) be the time until the second email arrives; then \begin{align*} && \P(T_2 > t) &= \P(\text{all emails} > t) + \P(\text{all but one email} > t) \\ &&&= e^{-n\lambda t} + n \cdot e^{-(n-1)\lambda t}(1-e^{-\lambda t}) \\ &&&= (1-n)e^{-n\lambda t} + n \cdot e^{-(n-1)\lambda t} \\ \Rightarrow && f_{T_2}(t) &= - \left ( (n-1) n \lambda e^{-n \lambda t} -n(n-1)\lambda e^{-(n-1)\lambda t} \right) \\ &&&= n(n-1) \lambda \left (e^{-(n-1)\lambda t} - e^{-n\lambda t} \right) \\ \Rightarrow && \E[T_2] &= \int_0^{\infty} t \cdot n(n-1) \lambda \left (e^{-(n-1)\lambda t} - e^{-n\lambda t} \right) \d t \\ &&&= \int_0^{\infty} \left (n \cdot t (n-1) \lambda e^{-(n-1)\lambda t} -(n-1)\cdot tn \lambda e^{-n\lambda t} \right) \d t \\ &&&= \frac{n}{\lambda(n-1)} - \frac{n-1}{\lambda n} \\ &&&= \frac{1}{\lambda} \left (1+\frac{1}{n-1}- \left (1 - \frac{1}{n} \right) \right) \\ &&&= \frac{1}{\lambda} \left ( \frac{1}{n-1} + \frac{1}{n} \right) \end{align*} (Alternatively, \(\E[T_2]\) is the expected time to the first arrival, \(\frac{1}{n\lambda}\), plus the expected further wait for the first of the remaining \(n-1\) emails, \(\frac{1}{(n-1)\lambda}\); the memorylessness property of the exponential distribution justifies adding these.)
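Both expectations can be checked by simulating the order statistics directly (a minimal sketch assuming numpy; lam and n are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n, trials = 2.0, 5, 400_000
arrivals = np.sort(rng.exponential(1 / lam, size=(trials, n)), axis=1)

print(arrivals[:, 0].mean(), 1 / (n * lam))                # E(T), first email
print(arrivals[:, 1].mean(), (1 / (n - 1) + 1 / n) / lam)  # E(T_2), second email
```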

2015 Paper 2 Q13

The maximum height \(X\) of flood water each year on a certain river is a random variable with probability density function \(\f\) given by \[ \f(x) = \begin{cases} \lambda \e^{-\lambda x} & \text{for \(x\ge0\)}\,, \\ 0 & \text{otherwise,} \end{cases} \] where \(\lambda\) is a positive constant. It costs \(ky\) pounds each year to prepare for flood water of height \(y\) or less, where \(k\) is a positive constant and \(y\ge0\). If \(X \le y\) no further costs are incurred but if \(X> y\) the additional cost of flood damage is \(a(X - y )\) pounds where \(a\) is a positive constant.

  1. Let \(C\) be the total cost of dealing with the floods in the year. Show that the expectation of \(C\) is given by \[\mathrm{E}(C)=ky+\frac{a}{\lambda}\mathrm{e}^{-\lambda y} \, . \] How should \(y\) be chosen in order to minimise \(\mathrm{E}(C)\), in the different cases that arise according to the value of \(a/k\)?
  2. Find the variance of \(C\), and show that the more that is spent on preparing for flood water in advance the smaller this variance.


Solution:

  1. \(\,\) \begin{align*} && \mathbb{E}(C) &= \int_0^\infty \text{cost}(x) f(x) \d x \\ &&&= ky + \int_y^{\infty} a(x-y) \lambda e^{-\lambda x} \d x\\ u = x - y: &&&= ky + \int_0^{\infty} a u \lambda e^{-\lambda u -\lambda y} \d u \\ &&&= ky + ae^{-\lambda y} \left( \left [ -ue^{-\lambda u} \right]_0^\infty +\int_0^\infty e^{-\lambda u} \d u\right) \\ &&&= ky + \frac{a}{\lambda}e^{-\lambda y} \\ \\ && \frac{\d \mathbb{E}(C)}{\d y} &= k - ae^{-\lambda y} \\ \Rightarrow && y &= \frac{1}{\lambda}\ln \left ( \frac{a}{k} \right) \end{align*} Since \(\mathbb{E}(C)\) is increasing for large \(y\), this stationary point is the minimum provided \(\frac{a}{k} > 1\). Otherwise \(\frac{\d \mathbb{E}(C)}{\d y} = k - ae^{-\lambda y} \geq k - a \geq 0\) for all \(y \geq 0\), so \(\mathbb{E}(C)\) is minimised at \(y = 0\): spend nothing on flood defenses.
  2. \begin{align*} && \mathbb{E}(C^2) &= \int_0^{\infty} \text{cost}(x)^2 f(x) \d x \\ &&&= \int_0^{\infty}(ky + a(x-y)\mathbb{1}_{x > y})^2 f(x) \d x \\ &&&= k^2y^2 + \int_y^{\infty}2kya(x-y)f(x)\d x + \int_y^{\infty}a^2 (x-y)^2 f(x) \d x \\ &&&= k^2y^2 + \frac{2kya}{\lambda}e^{- \lambda y}+a^2e^{-\lambda y}\int_{u=0}^\infty u^2 \lambda e^{-\lambda u} \d u \\ &&&= k^2y^2 + \frac{2kya}{\lambda}e^{-\lambda y}+a^2e^{-\lambda y} \left (\textrm{Var}(\textrm{Exp}(\lambda)) + \mathbb{E}(\textrm{Exp}(\lambda))^2 \right )\\ &&&= k^2y^2 + \frac{2kya}{\lambda}e^{-\lambda y} + a^2e^{-\lambda y} \frac{2}{\lambda^2} \\ && \textrm{Var}(C) &= k^2y^2 + \frac{2kya}{\lambda}e^{-\lambda y} + a^2e^{-\lambda y} \frac{2}{\lambda^2} - \left ( ky + \frac{a}{\lambda} e^{-\lambda y}\right)^2 \\ &&&= a^2e^{-\lambda y} \frac{2}{\lambda^2} - a^2 e^{-2\lambda y}\frac{1}{\lambda^2} \\ &&&= \frac{a^2}{\lambda^2} e^{-\lambda y}\left (2 - e^{-\lambda y} \right) \\ \\ && \frac{\d \textrm{Var}(C)}{\d y} &= \frac{a^2}{\lambda^2} \left (-2\lambda e^{-\lambda y} +2\lambda e^{-2\lambda y} \right) \\ &&&= \frac{2a^2}{\lambda} e^{-\lambda y}\left (e^{-\lambda y}-1 \right) \leq 0 \end{align*} so \(\textrm{Var}(C)\) is decreasing in \(y\).
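A simulation check of the expectation and variance formulas (a minimal sketch assuming numpy; lam, k, a and the preparation height y are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
lam, k, a, y, trials = 1.0, 1.0, 5.0, 0.9, 500_000
X = rng.exponential(1 / lam, size=trials)
C = k * y + a * np.maximum(X - y, 0)  # preparation cost plus any flood damage

print(C.mean(), k * y + (a / lam) * np.exp(-lam * y))                       # E(C)
print(C.var(), (a / lam) ** 2 * np.exp(-lam * y) * (2 - np.exp(-lam * y)))  # Var(C)
print(np.log(a / k) / lam)  # optimal y, since a/k > 1 here
```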

2014 Paper 2 Q12

The lifetime of a fly (measured in hours) is given by the continuous random variable \(T\) with probability density function \(f(t)\) and cumulative distribution function \(F(t)\). The hazard function, \(h(t)\), is defined, for \(F(t) < 1\), by \[ h(t) = \frac{f(t)}{1-F(t)}\,. \]

  1. Given that the fly lives to at least time \(t\), show that the probability of its dying within the following \(\delta t\) is approximately \(h (t) \, \delta t\) for small values of \(\delta t\).
  2. Find the hazard function in the case \(F(t) = t/a\) for \(0< t < a\). Sketch \(f(t)\) and \(h(t)\) in this case.
  3. The random variable \(T\) is distributed on the interval \(t > a\), where \(a>0\), and its hazard function is \(t^{-1}\). Determine the probability density function for \(T\).
  4. Show that \(h(t)\) is constant for \(t > b\) and zero otherwise if and only if \(f(t) =ke^{-k(t-b)}\) for \(t > b\), where \(k\) is a positive constant.
  5. The random variable \(T\) is distributed on the interval \(t > 0\) and its hazard function is given by \[ h(t) = \left(\frac{\lambda}{\theta^\lambda}\right)t^{\lambda-1}\,, \] where \(\lambda\) and \(\theta\) are positive constants. Find the probability density function for \(T\).


Solution:

  1. \(\,\) \begin{align*} && \mathbb{P}(t < T \leq t + \delta t \mid T > t) &= \frac{\mathbb{P}(t < T \leq t + \delta t)}{\mathbb{P}(T > t )} \\ &&&= \frac{\int_t^{t+\delta t} f(s) \d s}{1-F(t)} \\ &&&\approx \frac{f(t)\delta t}{1-F(t)} \\ &&&= h(t) \delta t \end{align*}
  2. If \(F(t) = t/a\) then \(f(t) = 1/a\) and \(h(t) = \frac{1/a}{1-t/a} = \frac{1}{a-t}\)
    [TikZ diagram: sketches of \(f(t) = 1/a\), constant on \((0,a)\), and \(h(t) = \frac{1}{a-t}\), which increases to infinity as \(t \to a\)]
  3. \(\,\) \begin{align*} && \frac{F'}{1-F} &= \frac{1}{t} \\ \Rightarrow && -\ln (1-F) &= \ln t + C\\ \Rightarrow && 1-F &= \frac{A}{t} \\ && F &= 1 - \frac{A}{t} \\ F(a) = 0: && F &= 1 - \frac{a}{t} \\ && f(t) &= \frac{a}{t^2} \end{align*}
  4. (\(\Rightarrow\)) \begin{align*} && \frac{F'}{1-F} &= k \\ \Rightarrow && -\ln(1-F) &= kt+C \\ \Rightarrow && 1-F &= Ae^{-kt} \\ F(b) = 0: && 1 &= Ae^{-kb} \\ \Rightarrow && 1-F &= e^{-k(t-b)}\\ \Rightarrow && f &= ke^{-k(t-b)} \\ \end{align*} (\(\Leftarrow\)) \(f(t) = ke^{-k(t-b)} \Rightarrow F(t) = 1-e^{-k(t-b)}\) and the result is clear.
  5. \(\,\) \begin{align*} && \frac{F'}{1-F} &= \left ( \frac{\lambda}{\theta^{\lambda}} \right) t^{\lambda-1} \\ \Rightarrow && -\ln(1-F) &= \left ( \frac{t}{\theta} \right)^{\lambda} +C\\ \Rightarrow && F &= 1-A\exp \left (- \left ( \frac{t}{\theta} \right)^{\lambda} \right) \\ F(0) = 0: && 0 &= 1-A \\ \Rightarrow && F &= 1 - \exp \left (- \left ( \frac{t}{\theta} \right)^{\lambda} \right) \\ \Rightarrow && f &= \lambda t^{\lambda -1} \theta^{-\lambda} \exp \left (- \left ( \frac{t}{\theta} \right)^{\lambda} \right) \end{align*}
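The part 5 answer (a Weibull density) can be sanity-checked against the definition of the hazard function on a grid (a minimal numeric sketch assuming numpy; lam and theta are illustrative):

```python
import numpy as np

lam, theta = 2.5, 1.3
t = np.linspace(0.1, 3.0, 30)
F = 1 - np.exp(-(t / theta) ** lam)
f = lam * t ** (lam - 1) / theta ** lam * np.exp(-(t / theta) ** lam)
h = (lam / theta ** lam) * t ** (lam - 1)
print(np.allclose(f / (1 - F), h))  # True: f/(1-F) recovers the given hazard
```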

2005 Paper 1 Q14

The random variable \(X\) can take the value \(X=-1\), and also any value in the range \(0\le X <\infty\,\). The distribution of \(X\) is given by \[ \P(X=-1) =m \,, \ \ \ \ \ \ \ \P(0\le X\le x) = k(1-\e^{-x})\,, \] for any non-negative number \(x\), where \(k\) and \(m\) are constants, and \(m <\frac12\,\).

  1. Find \(k\) in terms of \(m\).
  2. Show that \(\E(X)= 1-2m\,\).
  3. Find, in terms of \(m\), \(\var (X)\) and the median value of \(X\).
  4. Given that \[ \int_0^\infty y^2 \e^{-y^2} \d y = \tfrac14 \sqrt{ \pi}\;,\] find \(\E\big(\vert X \vert^{\frac12}\big)\,\) in terms of \(m\).


Solution:

  1. The total probability must equal \(1\): letting \(x \to \infty\) gives \(m + k = 1\), so \(k = 1-m\).
  2. \(\,\) \begin{align*} && \E[X] &= \mathbb{P}(X=-1) \cdot (-1) + \int_0^{\infty} kx e^{-x} \d x \\ &&&= -m + (1-m) = 1-2m \end{align*}
  3. \(\,\) \begin{align*} && \var[X] &= \E[X^2]-\E[X]^2 \\ &&&= \mathbb{P}(X=-1)\cdot(-1)^2 + (1-m)\int_0^{\infty} x^2e^{-x} \d x - (1-2m)^2 \\ &&&= m + (1-m)(1+1^2) - (1-2m)^2 \\ &&&= m + 2 - 2m - 1 + 4m - 4m^2 \\ &&&= 1 + 3m - 4m^2 \\ &&&= (1-m)(1+4m) \end{align*} To find the median \(q\), we need \begin{align*} && \frac12 &= \mathbb{P}(X \leq q) \\ &&&= m + (1-m)(1-e^{-q}) \\ \Rightarrow && e^{-q} &= 1-\frac{\frac12-m}{1-m} \\ &&&= \frac{1-m - \frac12+m}{1-m} \\ &&&= \frac{1}{2(1-m)} \\ \Rightarrow && q &= \ln \big( 2(1-m) \big) \end{align*}
  4. \(\,\) \begin{align*} && \E\left [|X|^{\frac12}\right] &= \mathbb{P}(X=-1) \cdot 1 + \int_0^{\infty} \sqrt{x} (1-m)e^{-x} \d x \\ &&&= m + (1-m)\int_0^\infty \sqrt{x} e^{-x} \d x \\ u^2 = x, \d x = 2u \d u : &&&= m + (1-m) \int_{u=0}^{u=\infty} u e^{-u^2} \cdot 2u \d u \\ &&&= m + 2(1-m) \int_0^{\infty} u^2 e^{-u^2} \d u \\ &&&= m + (1-m)\frac{\sqrt{\pi}}2 \end{align*}
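All four answers can be checked by simulating the mixed distribution (a minimal sketch assuming numpy; m is an illustrative value with \(m < \frac12\)):

```python
import numpy as np

rng = np.random.default_rng(5)
m, trials = 0.3, 500_000
# X = -1 with probability m, otherwise Exp(1)
X = np.where(rng.random(trials) < m, -1.0, rng.exponential(1.0, size=trials))

print(X.mean(), 1 - 2 * m)                # part 2
print(X.var(), (1 - m) * (1 + 4 * m))     # part 3, variance
print(np.median(X), np.log(2 * (1 - m)))  # part 3, median
print(np.mean(np.abs(X) ** 0.5), m + (1 - m) * np.sqrt(np.pi) / 2)  # part 4
```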

2003 Paper 3 Q12

Brief interruptions to my work occur on average every ten minutes and the number of interruptions in any given time period has a Poisson distribution. Given that an interruption has just occurred, find the probability that I will have less than \(t\) minutes to work before the next interruption. If the random variable \(T\) is the time I have to work before the next interruption, find the probability density function of \(T\,\). I need an uninterrupted half hour to finish an important paper. Show that the expected number of interruptions before my first uninterrupted period of half an hour or more is \(\e^3-1\). Find also the expected length of time between interruptions that are less than half an hour apart. Hence write down the expected wait before my first uninterrupted period of half an hour or more.
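The claimed expectation \(\e^3-1\) can be checked by simulation: each gap between interruptions is exponential with mean 10 minutes, so a gap reaches half an hour with probability \(\e^{-3}\) (a minimal sketch assuming numpy; 200 gaps per trial is effectively always enough):

```python
import numpy as np

rng = np.random.default_rng(6)
gaps = rng.exponential(10.0, size=(50_000, 200))  # gaps between interruptions, minutes
first_long = np.argmax(gaps >= 30.0, axis=1)      # interruptions before first 30-min gap
print(first_long.mean(), np.exp(3) - 1)           # both approximately 19.09
```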

2002 Paper 1 Q14

In order to get money from a cash dispenser I have to punch in an identification number. I have forgotten my identification number, but I do know that it is equally likely to be any one of the integers \(1\), \(2\), \ldots , \(n\). I plan to punch in integers in order until I get the right one. I can do this at the rate of \(r\) integers per minute. As soon as I punch in the first wrong number, the police will be alerted. The probability that they will arrive within a time \(t\) minutes is \(1-\e^{-\lambda t}\), where \(\lambda\) is a positive constant. If I follow my plan, show that the probability of the police arriving before I get my money is \[ \sum_{k=1}^n \frac{1-\e^{-\lambda(k-1)/r}}n\;. \] Simplify the sum. On past experience, I know that I will be so flustered that I will just punch in possible integers at random, without noticing which I have already tried. Show that the probability of the police arriving before I get my money is \[ 1-\frac1{n-(n-1)\e^{-\lambda/r}} \;. \]
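Since the sum is geometric, one closed form for it is \(1 - \dfrac{1-\e^{-\lambda n/r}}{n(1-\e^{-\lambda/r})}\); a quick numeric check of this simplification (a minimal sketch assuming numpy; lam, r, n are illustrative):

```python
import numpy as np

lam, r, n = 0.5, 3.0, 7
k = np.arange(1, n + 1)
s = np.sum((1 - np.exp(-lam * (k - 1) / r)) / n)                   # the stated sum
closed = 1 - (1 - np.exp(-lam * n / r)) / (n * (1 - np.exp(-lam / r)))
print(np.isclose(s, closed))  # True
```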

2002 Paper 3 Q13

A continuous random variable is said to have an exponential distribution with parameter \(\lambda\) if its density function is \(\f(t) = \lambda \e ^{- \lambda t} \; \l 0 \le t < \infty \r\,\). If \(X_1\) and \(X_2\), which are independent random variables, have exponential distributions with parameters \(\lambda_1\) and \(\lambda_2\) respectively, find an expression for the probability that either \(X_1\) or \(X_2\) (or both) is less than \(x\). Prove that if \(X\) is the random variable whose value is the lesser of the values of \(X_1\) and \(X_2\), then \(X\) also has an exponential distribution. Route A and Route B buses run from my house to my college. The time between buses on each route has an exponential distribution and the mean time between buses is 15 minutes for Route A and 30 minutes for Route B. The timings of the buses on the two routes are independent. If I emerge from my house one day to see a Route A bus and a Route B bus just leaving the stop, show that the median wait for the next bus to my college will be approximately 7 minutes.
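The final claim is quick to check by simulation: the wait for the next bus is the minimum of two exponentials with means 15 and 30 minutes, so by the result just proved it is exponential with rate \(\frac1{15}+\frac1{30}=\frac1{10}\) and median \(10\ln 2 \approx 6.93\) (a minimal sketch assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(7)
trials = 500_000
wait = np.minimum(rng.exponential(15.0, trials), rng.exponential(30.0, trials))
print(np.median(wait), 10 * np.log(2))  # both ~ 6.93, i.e. about 7 minutes
```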

1998 Paper 2 Q13

A random variable \(X\) has the probability density function \[ \mathrm{f}(x)=\begin{cases} \lambda\mathrm{e}^{-\lambda x} & x\geqslant0,\\ 0 & x<0. \end{cases} \] Show that $${\rm P}(X>s+t\,\vert X>t) = {\rm P}(X>s).$$ The time it takes an assistant to serve a customer in a certain shop is a random variable with the above distribution and the times for different customers are independent. If, when I enter the shop, the only two assistants are serving one customer each, what is the probability that these customers are both still being served at time \(t\) after I arrive? One of the assistants finishes serving his customer and immediately starts serving me. What is the probability that I am still being served when the other customer has finished being served?


Solution: \begin{align*} && \mathbb{P}(X > t) &= \int_t^{\infty} \lambda e^{-\lambda x} \d x\\ &&&= \left[ -e^{-\lambda x} \right]_t^\infty \\ &&&= e^{-\lambda t}\\ \\ && \mathbb{P}(X > s + t | X > t) &= \frac{\mathbb{P}(X > s + t)}{\mathbb{P}(X > t)} \\ &&&= \frac{e^{-(s+t)\lambda}}{e^{-t\lambda}} \\ &&&= e^{-s\lambda} = \mathbb{P}(X > s) \end{align*} The probability both are still being served (independently) is \(\mathbb{P}(X > t)^2 = e^{-2\lambda t}\). The probability is exactly \(\frac12\): the property proved in the first part of the question shows the distribution is memoryless, so once the assistant starts serving me, my service time and the other customer's remaining service time are samples from the same distribution. Therefore we are equally likely to finish first.
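A simulation check (a minimal sketch assuming numpy; lam and t are illustrative):

```python
import numpy as np

rng = np.random.default_rng(8)
lam, t, trials = 0.7, 1.0, 500_000
A = rng.exponential(1 / lam, trials)  # remaining service time, customer 1
B = rng.exponential(1 / lam, trials)  # remaining service time, customer 2

print(np.mean((A > t) & (B > t)), np.exp(-2 * lam * t))  # both still served at t

# by memorylessness my service time is a fresh Exp(lam), independent of B
me = rng.exponential(1 / lam, trials)
print(np.mean(me > B))  # ~ 1/2: still being served when the other finishes
```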

1997 Paper 1 Q14

The maximum height \(X\) of flood water each year on a certain river is a random variable with density function \begin{equation*} {\mathrm f}(x)= \begin{cases} \exp(-x)&\text{if \(x\geqslant 0\),}\\ 0&\text{otherwise}. \end{cases} \end{equation*} It costs \(y\) megadollars each year to prepare for flood water of height \(y\) or less. If \(X\leqslant y\) no further costs are incurred but if \(X\geqslant y\) the cost of flood damage is \(r+s(X-y)\) megadollars where \(r,s>0\). The total cost \(T\) megadollars is thus given by \begin{equation*} T= \begin{cases} y&\text{if \(X\leqslant y\)},\\ y+r+s(X-y)&\text{if \(X>y\)}. \end{cases} \end{equation*} Show that we can minimise the expected total cost by taking \[y=\ln(r+s).\]
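A numeric check: splitting the expectation as in the 2015 Paper 2 Q13 solution gives \(\mathrm{E}(T) = y + (r+s)\e^{-y}\), which a grid search minimises at \(y=\ln(r+s)\) (a minimal sketch assuming numpy; r and s are illustrative):

```python
import numpy as np

r, s = 4.0, 2.5
y = np.linspace(0.0, 5.0, 100_001)
ET = y + (r + s) * np.exp(-y)           # E(T) as a function of y
print(y[np.argmin(ET)], np.log(r + s))  # both ~ 1.8718
```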

1996 Paper 2 Q12

  1. Let \(X_{1}, X_{2}, \dots, X_{n}\) be independent random variables each of which is uniformly distributed on \([0,1]\). Let \(Y\) be the largest of \(X_{1}, X_{2}, \dots, X_{n}\). By using the fact that \(Y<\lambda\) if and only if \(X_{j}<\lambda\) for \(1\leqslant j\leqslant n\), find the probability density function of \(Y\). Show that the variance of \(Y\) is \[\frac{n}{(n+2)(n+1)^{2}}.\]
  2. The probability that a neon light switched on at time \(0\) will have failed by a time \(t>0\) is \(1-\mathrm{e}^{-t/\lambda}\) where \(\lambda>0\). I switch on \(n\) independent neon lights at time zero. Show that the expected time until the first failure is \(\lambda/n\).


Solution:

  1. \(\,\) \begin{align*} && F_Y(\lambda) &= \mathbb{P}(Y < \lambda) \\ &&&= \prod_i \mathbb{P}(X_i < \lambda) \\ &&&= \lambda^n \\ \Rightarrow && f_Y(\lambda) &= \begin{cases} n \lambda^{n-1} & \text{if } 0 \leq \lambda \leq 1 \\ 0 & \text{otherwise} \end{cases} \\ \\ && \E[Y] &= \int_0^1 \lambda f_Y(\lambda) \d \lambda \\ &&&= \int_0^1 n \lambda^n \d \lambda \\ &&&= \frac{n}{n+1} \\ && \E[Y^2] &= \int_0^1 \lambda^2 f_Y(\lambda) \d \lambda \\ &&&= \int_0^1 n \lambda^{n+1} \d \lambda \\ &&&= \frac{n}{n+2} \\ \Rightarrow && \var[Y] &= \E[Y^2]-(\E[Y])^2 \\ &&&= \frac{n}{n+2} - \frac{n^2}{(n+1)^2} \\ &&&= \frac{(n+1)^2n-n^2(n+2)}{(n+2)(n+1)^2} \\ &&&= \frac{n[(n^2+2n+1)-(n^2+2n)]}{(n+2)(n+1)^2} \\ &&&= \frac{n}{(n+2)(n+1)^2} \end{align*}
  2. Using the same reasoning, we can see that \begin{align*} && 1-F_Z(t) &= \mathbb{P}(\text{all lights still on after } t) \\ &&&= \prod_i e^{-t/\lambda} \\ &&&= e^{-nt/\lambda} \\ \\ \Rightarrow && F_Z(t) &= 1-e^{-nt/\lambda} \end{align*} Therefore \(Z \sim \textrm{Exp}(\frac{n}{\lambda})\) and the expected time to first failure is \(\lambda/n\).
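A simulation check of both parts (a minimal sketch assuming numpy; n and lam are illustrative; in part 2, lam is the mean lifetime):

```python
import numpy as np

rng = np.random.default_rng(9)
n, lam, trials = 6, 2.0, 400_000

Y = rng.random((trials, n)).max(axis=1)       # largest of n uniforms on [0, 1]
print(Y.var(), n / ((n + 2) * (n + 1) ** 2))  # part 1

first = rng.exponential(lam, (trials, n)).min(axis=1)  # first of n lights to fail
print(first.mean(), lam / n)                  # part 2
```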

1996 Paper 3 Q14

Whenever I go cycling I start with my bike in good working order. However if all is well at time \(t\), the probability that I get a puncture in the small interval \((t,t+\delta t)\) is \(\alpha\,\delta t.\) How many punctures can I expect to get on a journey during which my total cycling time is \(T\)? When I get a puncture I stop immediately to repair it and the probability that, if I am repairing it at time \(t\), the repair will be completed in time \((t,t+\delta t)\) is \(\beta\,\delta t.\) If \(p(t)\) is the probability that I am repairing a puncture at time \(t\), write down an equation relating \(p(t)\) to \(p(t+\delta t)\), and derive from this a differential equation relating \(p'(t)\) and \(p(t).\) Show that \[ p(t)=\frac{\alpha}{\alpha+\beta}(1-\mathrm{e}^{-(\alpha+\beta)t}) \] satisfies this differential equation with the appropriate initial condition. Find an expression, involving \(\alpha,\beta\) and \(T\), for the time expected to be spent mending punctures during a journey of total time \(T\). Hence, or otherwise, show that, the fraction of the journey expected to be spent mending punctures is given approximately by \[ \quad\frac{\alpha T}{2}\quad\ \mbox{ if }(\alpha+\beta)T\text{ is small, } \] and by \[ \frac{\alpha}{\alpha+\beta}\quad\mbox{ if }(\alpha+\beta)T\text{ is large.} \]
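A numeric check (a minimal sketch assuming numpy; a and b stand for \(\alpha\) and \(\beta\) and are illustrative). It verifies that the given \(p(t)\) satisfies \(p'(t) = \alpha(1-p(t)) - \beta p(t)\) with \(p(0)=0\), and, taking the expected repair time to be \(\int_0^T p(t)\,\mathrm{d}t\), checks the two limiting fractions:

```python
import numpy as np

a, b = 0.2, 1.5
t = np.linspace(0.0, 10.0, 101)
p = a / (a + b) * (1 - np.exp(-(a + b) * t))
p_dot = a * np.exp(-(a + b) * t)                # exact derivative of p(t)
print(np.allclose(p_dot, a * (1 - p) - b * p))  # True (and p[0] == 0)

def frac(T):  # expected fraction of the journey spent mending punctures
    return a / (a + b) * (1 - (1 - np.exp(-(a + b) * T)) / ((a + b) * T))

print(frac(0.01), a * 0.01 / 2)   # small (a+b)T: approximately aT/2
print(frac(1000.0), a / (a + b))  # large (a+b)T: approximately a/(a+b)
```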

1994 Paper 1 Q14

Each of my \(n\) students has to hand in an essay to me. Let \(T_{i}\) be the time at which the \(i\)th essay is handed in and suppose that \(T_{1},T_{2},\ldots,T_{n}\) are independent, each with probability density function \(\lambda\mathrm{e}^{-\lambda t}\) (\(t\geqslant0\)). Let \(T\) be the time I receive the first essay to be handed in and let \(U\) be the time I receive the last one.

  1. Find the mean and variance of \(T_{i}.\)
  2. Show that \(\mathrm{P}(U\leqslant u)=(1-\mathrm{e}^{-\lambda u})^{n}\) for \(u\geqslant0,\) and hence find the probability density function of \(U\).
  3. Obtain \(\mathrm{P}(T>t),\) and hence find the probability density function of \(T\).
  4. Write down the mean and variance of \(T\).


Solution:

  1. \(T_i \sim \textrm{Exp}(\lambda)\) so \(\E[T_i] = \lambda^{-1}, \var[T_i] = \lambda^{-2}\)
  2. \(\,\) \begin{align*} && \mathbb{P}(U \leq u) &= \mathbb{P}(T_i \leq u\quad \forall i) \\ &&&= \prod \mathbb{P}(T_i \leq u) \\ &&&= \prod \int_0^u \lambda e^{-\lambda t} \d t \\ &&&= (1-e^{-\lambda u})^n \\ \\ \Rightarrow && f_U(u) &= n\lambda e^{-\lambda u}(1-e^{-\lambda u})^{n-1} \end{align*}
  3. \(\,\) \begin{align*} && \mathbb{P}(T > t) &= \mathbb{P}(T_i > t \quad \forall i) \\ &&&= \prod \mathbb{P}(T_i > t) \\ &&&= e^{-n\lambda t} \\ \Rightarrow && f_T(t) &= n\lambda e^{-n\lambda t} \end{align*}
  4. Therefore \(\E[T] = \frac{1}{n\lambda}, \var[T] = \frac{1}{(n\lambda)^2}\)
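A simulation check (a minimal sketch assuming numpy; lam and n are illustrative):

```python
import numpy as np

rng = np.random.default_rng(10)
lam, n, trials = 1.2, 8, 400_000
Ti = rng.exponential(1 / lam, size=(trials, n))  # hand-in times
T, U = Ti.min(axis=1), Ti.max(axis=1)

print(T.mean(), 1 / (n * lam))      # part 4, mean of T
print(T.var(), 1 / (n * lam) ** 2)  # part 4, variance of T
print(np.mean(U <= 1.0), (1 - np.exp(-lam)) ** n)  # part 2 at u = 1
```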