Problems

8 problems found

2016 Paper 3 Q12
D: 1700.0 B: 1516.0

Let \(X\) be a random variable with mean \(\mu\) and standard deviation \(\sigma\). Chebyshev's inequality, which you may use without proof, is \[ \P\left(\vert X-\mu\vert > k\sigma\right) \le \frac 1 {k^2} \,, \] where \(k\) is any positive number.

  1. The probability of a biased coin landing heads up is \(0.2\). It is thrown \(100n\) times, where \(n\) is an integer greater than 1. Let \(\alpha \) be the probability that the coin lands heads up \(N\) times, where \(16n \le N \le 24n\). Use Chebyshev's inequality to show that \[ \alpha \ge 1-\frac 1n \,. \]
  2. Use Chebyshev's inequality to show that \[ 1+ n + \frac{n^2}{ 2!} + \cdots + \frac {n^{2n}}{(2n)!} \ge \left(1-\frac1n\right) \e^n \,. \]


Solution:

  1. Let \(N\) be the number of times the coin lands heads up, i.e. \(N \sim \mathrm{Binomial}(100n, 0.2)\); then \(\mathbb{E}(N) = \mu = 20n\) and \(\mathrm{Var}(N) = \sigma^2 = 100n \cdot 0.2 \cdot 0.8 = 16n\), so \(\sigma = 4\sqrt{n}\). Apply Chebyshev's inequality with \(k = \sqrt{n}\): \begin{align*} && \mathbb{P}(|N - \mu| > k\sigma) &\leq \frac{1}{k^2} \\ \Rightarrow && 1 - \mathbb{P}(|N - \mu| \leq k\sigma) &\leq \frac1{k^2} \\ \Rightarrow && 1 - \mathbb{P}(|N - 20n| \leq \sqrt{n} \cdot 4\sqrt{n}) &\leq \frac1{(\sqrt{n})^2} \\ \Rightarrow && 1 - \mathbb{P}(16n \leq N \leq 24n) &\leq \frac{1}{n} \\ \Rightarrow && \alpha &\geq 1-\frac1n \end{align*}
  2. Suppose \(X \sim \mathrm{Pois}(n)\); then \(\mathbb{E}(X) = n\) and \(\mathrm{Var}(X) = n\). Apply Chebyshev's inequality with \(k = \sqrt{n}\), noting that since \(X \geq 0\), the event \(|X - n| \leq n\) is exactly \(0 \leq X \leq 2n\): \begin{align*} && \mathbb{P}(|X - \mu| > k\sigma) &\leq \frac{1}{k^2} \\ \Rightarrow && 1-\mathbb{P}(|X - n| \leq \sqrt{n} \cdot \sqrt{n}) &\leq \frac{1}{(\sqrt{n})^2} \\ \Rightarrow && 1 - \sum_{i=0}^{2n} \mathbb{P}(X = i) &\leq \frac{1}{n} \\ \Rightarrow && \sum_{i=0}^{2n} e^{-n} \frac{n^i}{i!} &\geq 1 - \frac{1}{n} \\ \Rightarrow && \sum_{i=0}^{2n} \frac{n^i}{i!} &\geq \left ( 1 - \frac1n \right)e^n \end{align*} Both bounds are checked numerically below.
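As a quick sanity check of both parts, here is a minimal script (Python, standard library only; the choice of \(n\) values is illustrative) computing the exact binomial and Poisson tail sums and asserting the two inequalities:

```python
# Exact check of both Chebyshev-derived bounds for a few small n (Python 3.8+).
from math import comb, exp, factorial

for n in (2, 5, 10):
    # Part 1: alpha = P(16n <= N <= 24n), N ~ Binomial(100n, 0.2).
    alpha = sum(comb(100 * n, N) * 0.2**N * 0.8**(100 * n - N)
                for N in range(16 * n, 24 * n + 1))
    assert alpha >= 1 - 1 / n

    # Part 2: sum_{i=0}^{2n} n^i / i!  >=  (1 - 1/n) e^n.
    partial = sum(n**i / factorial(i) for i in range(2 * n + 1))
    assert partial >= (1 - 1 / n) * exp(n)

print("both inequalities hold for n = 2, 5, 10")
```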

2010 Paper 3 Q1
D: 1700.0 B: 1500.8

Let \(x_1\), \(x_2\), \ldots, \(x_n\) and \(x_{n+1}\) be any fixed real numbers. The numbers \(A\) and \(B\) are defined by \[ A = \frac 1 n \sum_{k=1}^n x_k \,, \ \ \ B= \frac 1 n \sum_{k=1}^n (x_k-A)^2 \,, \] and the numbers \(C\) and \(D\) are defined by \[ C = \frac 1 {n+1} \sum_{k=1}^{n+1} x_k \,, \ \ \ D = \frac1{n+1} \sum_{k=1}^{n+1} (x_k-C)^2 \,. \]

  1. Express \(C\) in terms of \(A\), \(x_{n+1}\) and \(n\).
  2. Show that \(\displaystyle B= \frac 1 n \sum_{k=1}^n x_k^2 - A^2\,\).
  3. Express \(D\) in terms of \(B\), \(A\), \(x_{n+1}\) and \(n\). Hence show that \((n + 1)D \ge nB\) for all values of \(x_{n+1}\), but that \(D < B\) if and only if \[ A-\sqrt{\frac{(n+1)B}{n}} < x_{n+1} < A+\sqrt{\frac{(n+1)B}{n}}\,. \]
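No solution is attached to this problem here; the following is a brief sketch. For part 1, \[ C = \frac{1}{n+1}\left(\sum_{k=1}^{n} x_k + x_{n+1}\right) = \frac{nA + x_{n+1}}{n+1}\,. \] For part 2, expand the square: \[ B = \frac1n \sum_{k=1}^n \left(x_k^2 - 2Ax_k + A^2\right) = \frac1n \sum_{k=1}^n x_k^2 - 2A^2 + A^2 = \frac1n \sum_{k=1}^n x_k^2 - A^2\,. \] For part 3, the same identity gives \(D = \frac1{n+1}\sum_{k=1}^{n+1} x_k^2 - C^2\); substituting \(\sum_{k=1}^{n} x_k^2 = n(B+A^2)\) and the expression for \(C\), then simplifying, yields \[ (n+1)D = nB + \frac{n}{n+1}\left(x_{n+1} - A\right)^2 \ge nB\,, \] and \(D < B\) rearranges to \((x_{n+1}-A)^2 < \frac{(n+1)B}{n}\), which is the stated interval.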

2005 Paper 2 Q14
D: 1600.0 B: 1469.5

The probability density function \(\f(x)\) of the random variable \(X\) is given by $$\f(x) = k\left[{\phi}(x) + {\lambda}\g(x)\right]$$ where \({\phi}(x)\) is the probability density function of a normal variate with mean 0 and variance 1, \(\lambda \) is a positive constant, and \(\g(x)\) is a probability density function defined by \[ \g(x)= \begin{cases} 1/\lambda & \mbox{for \(0 \le x \le {\lambda}\)}\,;\\ 0& \mbox{otherwise} . \end{cases} \] Find \(\mu\), the mean of \(X\), in terms of \(\lambda\), and prove that \(\sigma\), the standard deviation of \(X\), satisfies $$\sigma^2 = \frac{\lambda^4 +4{\lambda}^3+12{\lambda}+12} {12(1 + \lambda )^2}\;.$$ In the case \(\lambda=2\):

  1. draw a sketch of the curve \(y=\f(x)\);
  2. express the cumulative distribution function of \(X\) in terms of \(\Phi(x)\), the cumulative distribution function corresponding to \(\phi(x)\);
  3. evaluate \(\P(0 < X < \mu+2\sigma)\), given that \(\Phi (\frac 23 + \frac23 \surd7)=0.9921\).


Solution: \begin{align*} && 1 &= \int_{-\infty}^{\infty} f(x) \d x \\ &&&= k[1 + \lambda] \\ \Rightarrow && k &= \frac{1}{1+\lambda} \\ \\ && \mu &= \int_{-\infty}^\infty x f(x) \d x \\ &&&= k \int_{-\infty}^\infty x \phi(x) \d x + k \lambda \int_{-\infty}^{\infty} x g(x) \d x \\ &&&= k \cdot 0 + k \lambda \cdot \frac{\lambda}{2} \\ &&&= \frac{\lambda^2}{2(1+\lambda)} \\ \\ && \E[X^2] &= \int_{-\infty}^\infty x^2 f(x) \d x \\ &&&= k \int_{-\infty}^\infty x^2 \phi(x) \d x + k \lambda \int_{-\infty}^{\infty} x^2 g(x) \d x \\ &&&= k \cdot 1 + k \lambda \int_0^{\lambda} \frac{x^2}{\lambda} \d x \\ &&&= k + \frac{k \lambda^3}{3} \\ &&&= \frac{3+\lambda^3}{3(1+\lambda)} \\ && \var[X] &= \frac{3+\lambda^3}{3(1+\lambda)} - \frac{\lambda^4}{4(1+\lambda)^2} \\ &&&= \frac{(3+\lambda^3)4(1+\lambda) - 3\lambda^4}{12(1+\lambda)^2} \\ &&&= \frac{\lambda^4+4\lambda^3+12\lambda + 12}{12(1+\lambda)^2} \end{align*}

  1. \(\,\)
    TikZ diagram
  2. \(\,\) \begin{align*} && \mathbb{P}(X \leq x) &= \int_{-\infty}^x f(t) \d t \\ &&&= \begin{cases} \frac13 \Phi(x) & \text{if } x < 0 \\ \frac13\Phi(x) + \frac13x & \text{if } 0 \leq x \leq 2 \\ \frac13 \Phi(x) + \frac23 & \text{if } 2 < x \end{cases} \end{align*} When \(\lambda = 2\), \(\mu = \frac{4}{6} = \frac23\), \(\sigma^2 = \frac{16+32+24+12}{12 \cdot 9} = \frac{7}{9}\), so \(\mu + 2 \sigma = \frac23 + \frac{2\sqrt7}{3}>2\). Therefore \begin{align*} && \P(0 < X < \mu + 2\sigma) &= \frac13 \Phi\left (\frac{2+2\sqrt{7}}{3} \right) + \frac23 - \frac13\Phi(0) \\ &&&= \tfrac13 \cdot 0.9921 +\tfrac23 - \tfrac16 \\ &&&= 0.8307 \end{align*} A numerical cross-check follows.
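A minimal cross-check (Python, standard library; \(\Phi\) built from the error function). Note the script uses the true value \(\Phi\!\left(\frac{2+2\sqrt7}{3}\right) \approx 0.9925\) rather than the quoted table value \(0.9921\), so it prints \(0.8308\) rather than \(0.8307\):

```python
# Check P(0 < X < mu + 2 sigma) for lambda = 2, with Phi(x) = (1 + erf(x/sqrt 2))/2.
from math import erf, sqrt

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(x / sqrt(2)))

lam = 2
k = 1 / (1 + lam)                # 1/3
mu = lam**2 / (2 * (1 + lam))    # 2/3
var = (lam**4 + 4 * lam**3 + 12 * lam + 12) / (12 * (1 + lam)**2)  # 7/9
upper = mu + 2 * sqrt(var)       # 2/3 + 2*sqrt(7)/3 > 2

# Normal component contributes k*(Phi(upper) - Phi(0)); the uniform component
# has density k*lam*(1/lam) = k on [0, lam], contributing k*min(upper, lam).
p = k * (Phi(upper) - Phi(0)) + k * min(upper, lam)
print(round(p, 4))               # 0.8308
```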

2003 Paper 2 Q13
D: 1600.0 B: 1469.5

The random variable \(X\) takes the values \(k=1\), \(2\), \(3\), \(\dotsc\), and has probability distribution $$ \P(X=k)= A{{{\lambda}^k\e^{-{\lambda}}} \over {k!}}\,, $$ where \(\lambda \) is a positive constant. Show that \(A = (1-\e^{-\lambda})^{-1}\,\). Find the mean \({\mu}\) in terms of \({\lambda}\) and show that $$ \var(X) = {\mu}(1-{\mu}+{\lambda})\;. $$ Deduce that \({\lambda} < {\mu} < 1+{\lambda}\,\). Use a normal approximation to find the value of \(\P(X={\lambda})\) in the case where \({\lambda}=100\,\), giving your answer to 2 decimal places.


Solution: Let \(Y \sim Po(\lambda)\). \begin{align*} && 1 &= \sum_{k=1}^\infty \mathbb{P}(X = k ) \\ &&&= \sum_{k=1}^\infty A \frac{\lambda^k e^{-\lambda}}{k!}\\ &&&= Ae^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^k}{k!} \\ &&&= Ae^{-\lambda} \left (e^{\lambda}-1 \right) \\ \Rightarrow && A &= (1-e^{-\lambda})^{-1} \\ \\ && \E[X] &= \sum_{k=1}^{\infty} k \cdot \mathbb{P}(X=k) \\ &&&= A\sum_{k=1}^{\infty} k \frac{\lambda^k e^{-\lambda}}{k!} \\ &&&= A\E[Y] = A\lambda = \lambda(1-e^{-\lambda})^{-1} \\ \\ && \var[X] &= \E[X^2] - (\E[X])^2 \\ &&&= A\sum_{k=1}^{\infty} k^2 \frac{\lambda^k e^{-\lambda}}{k!} - \mu^2 \\ &&&= A\E[Y^2] - \mu^2 \\ &&&= A(\var[Y]+\lambda^2) - \mu^2 \\ &&&= A(\lambda + \lambda^2) - \mu^2 \\ &&&= A\lambda(1+\lambda) - \mu^2 \\ &&&= \mu(1+\lambda - \mu) \end{align*} Since \(A > 1\) we must have \(\mu = A\lambda > \lambda\), and since \(\var[X] = \mu(1 - \mu + \lambda) > 0\) with \(\mu > 0\), we must have \(1 + \lambda > \mu\), as required. If \(\lambda = 100\), then \(A \approx 1\), so \(\P(X=\lambda) \approx \P(Y = \lambda)\), and \(Y\) is approximately \(N(\lambda, \lambda)\); with a continuity correction the value is approximately \(\displaystyle \int_{-\frac12}^{\frac12} \frac{1}{\sqrt{2\pi \lambda}} e^{-\frac{x^2}{2\lambda}} \d x \approx \frac{1}{\sqrt{200\pi}} = \frac{1}{\sqrt{628.3\ldots}} \approx \frac{1}{25} = 0.04 \)
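The final approximation can be checked directly (Python, standard library; logs avoid the huge factorial):

```python
# Compare the exact Poisson(100) pmf at k = 100 with 1/sqrt(200*pi).
from math import exp, lgamma, log, pi, sqrt

lam = 100
exact = exp(-lam + lam * log(lam) - lgamma(lam + 1))  # e^-lam * lam^lam / lam!
approx = 1 / sqrt(2 * pi * lam)                       # continuity-corrected normal
print(round(exact, 4), round(approx, 4))              # 0.0399 0.0399 -> 0.04 (2 d.p.)
```

(The factor \(A = (1-e^{-100})^{-1}\) differs from 1 by about \(e^{-100}\), far below this precision.)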

1999 Paper 2 Q13
D: 1600.0 B: 1484.0

A stick is broken at a point, chosen at random, along its length. Find the probability that the ratio, \(R\), of the length of the shorter piece to the length of the longer piece is less than \(r\). Find the probability density function for \(R\), and calculate the mean and variance of \(R\).


Solution: Take the stick to have length 1 and let \(X \sim U[0, \tfrac12]\) be the length of the shorter piece (so \(X\) has density 2 on \([0,\tfrac12]\)). Then \(R = \frac{X}{1-X}\), and \begin{align*} && \mathbb{P}(R \leq r) &= \mathbb{P}(\tfrac{X}{1-X} \leq r) \\ &&&= \mathbb{P}(X \leq r - rX) \\ &&&= \mathbb{P}((1+r)X \leq r) \\ &&&= \mathbb{P}(X \leq \tfrac{r}{1+r} ) \\ &&&= \begin{cases} 0 & r < 0 \\ \frac{2r}{1+r} & 0 \leq r \leq 1 \\ 1 & r > 1 \end{cases} \\ \\ && f_R(r) &= \begin{cases} \frac{2}{(1+r)^2} & 0 \leq r \leq 1 \\ 0 & \text{otherwise} \end{cases} \end{align*} Let \(Y \sim U[\tfrac12, 1]\) be the longer piece, then \(R = \frac{1-Y}{Y} = Y^{-1} - 1\) and \begin{align*} \E[R] &= \int_{\frac12}^1 (y^{-1}-1) \cdot 2 \d y \\ &= 2\left [\ln y - y \right]_{\frac12}^1 \\ &= 2\left(\ln 2 - \tfrac12\right) \\ &= 2\ln2 -1 \\ \\ \E[R^2] &= \int_{\frac12}^1 (y^{-1}-1)^2 \cdot 2 \d y\\ &= 2\left [-y^{-1} -2\ln y + y \right]_{\frac12}^1 \\ &= 2 \left ( \tfrac32 - 2\ln 2\right) \\ &= 3-4\ln 2 \\ \\ \var[R] &= 3 - 4 \ln 2 -(2\ln 2-1)^2 \\ &= 2 - 4(\ln 2)^2 \end{align*} A simulation check follows.
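A quick Monte Carlo confirmation of the mean and variance (Python, standard library; the sample size \(10^6\) is an arbitrary choice):

```python
# Break a unit stick at a uniform point and form R = shorter / longer.
import random
from math import log

random.seed(0)
N = 10**6
samples = [min(u, 1 - u) / max(u, 1 - u)
           for u in (random.random() for _ in range(N))]

mean = sum(samples) / N
var = sum((r - mean)**2 for r in samples) / N
print(round(mean, 3), round(2 * log(2) - 1, 3))    # ~0.386 vs 0.386
print(round(var, 3), round(2 - 4 * log(2)**2, 3))  # ~0.078 vs 0.078
```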

1994 Paper 1 Q14
D: 1500.0 B: 1532.7

Each of my \(n\) students has to hand in an essay to me. Let \(T_{i}\) be the time at which the \(i\)th essay is handed in and suppose that \(T_{1},T_{2},\ldots,T_{n}\) are independent, each with probability density function \(\lambda\mathrm{e}^{-\lambda t}\) (\(t\geqslant0\)). Let \(T\) be the time I receive the first essay to be handed in and let \(U\) be the time I receive the last one.

  1. Find the mean and variance of \(T_{i}.\)
  2. Show that \(\mathrm{P}(U\leqslant u)=(1-\mathrm{e}^{-\lambda u})^{n}\) for \(u\geqslant0,\) and hence find the probability density function of \(U\).
  3. Obtain \(\mathrm{P}(T>t),\) and hence find the probability density function of \(T\).
  4. Write down the mean and variance of \(T\).


Solution:

  1. \(T_i \sim \textrm{Exp}(\lambda)\) so \(\E[T_i] = \lambda^{-1}, \var[T_i] = \lambda^{-2}\)
  2. \(\,\) \begin{align*} && \mathbb{P}(U \leq u) &= \mathbb{P}(T_i \leq u\quad \forall i) \\ &&&= \prod \mathbb{P}(T_i \leq u) \\ &&&= \prod \int_0^u \lambda e^{-\lambda t} \d t \\ &&&= (1-e^{-\lambda u})^n \\ \\ \Rightarrow && f_U(u) &= n\lambda e^{-\lambda u}(1-e^{-\lambda u})^{n-1} \end{align*}
  3. \(\,\) \begin{align*} && \mathbb{P}(T > t) &= \mathbb{P}(T_i > t \quad \forall i) \\ &&&= \prod \mathbb{P}(T_i > t) \\ &&&= e^{-n\lambda t} \\ \Rightarrow && f_T(t) &= n\lambda e^{-n\lambda t} \end{align*}
  4. Therefore \(\E[T] = \frac{1}{n\lambda}, \var[T] = \frac{1}{(n\lambda)^2}\)
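The distributional claims are easy to confirm by simulation (Python, standard library; the parameter choices \(\lambda = 0.5\), \(n = 5\) are arbitrary):

```python
# T = min(T_1, ..., T_n) of i.i.d. Exp(lambda) times should be Exp(n*lambda).
import random

random.seed(0)
lam, n, N = 0.5, 5, 10**5
firsts = [min(random.expovariate(lam) for _ in range(n)) for _ in range(N)]

mean = sum(firsts) / N
var = sum((t - mean)**2 for t in firsts) / N
print(round(mean, 3))  # ~0.4  = 1/(n*lam)
print(round(var, 3))   # ~0.16 = 1/(n*lam)^2
```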

1991 Paper 1 Q15
D: 1516.0 B: 1484.0

A fair coin is thrown \(n\) times. On each throw, 1 point is scored for a head and 1 point is lost for a tail. Let \(S_{n}\) be the points total for the series of \(n\) throws, i.e. \(S_{n}=X_{1}+X_{2}+\cdots+X_{n},\) where \[ X_{j}=\begin{cases} 1 & \text{ if the }j \text{ th throw is a head}\\ -1 & \text{ if the }j\text{ th throw is a tail.} \end{cases} \]

  1. If \(n=10\,000,\) find an approximate value for the probability that \(S_{n}>100.\)
  2. Find an approximate value for the least \(n\) for which \(\mathrm{P}(S_{n}>0.01n)<0.01.\)
Suppose that instead no points are scored for the first throw, but that on each successive throw, 2 points are scored if both it and the first throw are heads, two points are deducted if both are tails, and no points are scored or lost if the throws differ. Let \(Y_{k}\) be the score on the \(k\)th throw, where \(2\leqslant k\leqslant n.\) Show that \(Y_{k}=X_{1}+X_{k}.\) Calculate the mean and variance of each \(Y_{k}\) and determine whether it is true that \[ \mathrm{P}(Y_{2}+Y_{3}+\cdots+Y_{n}>0.01(n-1))\rightarrow0\quad\mbox{ as }n\rightarrow\infty. \]


Solution: Notice that \(\mathbb{E}(X_i) = 0, \mathbb{E}(X_i^2) = 1\) and so \(\mathbb{E}(S_n) =0, \textrm{Var}(S_n) = n\).

  1. Then by the central limit theorem (or alternatively the normal approximation to the binomial), \begin{align*} && \mathbb{P}(S_n > 100) &\underbrace{\approx}_{\text{CLT}} \mathbb{P} \left (Z > \frac{100}{\sqrt{10\, 000}} \right) \\ &&&= \mathbb{P}(Z > 1) \\ &&&= 1-\Phi(1) \\ &&&\approx 15.9\% \end{align*}
  2. \begin{align*} &&\mathbb{P}(S_n > 0.01n) &\approx \mathbb{P} \left (Z > \frac{0.01n}{\sqrt{n}} \right) \\ &&&= \mathbb{P}(Z > 0.01 \sqrt{n}) \\ &&&= 1-\Phi(0.01\sqrt{n}) \end{align*} We need \(1-\Phi(0.01\sqrt{n}) < 0.01\), i.e. \(\Phi(0.01\sqrt{n}) > 0.99\). Since \(\Phi^{-1}(0.99) = 2.3263\ldots\), this requires \(0.01\sqrt{n} > 2.3263\ldots\), so \(n > (232.63\ldots)^2\) and the least \(n\) is approximately \(54\,100\).
\begin{array}{cc|cc} 1\text{st throw}& k\text{th throw} & X_1 + X_k & Y_k \\ \hline \text{head} & \text{head} & 1 + 1 & 2 \\ \text{head} & \text{tail} & 1 - 1 & 0 \\ \text{tail} & \text{head} & -1 + 1 & 0 \\ \text{tail} & \text{tail} & -1- 1 & -2 \\ \end{array} In every case \(Y_k = X_1 + X_k\), so the two random variables are equal. Since \(X_1\) and \(X_k\) are independent, \begin{align*} \mathbb{E}(Y_k) &= \mathbb{E}(X_1) + \mathbb{E}(X_k) = 0 \\ \textrm{Var}(Y_k) &= \textrm{Var}(X_1)+\textrm{Var}(X_k) = 2 \end{align*} The \(Y_k\) are, however, not independent of one another, because each contains the same \(X_1\): \[ \sum_{k=2}^n Y_k = (n-1)X_1 + \sum_{k=2}^n X_k \,. \] Condition on the first throw. If \(X_1 = 1\) (probability \(\tfrac12\)), the event becomes \(\sum_{k=2}^n X_k > -0.99(n-1)\), whose probability tends to 1, since the sum has mean 0 and standard deviation \(\sqrt{n-1}\), far smaller than \(0.99(n-1)\). If \(X_1 = -1\), the event becomes \(\sum_{k=2}^n X_k > 1.01(n-1)\), which is impossible (the sum is at most \(n-1\)). Hence \[ \mathbb{P} \left (\sum_{k=2}^n Y_k > 0.01(n-1) \right) \to \tfrac12 \cdot 1 + \tfrac12 \cdot 0 = \tfrac12 \neq 0 \,, \] so the statement is false. A simulation illustrating this follows.
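A short simulation (Python, standard library; the trial counts are arbitrary) showing the probability settling near \(\tfrac12\) rather than 0:

```python
# Estimate P(sum_{k=2}^n Y_k > 0.01 (n-1)) for growing n.
import random

random.seed(0)
for n in (100, 1000, 10000):
    trials, hits = 1000, 0
    for _ in range(trials):
        x1 = random.choice((1, -1))
        s = sum(random.choice((1, -1)) for _ in range(n - 1))
        if (n - 1) * x1 + s > 0.01 * (n - 1):
            hits += 1
    print(n, hits / trials)  # stays near 0.5 for every n
```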

1987 Paper 1 Q15
D: 1500.0 B: 1516.7

A point \(P\) is chosen at random (with uniform distribution) on the circle \(x^{2}+y^{2}=1\). The random variable \(X\) denotes the distance of \(P\) from \((1,0)\). Find the mean and variance of \(X\). Find also the probability that \(X\) is greater than its mean.


Solution: Parametrise \(P\) by the angle \(\theta\) at the origin, so \(P = (\cos \theta, \sin \theta)\) where \(\theta \sim U(0, 2\pi)\), and \(X = \sqrt{(\cos \theta - 1)^2 + \sin^2 \theta}\). \begin{align*} \mathbb{E}[X] &= \int_0^{2\pi} \sqrt{(\cos \theta - 1)^2 + \sin^2 \theta} \frac1{2\pi} \d \theta \\ &= \frac1{2\pi}\int_0^{2\pi} \sqrt{2 - 2\cos \theta} \d \theta \\ &= \frac{1}{2\pi}\int_0^{2\pi} \sqrt{4\sin^2 \frac{\theta}{2}} \d \theta \\ &= \frac{1}{\pi}\int_0^{2\pi} \left |\sin \frac{\theta}{2} \right| \d \theta \\ &= \frac{1}{\pi} \left [ -2\cos \frac{\theta}{2} \right]_0^{2\pi} \\ &= \frac1{\pi} \left( 2 + 2 \right) \\ &= \frac{4}{\pi} \end{align*} \begin{align*} \mathbb{E}(X^2) &= \frac1{2\pi}\int_0^{2\pi} (\cos \theta - 1)^2 + \sin^2 \theta \d \theta \\ &= \frac1{2\pi}\int_0^{2\pi} 2 - 2 \cos \theta \d \theta \\ &= \frac{4\pi}{2\pi} \\ &= 2 \\ \end{align*} \(\Rightarrow\) \(\mathrm{Var}(X) = \mathbb{E}(X^2) - \mathbb{E}(X)^2 = 2 - \frac{16}{\pi^2} = \frac{2\pi^2 - 16}{\pi^2}\).

TikZ diagram
The chord is longer than \(\frac{4}{\pi}\) exactly when \(2\left|\sin \frac{\theta}{2}\right| > \frac{4}{\pi}\); a chord of length exactly \(\frac4\pi\) subtends an angle \(2\sin^{-1} \frac{2}{\pi}\) at the origin, so the favourable values of \(\theta\) form an arc of length \(2\pi - 2 \times 2\sin^{-1} \frac{2}{\pi}\). Therefore \[ \mathbb{P}\left(X > \tfrac{4}{\pi}\right) = \frac{2\pi - 4\sin^{-1} \frac{2}{\pi}}{2 \pi} = 1 - \frac{2}{\pi} \sin^{-1} \frac{2}{\pi} \approx 0.561\,. \] A simulation check follows.
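A Monte Carlo check of all three answers (Python, standard library; the sample size \(10^6\) is arbitrary):

```python
# Random point on the unit circle; X = distance from (1, 0) = 2|sin(theta/2)|.
import random
from math import asin, pi, sin

random.seed(0)
N = 10**6
total = total_sq = above = 0
for _ in range(N):
    theta = random.uniform(0, 2 * pi)
    x = 2 * abs(sin(theta / 2))
    total += x
    total_sq += x * x
    above += x > 4 / pi

mean = total / N
print(round(mean, 3), round(4 / pi, 3))                            # ~1.273
print(round(total_sq / N - mean**2, 3), round(2 - 16 / pi**2, 3))  # ~0.379
print(round(above / N, 3), round(1 - (2 / pi) * asin(2 / pi), 3))  # ~0.561
```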