Problems


4 problems found

2008 Paper 3 Q12

Let \(X\) be a random variable with a Laplace distribution, so that its probability density function is given by \[ \f(x) = \frac12 \e^{-\vert x \vert }\;, \text{ \(-\infty < x < \infty \)}. \tag{\(*\)} \] Sketch \(\f(x)\). Show that its moment generating function \({\rm M}_X(\theta)\) is given by \({\rm M}_X(\theta)= (1-\theta^2)^{-1}\) and hence find the variance of \(X\).

A frog is jumping up and down, attempting to land on the same spot each time. In fact, in each of \(n\) successive jumps he always lands on a fixed straight line but when he lands from the \(i\)th jump (\(i=1\,,2\,,\ldots\,,n\)) his displacement from the point from which he jumped is \(X_i\,\)cm, where \(X_i\) has the distribution \((*)\). His displacement from his starting point after \(n\) jumps is \(Y\,\)cm (so that \(Y=\sum\limits_{i=1}^n X_i\)). Each jump is independent of the others.

Obtain the moment generating function for \(Y/ \sqrt {2n}\) and, by considering its logarithm, show that this moment generating function tends to \(\exp(\frac12\theta^2)\) as \(n\to\infty\). Given that \(\exp(\frac12\theta^2)\) is the moment generating function of the standard Normal random variable, estimate the least number of jumps such that there is a \(5\%\) chance that the frog lands 25 cm or more from his starting point.


Solution:

[Sketch of \(f(x) = \frac12 e^{-\vert x \vert}\): symmetric about the \(y\)-axis, with maximum value \(\frac12\) at \(x = 0\) and exponential decay in both tails.]
\begin{align*} && M_X(\theta) &= \E \left [ e^{\theta X} \right] \\ &&&= \int_{-\infty}^{\infty} e^{\theta x} f(x) \d x \\ &&&= \int_{-\infty}^0 e^{\theta x}\frac12 e^{x} \d x+ \int_0^{\infty} e^{\theta x} \frac12 e^{-x} \d x \\ &&&= \frac12 \left [ \frac{1}{1+\theta}e^{(1+\theta)x} \right]_{-\infty}^0 +\frac12 \left [ \frac{1}{\theta-1}e^{(\theta-1)x} \right]_{0}^{\infty} \tag{for \(\vert \theta \vert < 1\)} \\ &&&= \frac12 \left ( \frac{1}{1+ \theta} + \frac{1}{1-\theta} \right) \\ &&&= \frac{1}{1-\theta^2} = (1-\theta^2)^{-1} \\ &&&= 1 + \theta^2 + \theta^4 + \cdots \\ \\ \Rightarrow && \E[X] &= 0 \\ && \E[X^2] &= 2! \cdot 1 = 2 \tag{coefficient of \(\theta^2\) is \(\E[X^2]/2!\)} \\ \Rightarrow && \var[X] &= 2 \end{align*} \begin{align*} && M_{Y/\sqrt{2n}}(\theta) &= \E \left [ \exp \left ( \theta \frac{Y}{\sqrt{2n}} \right) \right] \\ &&&= \E \left [ \exp \left ( \frac{\theta }{\sqrt{2n}} \sum_{i=1}^n X_i \right) \right] \\ &&&= \E \left [ \prod_{i=1}^n \exp \left ( \frac{\theta }{\sqrt{2n}} X_i \right) \right] \\ &&&= \prod_{i=1}^n \E \left [\exp \left ( \frac{\theta }{\sqrt{2n}} X_i \right) \right] \tag{independence}\\ &&&= \prod_{i=1}^n M_{X_i} \left ( \frac{\theta }{\sqrt{2n}} \right)\\ &&&= \prod_{i=1}^n M_{X} \left ( \frac{\theta }{\sqrt{2n}} \right)\\ &&&= M_{X} \left ( \frac{\theta }{\sqrt{2n}} \right)^n\\ &&&= \left (1 - \frac{\theta^2}{2n} \right)^{-n} \end{align*} Taking logarithms, \[ \ln M_{Y/\sqrt{2n}}(\theta) = -n \ln \left (1 - \frac{\theta^2}{2n} \right) = n \left ( \frac{\theta^2}{2n} + O(n^{-2}) \right) \to \tfrac12 \theta^2 \quad \text{as } n \to \infty, \] so \(M_{Y/\sqrt{2n}}(\theta) \to \exp(\tfrac12 \theta^2)\), as required. Given that \(M_{Y/\sqrt{2n}} \to M_Z\), we assume that \(Y/\sqrt{2n}\) is approximately distributed as the standard Normal \(Z\) for large \(n\). \begin{align*} && 5\% &\approx \mathbb{P}(|Z| > 2) \\ &&&\approx \mathbb{P} \left (|Y| > 2\sqrt{2n} \right) \end{align*} So we wish to choose the least \(n\) such that \(2\sqrt{2n} \geqslant 25\), i.e. \(n \geqslant \frac{625}8 \approx 78.1\), so take \(n = 79\).
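As a numerical sanity check (my addition, not part of the exam solution), a short Monte Carlo simulation of the frog's jumps with standard Laplace displacements, verifying \(\var[X] = 2\) and estimating \(\mathbb{P}(|Y| \geqslant 25)\) at the answer \(n = 79\):

```python
import numpy as np

# Each jump displacement X_i ~ Laplace(0, 1), with density (1/2) e^{-|x|},
# so Var[X_i] = 2 and Y = X_1 + ... + X_n has variance 2n.
rng = np.random.default_rng(0)
n_jumps = 79          # the answer derived above
trials = 200_000

jumps = rng.laplace(loc=0.0, scale=1.0, size=(trials, n_jumps))
Y = jumps.sum(axis=1)             # displacement after n jumps, in cm

var_x = jumps.var()               # should be close to 2
p_est = np.mean(np.abs(Y) >= 25)  # estimate of P(|Y| >= 25 cm)
print(f"Var[X] ~ {var_x:.3f}, P(|Y| >= 25) ~ {p_est:.4f}")
```

The normal approximation gives \(\mathbb{P}(|Y| \geqslant 25) \approx \mathbb{P}(|Z| \geqslant 25/\sqrt{158}) \approx 4.7\%\) at \(n = 79\), so the simulated probability should come out a little under \(5\%\); this reflects the solution's use of the rounded critical value \(2\) in place of \(1.96\).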

1999 Paper 3 Q12

In the game of endless cricket the scores \(X\) and \(Y\) of the two sides are such that \[ \P (X=j,\ Y=k)=\e^{-1}\frac{(j+k)\lambda^{j+k}}{j!k!},\] for some positive constant \(\lambda\), where \(j,k = 0\), \(1\), \(2\), \(\ldots\).

  1. Find \(\P(X+Y=n)\) for each \(n>0\).
  2. Show that \(2\lambda \e^{2\lambda-1}=1\).
  3. Show that \(2x \e^{2x-1}\) is an increasing function of \(x\) for \(x>0\), deduce that the equation in part 2 has at most one solution, and hence determine \(\lambda\).
  4. Calculate the expectation \(\E(2^{X+Y})\).


Solution:

  1. \begin{align*} && \mathbb{P}(X+Y = n) &= \sum_{i = 0}^n \mathbb{P}(X = i, Y = n-i) \\ &&&= \sum_{i = 0}^n e^{-1} \frac{n \lambda^n}{i! (n-i)!} \\ &&&=e^{-1} n \lambda^n \sum_{i = 0}^n\frac{1}{i! (n-i)!} \\ &&&=\frac{e^{-1} n}{n!} \lambda^n \sum_{i = 0}^n\frac{n!}{i! (n-i)!} \\ &&&= \frac{n\lambda^n}{e n!} 2^n \\ &&&= \frac{n (2 \lambda)^n}{e \cdot n!} \end{align*}
  2. \begin{align*} && 1 &= \sum_{n = 0}^{\infty} \mathbb{P}(X+Y =n ) \\ &&&= \sum_{n = 0}^{\infty}\frac{n (2 \lambda)^n}{e \cdot n!} \\ &&&= \sum_{n = 1}^\infty \frac{ (2 \lambda)^n}{e \cdot (n-1)!} \\ &&&= \frac{2 \lambda}{e}\sum_{n = 0}^\infty \frac{ (2 \lambda)^n}{n!} \\ &&&= \frac{2 \lambda}{e} e^{2\lambda} \\ &&&= 2 \lambda e^{2\lambda - 1} \end{align*}
  3. Consider \(f(x) = 2xe^{2x-1}\), then \begin{align*} && f'(x) &= 2e^{2x-1} + 2xe^{2x-1} \cdot 2 \\ &&&= e^{2x-1} (2 + 4x) > 0 \end{align*} for \(x > 0\). Therefore \(f\) is an increasing function of \(x\), so the equation \(f(x) = 1\) has at most one positive solution. Since \(f(\tfrac12) = 2 \cdot \tfrac12 \cdot e^{0} = 1\), that solution is \(\lambda = \tfrac12\).
  4. \begin{align*} \mathbb{E}(2^{X+Y}) &= \sum_{n = 0}^\infty \mathbb{P}(X+Y = n) 2^n \\ &= \sum_{n = 1}^\infty \frac{n (2\lambda)^n}{e \cdot n!} 2^n \\ &= \sum_{n = 1}^\infty \frac{1}{e(n-1)!} 2^{n} \tag{since \(2\lambda = 1\)} \\ &= \frac{2}{e} \sum_{n=0}^\infty \frac{2^n}{n!} \\ &= \frac{2}{e} e^2 \\ &= 2e \end{align*}
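The closed forms above are easy to check numerically. This sketch (my addition, not part of the solution) truncates the double sum at \(j, k < 60\), well past where the terms become negligible, and verifies that the joint probabilities sum to \(1\) and that \(\E(2^{X+Y}) = 2e\) when \(\lambda = \tfrac12\):

```python
import math

lam = 0.5  # the value of lambda found in part 3

def p(j, k):
    # P(X = j, Y = k) = e^{-1} (j + k) lambda^{j+k} / (j! k!)
    return math.exp(-1) * (j + k) * lam ** (j + k) / (math.factorial(j) * math.factorial(k))

N = 60  # truncation point; the tail beyond this is negligible
total = sum(p(j, k) for j in range(N) for k in range(N))
expect = sum(p(j, k) * 2 ** (j + k) for j in range(N) for k in range(N))

print(f"sum of probabilities ~ {total:.12f}")
print(f"E[2^(X+Y)] ~ {expect:.12f}  (2e = {2 * math.e:.12f})")
```

Note that with \(\lambda = \tfrac12\) each term of the expectation is \(e^{-1}(j+k)/(j!\,k!)\), so the sum converges very rapidly and the truncation error is far below the tolerance checked.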

1997 Paper 3 Q13

Let \(X\) and \(Y\) be independent standard normal random variables: the probability density function, \(\f\), of each is therefore given by \[ \f(x)=\left(2\pi\right)^{-\frac{1}{2}}\e^{-\frac{1}{2}x^{2}}. \]

  1. Find the moment generating function \(\mathrm{E}(\e^{\theta X})\) of \(X\).
  2. Find the moment generating function of \(aX+bY\) and hence obtain the condition on \(a\) and \(b\) which ensures that \(aX+bY\) has the same distribution as \(X\) and \(Y\).
  3. Let \(Z=\e^{\mu+\sigma X}\). Show that \[ \mathrm{E}(Z^{\theta})=\e^{\mu\theta+\frac{1}{2}\sigma^{2}\theta^{2}}, \] and hence find the expectation and variance of \(Z\).


Solution:

  1. \(\,\) \begin{align*} && \E[e^{\theta X}] &= \int_{-\infty}^{\infty} e^{\theta x} \frac{1}{\sqrt{2\pi}} e^{-\frac12 x^2 } \d x\\ &&&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac12 x^2+\theta x} \d x\\ &&&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac12 (x^2-2\theta x)} \d x\\ &&&= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac12 (x-\theta )^2+\frac12\theta^2 } \d x\\ &&&= e^{\frac12\theta^2 }\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac12 (x-\theta )^2 } \d x\\ &&&=e^{\frac12\theta^2 } \end{align*}
  2. \begin{align*} && M_{aX+bY} (\theta) &= \mathbb{E}[e^{\theta (aX+bY)}] \\ &&&= \mathbb{E}[e^{a\theta X}] \cdot \mathbb{E}[e^{b\theta Y}] \tag{independence} \\ &&&= e^{\frac12(a\theta)^2} \cdot e^{\frac12(b\theta)^2} \\ &&&= e^{\frac12(a^2+b^2)\theta^2} \end{align*} This equals the moment generating function of a standard normal random variable precisely when \(a^2+b^2 = 1\), which is the required condition.
  3. \(\,\) \begin{align*} && \E[Z^\theta] &= \E[e^{\mu \theta + \sigma \theta X}] \\ &&&= e^{\mu \theta}e^{\frac12 \sigma^2 \theta^2} \\ &&&=e^{\mu \theta + \frac12 \sigma^2 \theta^2} \\ \end{align*} \begin{align*} \mathbb{E}(Z) &= \mathbb{E}[Z^1] \\ &= e^{\mu + \frac12 \sigma^2} \\ \var[Z] &= \E[Z^2] - \left ( \E[Z] \right)^2 \\ &= e^{2 \mu+ 2\sigma^2} - e^{2\mu + \sigma^2} \\ &= e^{2\mu+\sigma^2} \left (e^{\sigma^2}-1 \right) \end{align*} [NB: This is the lognormal distribution]
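A quick Monte Carlo check of both results in this question (my addition), with illustrative values \(a = 0.6\), \(b = 0.8\) so that \(a^2 + b^2 = 1\), and arbitrarily chosen \(\mu = 0.3\), \(\sigma = 0.5\) for the lognormal moments:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000
X = rng.standard_normal(N)
Y = rng.standard_normal(N)

# Part 2: with a^2 + b^2 = 1, aX + bY should again be standard normal.
a, b = 0.6, 0.8
W = a * X + b * Y
print(f"aX+bY: mean ~ {W.mean():.4f}, var ~ {W.var():.4f}")

# Part 3: Z = e^(mu + sigma X) is lognormal; compare sample moments with
# E[Z] = e^(mu + sigma^2/2) and Var[Z] = e^(2 mu + sigma^2)(e^(sigma^2) - 1).
mu, sigma = 0.3, 0.5
Z = np.exp(mu + sigma * X)
mean_exact = np.exp(mu + sigma**2 / 2)
var_exact = np.exp(2 * mu + sigma**2) * (np.exp(sigma**2) - 1)
print(f"E[Z]:   sample {Z.mean():.4f} vs exact {mean_exact:.4f}")
print(f"Var[Z]: sample {Z.var():.4f} vs exact {var_exact:.4f}")
```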

1995 Paper 3 Q14

A candidate finishes examination questions in time \(T\), where \(T\) has probability density function \[ \mathrm{f}(t)=t\mathrm{e}^{-t}\qquad t\geqslant0, \] the probabilities for the various questions being independent. Find the moment generating function of \(T\) and hence find the moment generating function for the total time \(U\) taken to finish two such questions. Show that the probability density function for \(U\) is \[ \mathrm{g}(u)=\frac{1}{6}u^{3}\mathrm{e}^{-u}\qquad u\geqslant0. \] Find the probability density function for the total time taken to answer \(n\) such questions.


Solution: \begin{align*} && M_T(x) &= \mathbb{E}[e^{xT}] \\ &&&= \int_0^{\infty} e^{xt}te^{-t} \d t \\ &&&= \int_0^{\infty}te^{(x-1)t} \d t \tag{for \(x<1\)} \\ &&&= \left [ \frac{t}{x-1} e^{(x-1)t} \right]_0^{\infty} - \int_0^\infty \frac{e^{(x-1)t}}{x-1} \d t \\ &&&= -\left [ \frac{e^{(x-1)t}}{(x-1)^2} \right]_0^{\infty} \\ &&&= \frac{1}{(x-1)^2} \\ \\ && M_U(x) &= M_{T_1+T_2}(x) \\ &&&= M_T(x)^2 \tag{independence} \\ &&&= \frac1{(x-1)^4} = \frac1{(1-x)^4} \\ \\ && I_n &= \int_0^{\infty} t^ne^{(x-1)t} \d t \\ &&&= \left[ \frac{1}{(x-1)}t^ne^{(x-1)t} \right]_0^{\infty} - \frac{n}{(x-1)} \int_0^{\infty}t^{n-1}e^{(x-1)t} \d t \\ &&&= -\frac{n}{(x-1)}I_{n-1} \\ \Rightarrow && I_n &= \frac{n!}{(1-x)^{n+1}} \tag{since \(I_0 = \tfrac{1}{1-x}\)} \\ \\ \Rightarrow && \int_0^{\infty} e^{xu} \cdot \frac16u^3e^{-u} \d u &= \int_0^{\infty} \frac16u^3e^{(x-1)u} \d u \\ &&&= \frac16 I_3 = \frac{1}{(1-x)^4} = M_U(x) \\ \Rightarrow && \mathrm{g}(u) &= \frac16u^3e^{-u} \tag{uniqueness of MGFs} \\ \\ && M_{T_1+\cdots+T_n}(x) &= \frac{1}{(x-1)^{2n}} = \frac{1}{(1-x)^{2n}} \\ \Rightarrow && f_{T_1+\cdots+T_n}(t) &= \frac1{(2n-1)!} t^{2n-1}e^{-t} \tag{since \(\tfrac1{(2n-1)!}I_{2n-1} = \tfrac1{(1-x)^{2n}}\)} \end{align*} (NB: This is the gamma distribution.)
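Since \(f(t) = te^{-t}\) is the Gamma(2, 1) density, the total time for \(n\) questions is Gamma(\(2n\), 1), matching the density just derived. A simulation sketch (my addition, assuming NumPy's gamma sampler) checking \(M_T(0.3) = (0.3-1)^{-2}\) and the moments of \(U = T_1 + T_2\), whose density \(g(u) = \frac16 u^3 e^{-u}\) gives mean 4 and variance 4:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1_000_000

# T has density t e^{-t}, i.e. T ~ Gamma(shape=2, scale=1).
T = rng.gamma(shape=2.0, scale=1.0, size=N)

# Check the MGF at x = 0.3 against M_T(x) = 1/(x-1)^2.
x = 0.3
mgf_sample = np.exp(x * T).mean()
mgf_exact = 1 / (x - 1) ** 2
print(f"M_T(0.3): sample {mgf_sample:.4f} vs exact {mgf_exact:.4f}")

# U = T_1 + T_2 should be Gamma(4, 1), density u^3 e^{-u} / 6,
# with mean 4 and variance 4.
U = T + rng.gamma(shape=2.0, scale=1.0, size=N)
print(f"U: mean ~ {U.mean():.4f}, var ~ {U.var():.4f}")
```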