Year: 2012
Paper: 3
Question Number: 13
Course: LFM Stats And Pure
Section: Normal Distribution
The number of candidates attempting more than six questions was, as last year, about 25%, though most of these extra attempts achieved little credit.
Difficulty Rating: 1700.0
Difficulty Comparisons: 0
Banger Rating: 1484.0
Banger Comparisons: 1
\begin{questionparts}
\item The random variable $Z$ has a
Normal distribution with mean $0$ and variance $1$.
Show that the expectation of $Z$ given that $a < Z < b$ is
\[
\frac{\exp(- \frac12 a^2) - \exp(- \frac12 b^2) }
{\sqrt{2\pi\,} \,\big(\Phi(b) - \Phi(a)\big)},
\]
where $\Phi$ denotes the cumulative distribution function for $Z$.
\item The random variable $X$ has a Normal distribution with mean $\mu$ and variance $\sigma^2$.
Show that
\[
\E(X \,\vert\, X>0) = \mu + \sigma \E(Z \,\vert\,Z > -\mu/\sigma).
\]
Hence, or otherwise,
show that the expectation, $m$, of $\vert X\vert $ is given by
\[
m=
\mu \big(1 - 2 \Phi(- \mu / \sigma)\big)
+
\sigma \sqrt{2 / \pi}\; \exp(- \tfrac12 \mu^2 / \sigma^2)
\,.
\]
Obtain an expression for the variance
of $\vert X \vert$ in terms of $\mu $, $\sigma $ and $m$.
\end{questionparts}
\begin{questionparts}
\item $\,$ \begin{align*}
&& \mathbb{E}(Z| a < Z < b) &= \mathbb{E}\big(Z\,\mathbb{1}_{(a,b)}(Z)\big) \big/ \mathbb{E}\big(\mathbb{1}_{(a,b)}(Z)\big) \\
&&&= \int_a^b z \phi(z) \d z \Big / (\Phi(b) - \Phi(a)) \\
&&&= \frac{\int_a^b \frac{1}{\sqrt{2 \pi}}z e^{-\frac12 z^2} \d z}{\Phi(b) - \Phi(a)} \\
&&&= \frac{\frac1{\sqrt{2\pi}} \left [-e^{-\frac12 z^2} \right]_a^b}{\Phi(b) - \Phi(a)} \\
&&&= \frac{\frac1{\sqrt{2\pi}} \left (e^{-\frac12 a^2}-e^{-\frac12 b^2} \right)}{\Phi(b) - \Phi(a)}
\end{align*}
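A brief aside (a remark added here, not required by the question): letting $b \to \infty$ or $a \to -\infty$ in this result gives the two one-sided cases that part (ii) relies on. For any constant $c$,

```latex
\begin{align*}
\mathbb{E}(Z \mid Z > c) &= \frac{e^{-\frac12 c^2}}{\sqrt{2\pi}\,\big(1 - \Phi(c)\big)},
&
\mathbb{E}(Z \mid Z < c) &= \frac{-\,e^{-\frac12 c^2}}{\sqrt{2\pi}\,\Phi(c)}.
\end{align*}
```

Note that the second expectation is negative, as it must be when conditioning on the lower tail.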
\item $\,$ \begin{align*}
&& \mathbb{E}(X |X > 0) &= \mathbb{E}(\mu + \sigma Z | \mu + \sigma Z > 0) \\
&&&= \mathbb{E}(\mu + \sigma Z | Z > -\tfrac{\mu}{\sigma}) \\
&&&= \mathbb{E}(\mu| Z > -\tfrac{\mu}{\sigma})+ \sigma \mathbb{E}(Z | Z > -\tfrac{\mu}{\sigma})\\
&&&= \mu+ \sigma \mathbb{E}(Z | Z > -\tfrac{\mu}{\sigma})
\end{align*}
Hence, applying part (i) with $(a,b) = (-\mu/\sigma, \infty)$ and $(a,b) = (-\infty, -\mu/\sigma)$ respectively, and noting that $\mathbb{E}(Z \,\vert\, Z < -\mu/\sigma)$ is negative so the subtracted term becomes an addition, \begin{align*}
&&\mathbb{E}(|X|) &= \mathbb{E}(X | X > 0)\mathbb{P}(X > 0) - \mathbb{E}(X | X < 0)\mathbb{P}(X < 0) \\
&&&=\left ( \mu+ \sigma \mathbb{E}(Z | Z > -\mu /\sigma)\right)(1-\Phi(-\mu/\sigma)) - \left ( \mu+ \sigma \mathbb{E}(Z | Z < -\mu /\sigma)\right)\Phi(-\mu/\sigma) \\
&&&= \mu(1 - 2\Phi(-\mu/\sigma)) + \sigma \frac{e^{-\frac12\mu^2/\sigma^2}}{\sqrt{2\pi}(1-\Phi(-\mu/\sigma))}(1-\Phi(-\mu/\sigma)) + \sigma \frac{e^{-\frac12\mu^2/\sigma^2}}{\sqrt{2 \pi} \Phi(-\mu/\sigma)} \Phi(-\mu/\sigma) \\
&&&= \mu(1 - 2\Phi(-\mu/\sigma)) + \sigma \sqrt{\frac{2}{\pi}} \exp(-\tfrac12 \mu^2/\sigma^2)
\end{align*}
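As a numerical sanity check of the boxed formula for $m$ (an illustrative addition, not part of the original solution; the parameter values below are arbitrary), the closed form can be compared against a Monte Carlo estimate of $\mathbb{E}\vert X \vert$:

```python
# Sanity check (not part of the original solution): compare the closed form
#   m = mu*(1 - 2*Phi(-mu/sigma)) + sigma*sqrt(2/pi)*exp(-mu^2 / (2 sigma^2))
# with a Monte Carlo estimate of E|X| for X ~ N(mu, sigma^2).
import math
import random

def Phi(x):
    # Standard Normal CDF, written via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def m_closed_form(mu, sigma):
    return (mu * (1.0 - 2.0 * Phi(-mu / sigma))
            + sigma * math.sqrt(2.0 / math.pi) * math.exp(-0.5 * (mu / sigma) ** 2))

def m_monte_carlo(mu, sigma, n=200_000, seed=0):
    rng = random.Random(seed)
    return sum(abs(rng.gauss(mu, sigma)) for _ in range(n)) / n

mu, sigma = 1.3, 0.7  # arbitrary test values
print(m_closed_form(mu, sigma), m_monte_carlo(mu, sigma))  # should agree to ~2 d.p.
```

When $\mu = 0$ the formula collapses to $m = \sigma\sqrt{2/\pi}$, the well-known mean of a half-Normal, which is a useful quick check.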
Finally, since $|X|^2 = X^2$ and $\mathbb{E}(X^2) = \textrm{Var}(X) + [\mathbb{E}(X)]^2 = \sigma^2 + \mu^2$,
\begin{align*}
&& \textrm{Var}(|X|) &= \mathbb{E}(|X|^2) - [\mathbb{E}(|X|)]^2 \\
&&&= \mathbb{E}(X^2) - m^2 \\
&&&= \mu^2 + \sigma^2 - m^2
\end{align*}
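The variance expression can be checked the same way (again an illustrative addition with arbitrary parameters, not part of the original script): because $|X|^2 = X^2$, the sample variance of $|X|$ should match $\mu^2 + \sigma^2 - m^2$ with $m$ estimated from the same sample.

```python
# Monte Carlo check of Var(|X|) = mu^2 + sigma^2 - m^2, where m = E|X|.
import random

def var_abs_closed_form(mu, sigma, m):
    return mu**2 + sigma**2 - m**2

mu, sigma = 0.5, 1.2  # arbitrary test values
rng = random.Random(1)
xs = [abs(rng.gauss(mu, sigma)) for _ in range(200_000)]
m_hat = sum(xs) / len(xs)
var_hat = sum((x - m_hat) ** 2 for x in xs) / len(xs)
print(var_hat, var_abs_closed_form(mu, sigma, m_hat))  # close agreement expected
```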
\end{questionparts}
Even fewer candidates attempted this question than question 9, and it was the second least successfully attempted question. Part (i) was generally attempted reasonably, although a number of attempts were very unconvincing because candidates failed to approach it as a conditional probability. Hardly any got properly to grips with the second part, though some cashed in on the final variance result.