Problems


2015 Paper 3 Q13

Each of the two independent random variables \(X\) and \(Y\) is uniformly distributed on the interval~\([0,1]\).

  1. By considering the lines \(x+y =\) \(\mathrm{constant}\) in the \(x\)-\(y\) plane, find the cumulative distribution function of \(X+Y\).
  2. Hence show that the probability density function \(f\) of \((X+Y)^{-1}\) is given by \[ f(t) = \begin{cases} 2t^{-2} -t^{-3} & \text{for \( \tfrac12 \le t \le 1\)} \\ t^{-3} & \text{for \(1\le t <\infty\)}\\ 0 & \text{otherwise}. \end{cases} \] Evaluate \(\E\Big(\dfrac1{X+Y}\Big)\,\).
  3. Find the cumulative distribution function of \(Y/X\) and use this result to find the probability density function of \(\dfrac X {X+Y}\). Write down \(\E\Big( \dfrac X {X+Y}\Big)\) and verify your result by integration.


Solution:

  1. \(\mathbb{P}(X + Y \leq c) \) is the area of the part of the unit square lying below the line \(x + y = c\). There are two non-trivial cases: \[\mathbb{P}(X + Y \leq c) = \begin{cases} 0 & \text{ if } c \leq 0 \\ \frac{c^2}{2} & \text{ if } 0 \leq c \leq 1 \\ 1- \frac{(2-c)^2}{2} & \text{ if } 1 \leq c \leq 2 \\ 1 & \text{ otherwise} \end{cases}\]
  2. \begin{align*} && \mathbb{P}((X + Y)^{-1} \leq t) &= 1- \mathbb{P}(X + Y \leq \frac1{t}) \\ \Rightarrow && f_{(X+Y)^{-1}}(t) &= 0 -\begin{cases} 0 & \text{ if } \frac1{t} \leq 0 \\ \frac{\d}{\d t}\frac{1}{2t^2} & \text{ if } \frac{1}{t} \leq 1 \\ \frac{\d}{\d t} \l 1- \frac{(2-\frac1t)^2}{2} \r & \text{ if } 1 \leq \frac{1}{t} \leq 2 \\ 0 & \text{ otherwise}\end{cases} \\ && &= \begin{cases} t^{-3} & \text{ if } t \geq 1 \\ (2-\frac1t)t^{-2} & \text{ if } \frac12 \leq t \leq 1\\ 0 & \text{ otherwise}\end{cases} \\ && &= \begin{cases} t^{-3} & \text{ if } t \geq 1 \\ 2t^{-2}-t^{-3} & \text{ if } \frac12 \leq t \leq 1\\ 0 & \text{ otherwise}\end{cases} \end{align*} Therefore, \begin{align*} \E \Big(\dfrac1{X+Y}\Big) &= \int_{\frac12}^{\infty} t f_{(X+Y)^{-1}}(t) \, \d t \\ &= \int_{\frac12}^{1} t f_{(X+Y)^{-1}}(t) \, \d t + \int_{1}^{\infty} t f_{(X+Y)^{-1}}(t) \d t\\ &= \int_{\frac12}^{1} \l 2t^{-1} - t^{-2} \r \, \d t + \int_{1}^{\infty} t^{-2} \d t\\ &= \left [ 2 \ln (t) + t^{-1} \right]_{\frac12}^{1} + \left [ -t^{-1} \right ]_{1}^{\infty} \\ &= 1 + 2 \ln 2 -2 + 1 \\ &= 2 \ln 2 \end{align*}
  3. \begin{align*} &&\mathbb{P} \l \frac{Y}{X} \leq c \r &= \mathbb{P}( Y \leq c X) \\ &&&= \begin{cases} 0 & \text{if } c \leq 0 \\ \frac{c}{2} & \text{if } 0 \leq c \leq 1 \\ 1-\frac{1}{2c} & \text{if } 1 \leq c \end{cases} \\ \\ \Rightarrow && \mathbb{P} \l \frac{X}{X+Y} \leq t\r &= \mathbb{P} \l \frac{1}{1+\frac{Y}{X}} \leq t\r \\ &&&= \mathbb{P} \l \frac{1}{t} \leq 1+\frac{Y}{X}\r \\ &&&= \mathbb{P} \l \frac{1}{t} - 1\leq \frac{Y}{X}\r \\ &&&= 1- \mathbb{P} \l \frac{Y}{X} \leq \frac{1}{t} - 1\r \\ &&&= 1 - \begin{cases} 0 & \text{if } \frac1{t} - 1 \leq 0 \\ \frac{1}{2t} - \frac{1}{2} & \text{if } 0 \leq \frac1{t} - 1 \leq 1 \\ 1-\frac{t}{2-2t} & \text{if } 1 \leq \frac1{t} - 1 \end{cases} \\ && f_{\frac{X}{X+Y}}(t) &= \begin{cases} \frac{1}{2(1-t)^2} & \text{if } 0 \leq t \leq \frac12 \\ \frac{1}{2t^2} & \text{if } \frac12 \leq t \leq 1 \\ 0 & \text{otherwise} \end{cases} \\ \Rightarrow && \mathbb{E} \l \frac{X}{X+Y} \r &= \int_0^1 t f(t) \d t \\ &&&= \int_0^{\frac12} \frac{t}{2(1-t)^2} \d t + \int_{\frac12}^1 \frac{1}{2t} \d t \\ &&&= \frac12 \left[ \frac{1}{1-t} + \ln(1-t) \right]_0^{\frac12} + \frac12 \left[ \ln t \right]_{\frac12}^1 \\ &&&= \frac{1 - \ln 2}{2} + \frac{\ln 2}{2} = \frac{1}{2} \\ \\ && \mathbb{E} \l \frac{X}{X+Y} \r &= \int_0^1 \int_0^1 \frac{x}{x+y} \d y\d x \\ &&&= \int_0^1 \l x \ln (x+1) - x \ln x \r \d x \\ &&&= \left [\frac{x^2}2 \ln(x+1) - \frac{x^2}{2} \ln(x) \right]_0^1 -\int_0^1 \l \frac{x^2}{2(x+1)} - \frac{x}{2} \r \d x \\ &&&= \frac{\ln 2}{2} + \frac{1}{4} - \int_0^1 \frac{x^2-1+1}{2(x+1)}\d x \\ &&&= \frac{\ln 2}{2} + \frac{1}{4} - \int_0^1 \frac{x -1}{2} + \frac{1}{2(x+1)}\d x \\ &&&= \frac{\ln 2}{2} + \frac{1}{4} - \frac{1}{4} + \frac{1}{2} - \frac{\ln 2}{2} \\ &&&= \frac{1}{2} \end{align*} We can also notice that \(1 = \mathbb{E} \l \frac{X+Y}{X+Y} \r = \mathbb{E} \l \frac{X}{X+Y} \r + \mathbb{E} \l \frac{Y}{X+Y} \r = 2 \mathbb{E} \l \frac{X}{X+Y} \r\) by the symmetry between \(X\) and \(Y\), so the answer \(\frac12\) follows immediately once we know the expectation exists.
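As a quick numerical sanity check (not part of the official solution), the results above can be verified in Python: the pdf of \((X+Y)^{-1}\) is integrated with a hand-rolled Simpson rule, and both expectations are cross-checked by Monte Carlo.

```python
import math
import random

def simpson(g, a, b, n=2000):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = g(a) + g(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * g(a + i * h)
    return s * h / 3

def pdf_inv_sum(t):
    """pdf of (X+Y)^{-1} derived in part (ii)."""
    if 0.5 <= t <= 1:
        return 2 * t**-2 - t**-3
    return t**-3 if t > 1 else 0.0

# Total mass: numeric integral on [1/2, 1] plus the analytic tail
# \int_1^\infty t^{-3} dt = 1/2; this should come out to 1.
total_mass = simpson(pdf_inv_sum, 0.5, 1.0) + 0.5

# E[1/(X+Y)]: numeric integral of t*f(t) on [1/2, 1] plus the
# analytic tail \int_1^\infty t^{-2} dt = 1; should equal 2 ln 2.
e_inv_sum = simpson(lambda t: t * pdf_inv_sum(t), 0.5, 1.0) + 1.0

# Monte Carlo cross-checks of both expectations from uniform samples.
random.seed(0)
N = 200_000
pairs = [(random.random(), random.random()) for _ in range(N)]
mc_e_inv_sum = sum(1 / (x + y) for x, y in pairs) / N
mc_mean_ratio = sum(x / (x + y) for x, y in pairs) / N
```

With these values `total_mass` and `e_inv_sum` agree with \(1\) and \(2\ln 2\) to quadrature accuracy, and the two Monte Carlo means should land within ordinary sampling error of \(2\ln 2 \approx 1.386\) and \(\tfrac12\).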

2014 Paper 3 Q12

The random variable \(X\) has probability density function \(f(x)\) (which you may assume is differentiable) and cumulative distribution function \(F(x)\) where \(-\infty < x < \infty \). The random variable \(Y\) is defined by \(Y= \e^X\). You may assume throughout this question that \(X\) and \(Y\) have unique modes.

  1. Find the median value \(y_m\) of \(Y\) in terms of the median value \(x_m\) of \(X\).
  2. Show that the probability density function of \(Y\) is \(f(\ln y)/y\), and deduce that the mode \(\lambda\) of \(Y\) satisfies \(f'(\ln \lambda) = f(\ln \lambda)\).
  3. Suppose now that \(X \sim {\rm N} (\mu,\sigma^2)\), so that \[ f(x) = \frac{1}{\sigma \sqrt{2\pi}\,} \e^{-(x-\mu)^2/(2\sigma^2)} \,. \] Explain why \[\frac{1}{\sigma \sqrt{2\pi}\,} \int_{-\infty}^{\infty}\e^{-(x-\mu-\sigma^2)^2/(2\sigma^2)} \d x = 1 \] and hence show that \( \E(Y) = \e ^{\mu+\frac12\sigma^2}\).
  4. Show that, when \(X \sim {\rm N} (\mu,\sigma^2)\), \[ \lambda < y_m < \E(Y)\,. \]


Solution:

  1. \begin{align*} && \frac12 &= \mathbb{P}(X \leq x_m) \\ \Leftrightarrow && \frac12 &= \mathbb{P}(e^X \leq e^{x_m}) \end{align*} since \(x \mapsto e^x\) is strictly increasing. Therefore the median of \(Y\) is \(y_m = e^{x_m}\).
  2. \begin{align*} && \mathbb{P}(Y \leq y) &= \mathbb{P}(e^X \leq y) \\ &&&= \mathbb{P}(X \leq \ln y) \\ &&&= F(\ln y) \\ \Rightarrow && f_Y(y) &= f(\ln y)/y \\ \\ && f'_Y(y) &= \frac{f'(\ln y) - f(\ln y)}{y^2} \end{align*} Therefore, since the mode \(\lambda\) satisfies \(f'_Y(\lambda) = 0\), we must have \(f'(\ln \lambda ) = f(\ln \lambda)\).
  3. The integrand is the probability density function of \(N(\mu + \sigma^2, \sigma^2)\), so the integral is \(1\). \begin{align*} && \E[Y] &= \int_{-\infty}^{\infty} e^x \cdot \frac{1}{\sqrt{2\pi \sigma^2}} e^{-(x-\mu)^2/(2\sigma^2)} \d x \\ &&&= \frac{1}{\sqrt{2\pi \sigma^2}} \int_{-\infty}^{\infty} \exp \l x - (x-\mu)^2/(2\sigma^2) \r \d x\\ &&&= \frac{1}{\sqrt{2\pi \sigma^2}} \int_{-\infty}^{\infty} \exp \l (2x \sigma^2- (x-\mu)^2)/(2\sigma^2) \r \d x\\ &&&= \frac{1}{\sqrt{2\pi \sigma^2}} \int_{-\infty}^{\infty} \exp \l (-(x-\mu-\sigma^2)^2+2\mu \sigma^2+\sigma^4)/(2\sigma^2) \r \d x\\ &&&= \frac{1}{\sqrt{2\pi \sigma^2}} \int_{-\infty}^{\infty} \exp \l -(x-\mu-\sigma^2)^2/(2\sigma^2)+\mu +\tfrac12\sigma^2 \r \d x\\ &&&= \e^{\mu +\frac12\sigma^2} \cdot \frac{1}{\sqrt{2\pi \sigma^2}} \int_{-\infty}^{\infty} \exp \l -(x-\mu-\sigma^2)^2/(2\sigma^2) \r \d x\\ &&&= \e^{\mu +\frac12\sigma^2} \end{align*}
  4. The median of \(X\) is \(\mu\), so \(y_m = e^\mu < e^{\mu + \tfrac12 \sigma^2} = \E[Y]\); it therefore suffices to prove that \(\lambda < e^{\mu}\). Since \(f'(x) = -\frac{x-\mu}{\sigma^2} f(x)\), the mode condition \(f'(\ln \lambda) = f(\ln \lambda)\) from part (ii) gives \(-\frac{\ln \lambda - \mu}{\sigma^2} = 1\), i.e. \(\ln \lambda = \mu - \sigma^2\). Hence \(\lambda = e^{\mu - \sigma^2} < e^{\mu}\) since \(\sigma^2 > 0\), which completes the chain \(\lambda < y_m < \E(Y)\).
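The mode–median–mean ordering can also be checked numerically. The sketch below (not part of the official solution) uses illustrative parameters \(\mu = 0.5\), \(\sigma = 0.7\), which are arbitrary choices: it evaluates \(\E(Y)\) by Simpson quadrature and locates the mode of \(f_Y(y) = f(\ln y)/y\) on a fine grid, comparing both against the closed forms \(e^{\mu+\sigma^2/2}\) and \(e^{\mu-\sigma^2}\).

```python
import math

mu, sigma = 0.5, 0.7  # illustrative values, not from the question

def f(x):
    """pdf of N(mu, sigma^2)."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

def f_Y(y):
    """pdf of Y = e^X from part (ii): f(ln y)/y."""
    return f(math.log(y)) / y

def simpson(g, a, b, n=4000):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = g(a) + g(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * g(a + i * h)
    return s * h / 3

# E[Y] = E[e^X] by quadrature; the completed square shows the integrand
# is exp(mu + sigma^2/2) times the N(mu + sigma^2, sigma^2) pdf, so we
# integrate over 10 standard deviations around the shifted centre.
lo, hi = mu + sigma**2 - 10 * sigma, mu + sigma**2 + 10 * sigma
e_y = simpson(lambda x: math.exp(x) * f(x), lo, hi)

# Locate the mode of Y on a fine grid over (0, 6).
grid = [i / 10000 for i in range(1, 60000)]
numeric_mode = max(grid, key=f_Y)

mode = math.exp(mu - sigma**2)        # claimed lambda
median = math.exp(mu)                 # y_m from part (i)
mean = math.exp(mu + sigma**2 / 2)    # E(Y) from part (iii)
```

For these parameters the grid mode matches \(e^{\mu-\sigma^2}\) to grid resolution, the quadrature value of \(\E(Y)\) matches \(e^{\mu+\sigma^2/2}\), and the strict inequalities `mode < median < mean` hold, as part (iv) asserts.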