Problems

7 problems found

2012 Paper 3 Q13
D: 1700.0 B: 1484.0

  1. The random variable \(Z\) has a Normal distribution with mean \(0\) and variance \(1\). Show that the expectation of \(Z\) given that \(a < Z < b\) is \[ \frac{\exp(- \frac12 a^2) - \exp(- \frac12 b^2) } {\sqrt{2\pi\,} \,\big(\Phi(b) - \Phi(a)\big)}, \] where \(\Phi\) denotes the cumulative distribution function for \(Z\).
  2. The random variable \(X\) has a Normal distribution with mean \(\mu\) and variance \(\sigma^2\). Show that \[ \E(X \,\vert\, X>0) = \mu + \sigma \E(Z \,\vert\,Z > -\mu/\sigma). \] Hence, or otherwise, show that the expectation, \(m\), of \(\vert X\vert \) is given by \[ m= \mu \big(1 - 2 \Phi(- \mu / \sigma)\big) + \sigma \sqrt{2 / \pi}\; \exp(- \tfrac12 \mu^2 / \sigma^2) \,. \] Obtain an expression for the variance of \(\vert X \vert\) in terms of \(\mu \), \(\sigma \) and \(m\).


Solution:

  1. \(\,\) \begin{align*} && \mathbb{E}(Z| a < Z < b) &= \mathbb{E}(Z\mathbb{1}_{(a,b)}) /\mathbb{E}(\mathbb{1}_{(a,b)}) \\ &&&= \int_a^b z \phi(z) \d z \Big / (\Phi(b) - \Phi(a)) \\ &&&= \frac{\int_a^b \frac{1}{\sqrt{2 \pi}}z e^{-\frac12 z^2} \d z}{\Phi(b) - \Phi(a)} \\ &&&= \frac{\frac1{\sqrt{2\pi}} \left [-e^{-\frac12 z^2} \right]_a^b}{\Phi(b) - \Phi(a)} \\ &&&= \frac{\frac1{\sqrt{2\pi}} \left (e^{-\frac12 a^2}-e^{-\frac12 b^2} \right)}{\Phi(b) - \Phi(a)} \\ \end{align*}
  2. \(\,\) \begin{align*} && \mathbb{E}(X |X > 0) &= \mathbb{E}(\mu + \sigma Z | \mu + \sigma Z > 0) \\ &&&= \mathbb{E}(\mu + \sigma Z | Z > -\tfrac{\mu}{\sigma}) \\ &&&= \mathbb{E}(\mu| Z > -\tfrac{\mu}{\sigma})+ \sigma \mathbb{E}(Z | Z > -\tfrac{\mu}{\sigma})\\ &&&= \mu+ \sigma \mathbb{E}(Z | Z > -\tfrac{\mu}{\sigma})\\ \end{align*} Hence, using the result of part 1 with \(a = -\mu/\sigma,\ b \to \infty\) (and its analogue with \(a \to -\infty,\ b = -\mu/\sigma\)), \begin{align*} &&\mathbb{E}(|X|) &= \mathbb{E}(X | X > 0)\mathbb{P}(X > 0) - \mathbb{E}(X | X < 0)\mathbb{P}(X < 0) \\ &&&=\left ( \mu+ \sigma \mathbb{E}(Z | Z > -\mu /\sigma)\right)(1-\Phi(-\mu/\sigma)) - \left ( \mu+ \sigma \mathbb{E}(Z | Z < -\mu /\sigma)\right)\Phi(-\mu/\sigma) \\ &&&= \mu(1 - 2\Phi(-\mu/\sigma)) + \sigma \frac{e^{-\frac12\mu^2/\sigma^2}}{\sqrt{2\pi}(1-\Phi(-\mu/\sigma))}(1-\Phi(-\mu/\sigma)) + \sigma \frac{e^{-\frac12\mu^2/\sigma^2}}{\sqrt{2 \pi} \Phi(-\mu/\sigma)} \Phi(-\mu/\sigma) \\ &&&= \mu(1 - 2\Phi(-\mu/\sigma)) + \sigma \sqrt{\frac{2}{\pi}} \exp(-\tfrac12 \mu^2/\sigma^2) \end{align*} Finally, since \(\vert X \vert^2 = X^2\), \begin{align*} && \textrm{Var}(|X|) &= \mathbb{E}(X^2) - [\mathbb{E}(|X|)]^2 \\ &&&= \mu^2 + \sigma^2 - m^2 \end{align*}
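As a quick sanity check (not part of the original solution), the closed form for \(m\) and the variance can be verified by simulation. A minimal Monte Carlo sketch in Python; the values of \(\mu\) and \(\sigma\) are arbitrary test choices:

```python
# Monte Carlo sanity check of the closed forms for E|X| and Var|X|.
# mu and sigma are arbitrary test values, not taken from the question.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu, sigma = 1.3, 0.7
x = rng.normal(mu, sigma, size=10**7)

m = mu * (1 - 2 * norm.cdf(-mu / sigma)) \
    + sigma * np.sqrt(2 / np.pi) * np.exp(-0.5 * mu**2 / sigma**2)

print(np.abs(x).mean(), m)                       # should agree to ~3 d.p.
print(np.abs(x).var(), mu**2 + sigma**2 - m**2)  # Var|X| = mu^2 + sigma^2 - m^2
```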

2009 Paper 3 Q12
D: 1700.0 B: 1516.0

  1. Albert tosses a fair coin \(k\) times, where \(k\) is a given positive integer. The number of heads he gets is \(X_1\). He then tosses the coin \(X_1\) times, getting \(X_2\) heads. He then tosses the coin \(X_2\) times, getting \(X_3\) heads. The random variables \(X_4\), \(X_5\), \(\ldots\) are defined similarly. Write down \(\E(X_1)\). By considering \(\E(X_2 \; \big\vert \; X_1 = x_1)\), or otherwise, show that \(\E(X_2) = \frac14 k\). Find \(\displaystyle \sum_{i=1}^\infty \E(X_i)\).
  2. Bertha has \(k\) fair coins. She tosses the first coin until she gets a tail. The number of heads she gets before the first tail is \(Y_1\). She then tosses the second coin until she gets a tail and the number of heads she gets with this coin before the first tail is \(Y_2\). The random variables \(Y_3, Y_4, \ldots\;\), \(Y_k\) are defined similarly, and \(Y= \sum\limits_{i=1}^k Y_i\,\). Obtain the probability generating function of \(Y\), and use it to find \(\E(Y)\), \(\var(Y)\) and \(\P(Y=r)\).


Solution:

  1. \(X_1 \sim B(k, \tfrac12)\), so \(\E[X_1] = \frac{k}{2}\). Note that \(X_2 \,\vert\, X_1 = x_1 \sim B(x_1, \tfrac12)\), so \(\E[X_2 \,\vert\, X_1 = x_1] = \frac{x_1}{2}\), i.e. \(\E[X_2 \,\vert\, X_1] = \frac12 X_1\). Therefore, by the tower law, \(\E[X_2] = \E[\E[X_2\,\vert\,X_1]] = \E[\frac12 X_1] = \frac14 k\). Repeating this argument gives \(\E[X_n] = \frac1{2^n} k\), and so \begin{align*} && \sum_{i=1}^\infty \E[X_i] &= \sum_{i=1}^{\infty} \frac1{2^i} k \\ &&&= \frac{\frac12 k}{1-\frac12} = k \end{align*}
  2. Note that \(Y_1 = G - 1\), where \(G \sim \mathrm{Geo}(\tfrac12)\) counts the tosses up to and including the first tail, so \(Y_1\) has probability generating function \(\E[t^{Y_1}] = \E[t^{G-1}] = \frac1t \cdot \frac{\frac12 t}{1-\frac12 t} = \frac{\frac12}{1-\frac12 t} = \frac{1}{2-t}\). By independence of the \(Y_i\), \begin{align*} && G_Y(t) = \E \left [ t^Y \right] &= \E \left [ t^{\sum_{i=1}^kY_i} \right] \\ &&&= \prod_{i=1}^k \E[t^{Y_i}] \\ &&&= \frac{1}{(2-t)^k} \end{align*} Therefore \(\E[Y] = G_Y'(1) = k(2-1)^{-(k+1)} = k\) and \(\E[Y^2] = (tG_Y'(t))'\vert_{t=1} = k(k+1)(2-1)^{-(k+2)}+k(2-1)^{-(k+1)} = k^2+2k\), so \(\var[Y] = k^2+2k - k^2 = 2k\). Finally, expanding \(G_Y(t) = 2^{-k}(1-\tfrac{t}{2})^{-k}\) binomially gives \(\mathbb{P}(Y=r) = \binom{k+r-1}{r} \frac{1}{2^{r+k}}\)
[Note: this second distribution is a negative binomial distribution]
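Both parts lend themselves to a numerical check. A minimal Monte Carlo sketch in Python; \(k\), \(r\) and the trial count are arbitrary test choices:

```python
# Monte Carlo check of both parts; k, r and the trial count are arbitrary.
import numpy as np
from math import comb

rng = np.random.default_rng(1)
k, trials = 6, 10**5

# Part (i): Albert's chain of tosses -- the sum of all X_i should average k.
totals = np.empty(trials)
for t in range(trials):
    x, s = k, 0
    while x > 0:
        x = rng.binomial(x, 0.5)  # X_{n+1} | X_n = x  ~  B(x, 1/2)
        s += x
    totals[t] = s
print(totals.mean(), k)           # E(sum of X_i) = k

# Part (ii): Y_i = heads before the first tail = (Geometric(1/2) tosses) - 1.
y = rng.geometric(0.5, size=(trials, k)).sum(axis=1) - k
print(y.mean(), k)                # E(Y) = k
print(y.var(), 2 * k)             # Var(Y) = 2k
r = 3
print((y == r).mean(), comb(k + r - 1, r) / 2**(r + k))  # P(Y = r)
```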

2007 Paper 3 Q12
D: 1700.0 B: 1487.4

I choose a number at random from the integers \(1, 2, \ldots, (2n-1)\), each being equally likely, and the outcome is the random variable \(N\). Calculate \( \E(N)\) and \(\E(N^2)\). I then repeat a certain experiment \(N\) times, the outcome of the \(i\)th experiment being the random variable \(X_i\) (\(1\le i \le N\)). For each \(i\), the random variable \(X_i\) has mean \(\mu\) and variance \(\sigma^2\), and \(X_i\) is independent of \(X_j\) for \(i\ne j\) and also independent of \(N\). The random variable \(Y\) is defined by \(Y= \sum\limits_{i=1}^NX_i\). Show that \(\E(Y)=n\mu\) and that \(\mathrm{Cov}(Y,N) = \frac13n(n-1)\mu\). Find \(\var(Y) \) in terms of \(n\), \(\sigma^2\) and \(\mu\).


Solution: \begin{align*} && \E[N] &= \sum_{i=1}^{2n-1} \frac{i}{2n-1} \\ &&&= \frac{2n(2n-1)}{2(2n-1)} = n\\ && \E[N^2] &= \sum_{i=1}^{2n-1} \frac{i^2}{2n-1} \\ &&&= \frac{(2n-1)(2n)(4n-1)}{6(2n-1)} \\ &&&= \frac{n(4n-1)}{3} \\ && \var[N] &= \frac{n(4n-1)}{3} - n^2 \\ &&&= \frac{n^2-n}{3} \end{align*} \begin{align*} && \E[Y] &= \E \left [ \E \left [ \sum_{i=1}^N X_i \,\Big\vert\, N\right] \right]\\ &&&= \E \left[ N\mu \right] = n\mu \\ \\ && \mathrm{Cov}(Y,N) &= \mathbb{E}[NY] - \E[N]\E[Y] \\ &&&= \E \left [ \E \left [N \sum_{i=1}^N X_i \,\Big\vert\, N\right] \right] - n^2 \mu \\ &&&= \E[N^2\mu] - n^2 \mu \\ &&&= \left ( \frac{n(4n-1)}{3} - n^2 \right) \mu \\ &&&= \frac{n^2-n}{3}\mu \\ \\ && \E[Y^2] &= \E \left [ \E \left [ \Big ( \sum_{i=1}^N X_i \Big) ^2 \,\Big\vert\, N \right ] \right] \\ &&&= \E \left [ \E \left [ \sum_{i=1}^N X_i ^2 + \sum_{i\neq j} X_iX_j \,\Big\vert\, N \right ] \right] \\ &&&= \E \left [ N \E[X_1^2] + (N^2-N) \E[X_1]^2 \right] \\ &&&= \E \left [ N(\sigma^2 + \mu^2) + (N^2-N)\mu^2\right] \\ &&&= n(\sigma^2+\mu^2) + \left ( \frac{n(4n-1)}{3}-n \right)\mu^2 \\ &&&= n\sigma^2 + n\mu^2 + \frac{4n^2-4n}{3} \mu^2 \\ \Rightarrow && \var[Y] &= n\sigma^2 + n\mu^2 + \frac{4n^2-4n}{3} \mu^2 - n^2\mu^2 \\ &&&= n\sigma^2 + \frac{n^2-n}{3} \mu^2 \end{align*}
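A quick numerical check of the three results; \(n\), \(\mu\), \(\sigma\) are arbitrary test values, and the \(X_i\) are taken to be Normal purely for convenience (any distribution with mean \(\mu\) and variance \(\sigma^2\) would do):

```python
# Monte Carlo check of E(Y), Cov(Y,N) and Var(Y).
# n, mu, sigma and the Normal choice for X_i are arbitrary test assumptions.
import numpy as np

rng = np.random.default_rng(2)
n, mu, sigma, trials = 4, 1.5, 2.0, 2 * 10**5

N = rng.integers(1, 2 * n, size=trials)  # uniform on {1, ..., 2n-1}
Y = np.array([rng.normal(mu, sigma, size=m).sum() for m in N])

print(Y.mean(), n * mu)                                # E(Y) = n mu
print(np.cov(Y, N)[0, 1], (n**2 - n) / 3 * mu)         # Cov(Y, N)
print(Y.var(), n * sigma**2 + (n**2 - n) / 3 * mu**2)  # Var(Y)
```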

2003 Paper 1 Q14
D: 1500.0 B: 1475.2

Jane goes out with any of her friends who call, except that she never goes out with more than two friends in a day. The number of her friends who call on a given day follows a Poisson distribution with parameter \(2\). Show that the average number of friends she sees in a day is \(2-4\e^{-2}\). Now Jane has a new friend who calls on any given day with probability \(p\). Her old friends call as before, independently of the new friend. She never goes out with more than two friends in a day. Find the average number of friends she now sees in a day.
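No solution is given above for this one; as a quick check of the stated value \(2-4\e^{-2}\) (first part only), a minimal Monte Carlo sketch in Python:

```python
# Check the first-part answer: Jane sees min(calls, 2) friends a day,
# with calls ~ Poisson(2). Expect a mean of 2 - 4e^{-2} ~ 1.4587.
import numpy as np

rng = np.random.default_rng(3)
calls = rng.poisson(2, size=10**7)
print(np.minimum(calls, 2).mean(), 2 - 4 * np.exp(-2))
```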

2003 Paper 3 Q12
D: 1700.0 B: 1470.9

Brief interruptions to my work occur on average every ten minutes and the number of interruptions in any given time period has a Poisson distribution. Given that an interruption has just occurred, find the probability that I will have less than \(t\) minutes to work before the next interruption. If the random variable \(T\) is the time I have to work before the next interruption, find the probability density function of \(T\,\). I need an uninterrupted half hour to finish an important paper. Show that the expected number of interruptions before my first uninterrupted period of half an hour or more is \(\e^3-1\). Find also the expected length of time between interruptions that are less than half an hour apart. Hence write down the expected wait before my first uninterrupted period of half an hour or more.
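No solution is given above; the stated value \(\e^3-1\) can be checked by simulation, using the fact (from the first part) that the gaps between interruptions are Exponential with mean ten minutes:

```python
# Count interruptions before the first gap of at least half an hour;
# gaps are Exponential with mean 10 minutes. Expect e^3 - 1 ~ 19.09.
import numpy as np

rng = np.random.default_rng(4)
trials = 10**5
counts = np.empty(trials)
for t in range(trials):
    k = 0
    while rng.exponential(10.0) < 30.0:  # gap shorter than half an hour
        k += 1
    counts[t] = k
print(counts.mean(), np.exp(3) - 1)
```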

1999 Paper 3 Q14
D: 1700.0 B: 1487.9

In the basic version of Horizons (H1) the player has a maximum of \(n\) turns, where \(n \ge 1\). At each turn, she has a probability \(p\) of success, where \(0 < p < 1\). If her first success is at the \(r\)th turn, where \(1 \le r \le n\), she collects \(r\) pounds and then withdraws from the game. Otherwise, her winnings are nil. Show that in H1, her expected winnings are $$ p^{-1}\left[1+nq^{n+1}-(n+1)q^n\right]\quad\hbox{pounds}, $$ where \(q=1-p\). The rules of H2 are the same as those of H1, except that \(n\) is randomly selected from a Poisson distribution with parameter \(\lambda\). If \(n=0\) her winnings are nil. Otherwise she plays H1 with the selected \(n\). Show that in H2, her expected winnings are $$ \frac{1}{p}\left(1-\e^{-\lambda p}\right) - \lambda q \,\e^{-\lambda p} \quad\hbox{pounds}. $$


Solution: \begin{align*} && \E[H1] &= \sum_{r=1}^n r \cdot \mathbb{P}(\text{first success on }r\text{th turn}) \\ &&&= \sum_{r=1}^n r \cdot q^{r-1}p \\ &&&= p\sum_{r=1}^n r q^{r-1} \\ \\ && \frac{1-x^{n+1}}{1-x} &= \sum_{r=0}^n x^r \\ \Rightarrow && \sum_{r=1}^n r x^{r-1} &= \frac{-(n+1)x^n(1-x) +(1-x^{n+1})}{(1-x)^2} \\ &&&= \frac{1-(n+1)x^n+nx^{n+1}}{(1-x)^2} \\ \\ && \E[H1] &= p\sum_{r=1}^n r q^{r-1} \\ &&&= p\frac{1-(n+1)q^n+nq^{n+1}}{(1-q)^2} \\ &&&= p^{-1}(1-(n+1)q^{n} + nq^{n+1}) \end{align*} Note that if \(n = 0\), this formula gives \(\E[H1] = 0\), so it also covers the case of nil winnings. Hence, writing \(G_{Po(\lambda)}(t) = e^{\lambda(t-1)}\) for the Poisson generating function and using \(\E[Nq^N] = q\,G_{Po(\lambda)}'(q) = q\lambda e^{-\lambda(1-q)}\), \begin{align*} && \E[H2] &= \E\big[\E[H1 \,\vert\, N]\big] \\ &&&= p^{-1}\E \left [ 1-(N+1)q^{N} + Nq^{N+1}\right] \\ &&&= p^{-1}\E \left [ 1-((1-q)N+1)q^{N} \right] \\ &&&= p^{-1}\left (1 - p\E[Nq^N] - G_{Po(\lambda)}(q) \right) \\ &&&= p^{-1}(1-e^{-\lambda(1-q)}) - \E[Nq^N] \\ &&&= p^{-1}(1-e^{-\lambda(1-q)}) - q\lambda e^{-\lambda(1-q)} \\ &&&= p^{-1}(1-e^{-\lambda p}) - q\lambda e^{-\lambda p} \end{align*}
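A quick Monte Carlo check of the H2 result; \(\lambda\) and \(p\) are arbitrary test values:

```python
# Simulate H2 directly: draw n ~ Poisson(lambda), then play H1 with n turns.
# lambda and p are arbitrary test values, not from the question.
import numpy as np

rng = np.random.default_rng(5)
lam, p, trials = 3.0, 0.3, 2 * 10**5
q = 1 - p

n = rng.poisson(lam, size=trials)
winnings = np.zeros(trials)
for t in range(trials):
    for r in range(1, n[t] + 1):
        if rng.random() < p:   # first success at the r-th turn pays r pounds
            winnings[t] = r
            break              # she then withdraws from the game
print(winnings.mean(), (1 - np.exp(-lam * p)) / p - lam * q * np.exp(-lam * p))
```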

1989 Paper 2 Q15
D: 1600.0 B: 1484.0

Two points are chosen independently at random on the perimeter (including the diameter) of a semicircle of unit radius. What is the probability that exactly one of them lies on the diameter? Let the area of the triangle formed by the two points and the midpoint of the diameter be denoted by the random variable \(A\).

  1. Given that exactly one point lies on the diameter, show that the expected value of \(A\) is \(\left(2\pi\right)^{-1}\).
  2. Given that neither point lies on the diameter, show that the expected value of \(A\) is \(\pi^{-1}\). [You may assume that if two points are chosen at random on a line of length \(\pi\) units, the probability density function for the distance \(X\) between the two points is \(2\left(\pi-x\right)/\pi^{2}\) for \(0\leqslant x\leqslant\pi.\)]
Using these results, or otherwise, show that the expected value of \(A\) is \(\left(2+\pi\right)^{-1}\).


Solution:

  1. [TikZ diagram: one point on the diameter at signed position \(x\), the other on the arc at angle \(\theta\).]
    The point on the diameter lies at position \(x \in (-1,1)\) (density \(\tfrac12\)) and the point on the arc at angle \(\theta \in (0,\pi)\) (density \(\tfrac1\pi\)); the triangle has base \(\vert x \vert\) and height \(\sin\theta\). Hence \begin{align*} \mathbb{E}(A \mid \text{exactly one point on diameter}) &= \int_{-1}^1\int_0^\pi \frac12 \vert x\vert \sin\theta \,\frac{1}{\pi} \d \theta\, \frac{1}{2} \d x \\ &= \frac{1}{4\pi}\int_{-1}^1 \vert x \vert \d x \,\big[ -\cos \theta \big]_0^\pi \\ &= \frac{1}{4\pi} \cdot 1 \cdot 2 \\ &= \frac{1}{2\pi} \end{align*}
  2. [TikZ diagram: both points on the arc, separated by arc length \(x\).]
    \begin{align*} \mathbb{E}(A \mid \text{no point on diameter}) &= \int_0^{\pi} \frac12 \cdot 1 \cdot 1 \cdot \sin x \cdot 2(\pi - x)/\pi^2 \d x \\ &= \frac1{\pi^2} \int_0^\pi (\pi - x)\sin x \d x \\ &= \frac1{\pi^2} \int_0^\pi x\sin x \d x && \text{(substituting } x \mapsto \pi - x\text{)} \\ &= \frac1{\pi^2} \left [ \sin x - x \cos x \right]_0^{\pi} \\ &= \frac{1}{\pi} \end{align*}
If both points lie on the diameter the area of the triangle is \(0\). Therefore: \begin{align*} \mathbb{E}(A) &= \frac{1}{2\pi} \mathbb{P}(\text{exactly one point on diameter}) + \frac{1}{\pi}\mathbb{P}(\text{no points on diameter}) \\ &= \frac1{2\pi} \cdot \left (2 \cdot \frac{2}{2+\pi} \cdot \frac{\pi}{2+\pi} \right) + \frac{1}{\pi} \cdot \left ( \frac{\pi}{2+\pi} \cdot \frac{\pi}{2+\pi}\right) \\ &= \frac{1}{\pi} \frac{2\pi + \pi^2}{(2+\pi)^2} \\ &= \frac{1}{2+\pi} \end{align*}
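The final answer is easy to check by simulation: sample each point uniformly over the perimeter of length \(2+\pi\) and average the triangle area. A minimal sketch:

```python
# Monte Carlo for E(A); midpoint of the diameter at the origin.
# Expect E(A) = 1/(2 + pi) ~ 0.1945.
import numpy as np

rng = np.random.default_rng(6)
trials = 2 * 10**5

def point(u):
    """Map u in [0, 2 + pi) to a point: the diameter first, then the arc."""
    if u < 2.0:
        return (u - 1.0, 0.0)              # on the diameter
    theta = u - 2.0                        # arc length = angle (unit radius)
    return (np.cos(theta), np.sin(theta))

u = rng.uniform(0.0, 2.0 + np.pi, size=(trials, 2))
areas = np.empty(trials)
for t in range(trials):
    (ax, ay), (bx, by) = point(u[t, 0]), point(u[t, 1])
    areas[t] = 0.5 * abs(ax * by - ay * bx)  # area via the cross product
print(areas.mean(), 1 / (2 + np.pi))
```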