Solution:
Let \(X\) be a random variable with a Laplace distribution, so that its probability density function is given by \[ \f(x) = \frac12 \e^{-\vert x \vert }\;, \text{ \(-\infty < x < \infty \)}. \tag{\(*\)} \] Sketch \(\f(x)\). Show that its moment generating function \({\rm M}_X(\theta)\) is given by \({\rm M}_X(\theta)= (1-\theta^2)^{-1}\) and hence find the variance of \(X\). A frog is jumping up and down, attempting to land on the same spot each time. In fact, in each of \(n\) successive jumps he always lands on a fixed straight line but when he lands from the \(i\)th jump (\(i=1\,,2\,,\ldots\,,n\)) his displacement from the point from which he jumped is \(X_i\,\)cm, where \(X_i\) has the distribution \((*)\). His displacement from his starting point after \(n\) jumps is \(Y\,\)cm (so that \(Y=\sum\limits_{i=1}^n X_i\)). Each jump is independent of the others. Obtain the moment generating function for \(Y/ \sqrt {2n}\) and, by considering its logarithm, show that this moment generating function tends to \(\exp(\frac12\theta^2)\) as \(n\to\infty\). Given that \(\exp(\frac12\theta^2)\) is the moment generating function of the standard Normal random variable, estimate the least number of jumps such that there is a \(5\%\) chance that the frog lands 25 cm or more from his starting point.
Solution:
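As a numerical sanity check of the claimed MGF and of the final estimate (an illustrative Python sketch, not part of the written solution; it reads the "\(5\%\) chance" as two-sided, and the integration limits and step count are arbitrary choices):

```python
from math import ceil, exp

# Trapezoidal estimate of M_X(theta) = integral of (1/2) e^{-|x|} e^{theta x},
# which should equal 1/(1 - theta^2) for |theta| < 1.
def laplace_mgf(theta, lo=-40.0, hi=40.0, steps=200_000):
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        x = lo + i * h
        w = 0.5 if i in (0, steps) else 1.0
        total += w * 0.5 * exp(-abs(x)) * exp(theta * x)
    return total * h

mgf_half = laplace_mgf(0.5)        # should be close to 1/(1 - 0.25) = 4/3

# Final part, reading the 5% chance as two-sided: Y/sqrt(2n) is approximately
# N(0,1), so P(|Y| >= 25) = 0.05 when 25/sqrt(2n) = 1.96.
n_least = ceil((25 / 1.96) ** 2 / 2)   # 82
```

The least number of jumps comes out as \(n = 82\) under the two-sided reading; a one-sided reading with \(\Phi^{-1}(0.95) \approx 1.645\) would give a larger \(n\).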
Two coins \(A\) and \(B\) are tossed together. \(A\) has probability \(p\) of showing a head, and \(B\) has probability \(2p\), independent of \(A\), of showing a head, where \(0 < p < \frac12\). The random variable \(X\) takes the value 1 if \(A\) shows a head and it takes the value \(0\) if \(A\) shows a tail. The random variable \(Y\) takes the value 1 if \(B\) shows a head and it takes the value \(0\) if \(B\) shows a tail. The random variable \(T\) is defined by \[ T= \lambda X + {\textstyle\frac12} (1-\lambda)Y. \] Show that \(\E(T)=p\) and find an expression for \(\var(T)\) in terms of \(p\) and \(\lambda\). Show that as \(\lambda\) varies, the minimum of \(\var(T)\) occurs when \[ \lambda =\frac{1-2p}{3-4p}\;. \] The two coins are tossed \(n\) times, where \(n>30\), and \(\overline{T}\) is the mean value of \(T\). Let \(b\) be a fixed positive number. Show that the maximum value of \(\P\big(\vert \overline{T}-p\vert < b\big)\) as \(\lambda\) varies is approximately \(2\Phi(b/s)-1\), where \(\Phi\) is the cumulative distribution function of a standard normal variate and \[ s^2= \frac{p(1-p)(1-2p)}{(3-4p)n}\;. \]
Solution: \begin{align*} && \E[T] &= \E[\lambda X + \tfrac12(1-\lambda)Y] \\ &&&= \lambda \E[X] + \tfrac12(1-\lambda) \E[Y] \\ &&&= \lambda p + \tfrac12 (1-\lambda) 2p \\ &&&= p \\ \\ && \var[T] &= \var[\lambda X + \tfrac12(1-\lambda)Y] \\ &&&= \lambda^2 \var[X] + \tfrac14(1-\lambda)^2 \var[Y] \\ &&&= \lambda^2 p(1-p) + \tfrac14(1-\lambda)^22p(1-2p) \\ &&&= p(\lambda^2 + \tfrac12(1-\lambda)^2) -p^2(\lambda^2+(1-\lambda)^2)\\ &&&= p(\tfrac32\lambda^2 - \lambda + \tfrac12) -p^2(2\lambda^2 -2\lambda + 2) \end{align*} This is a quadratic in \(\lambda\) with positive leading coefficient \(p(\tfrac32 - 2p) > 0\), so its minimum occurs where the derivative vanishes: \begin{align*} && \frac{\d \var[T]}{\d \lambda} &= p(2\lambda -(1-\lambda)) - p^2(2 \lambda -2(1-\lambda)) \\ &&&= p(3\lambda - 1)-p^2(4\lambda - 2) = 0 \\ \Rightarrow && \lambda(4p-3) &= 2p-1 \\ \Rightarrow && \lambda &= \frac{1-2p}{3-4p} \end{align*} Since \(n > 30\), the central limit theorem gives \(\overline{T} \approx N(p, \frac{\sigma^2}{n})\), where \(\sigma^2 = \var[T]\). Hence \[ \mathbb{P}(|\overline{T} - p| < b) = \mathbb{P}\left( \left\lvert \frac{\overline{T}-p}{\sigma/\sqrt{n}} \right\rvert < \frac{b}{\sigma/\sqrt{n}} \right) = \mathbb{P}\left( |Z| < \frac{b\sqrt{n}}{\sigma} \right) = 2\Phi(b/s) - 1, \] where \(Z\) is standard normal and \(s = \sigma/\sqrt{n}\). This probability is maximised when \(s\) is smallest, i.e. when \(\var[T]\) is minimised, so we substitute \(\lambda = \frac{1-2p}{3-4p}\): \begin{align*} && s^2 &= \frac1n \sigma^2 \\ &&&= \frac1n \left ( \left (\left ( \frac{1-2p}{3-4p} \right)^2 + \tfrac12 \left (1-\frac{1-2p}{3-4p} \right)^2 \right)p - \left ( \left ( \frac{1-2p}{3-4p} \right)^2 + \left (1-\frac{1-2p}{3-4p} \right)^2\right)p^2 \right) \\ &&&= \frac1n \left ( \left (\left ( \frac{1-2p}{3-4p} \right)^2 + \tfrac12 \left (\frac{2-2p}{3-4p} \right)^2 \right)p - \left ( \left ( \frac{1-2p}{3-4p} \right)^2 + \left (\frac{2-2p}{3-4p} \right)^2\right)p^2 \right) \\ &&&= \frac{p}{n(3-4p)^2} \left ( (1 -4p + 4p^2 + 2-4p+2p^2) - (1-4p+4p^2+4-8p+4p^2)p \right) \\ &&&= \frac{p}{n(3-4p)^2} \left (3-13p+18p^2-8p^3 \right) \\ &&&= \frac{p}{n(3-4p)^2} (3-4p)(1-2p)(1-p) \\ &&&= \frac{p(1-p)(1-2p)}{(3-4p)n} \end{align*}
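As an illustrative numerical check of the optimal \(\lambda\) (a Python sketch with an arbitrary sample value of \(p\); not part of the written solution):

```python
# Var(T) as a function of lambda, using Var(X) = p(1-p) and Var(Y) = 2p(1-2p).
def var_T(lam, p):
    return lam**2 * p * (1 - p) + 0.25 * (1 - lam)**2 * 2 * p * (1 - 2 * p)

p = 0.3                              # arbitrary sample value in (0, 1/2)
lam_star = (1 - 2 * p) / (3 - 4 * p)
v_star = var_T(lam_star, p)
claimed = p * (1 - p) * (1 - 2 * p) / (3 - 4 * p)    # s^2 formula with n = 1

# The closed form matches, and nearby values of lambda do no better.
checks = all(var_T(lam_star + d, p) >= v_star for d in (-0.1, -0.01, 0.01, 0.1))
```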
A random variable \(X\) is distributed uniformly on \([\, 0\, , \, a\,]\). Show that the variance of \(X\) is \({1 \over 12} a^2\). A sample, \(X_1\) and \(X_2\), of two independent values of the random variable is drawn, and the variance \(V\) of the sample is determined. Show that \(V = {1 \over 4} ( X_1 -X_2 ) ^2\), and hence prove that \(2 V\) is an unbiased estimator of the variance of \(X\). Find an exact expression for the probability that the value of \(V\) is less than \({1 \over 12} a^2\) and estimate the value of this probability correct to one significant figure.
Solution: \begin{align*} && \E[X] &= \frac{a}{2}\tag{by symmetry} \\ &&\E[X^2] &= \int_0^a \frac{1}{a} x^2 \d x \\ &&&= \frac{a^3}{3a} = \frac{a^2}{3} \\ \Rightarrow && \var[X] &= \frac{a^2}{3} - \frac{a^2}{4} = \frac{a^2}{12} \end{align*} \begin{align*} && V &=\frac{1}{2} \left ( \left ( X_1 - \frac{X_1+X_2}{2} \right )^2+\left ( X_2- \frac{X_1+X_2}{2} \right )^2 \right ) \\ &&&= \frac{1}{8} ((X_1 - X_2)^2 + (X_2 - X_1)^2 ) \\ &&&= \frac14 (X_1-X_2)^2 \\ \\ && \E[2V] &= \E \left [ \frac12 (X_1 - X_2)^2 \right] \\ &&&= \frac12 \E[X_1^2] - \E[X_1X_2] + \frac12 \E[X_2^2] \\ &&&= \frac{a^2}{3} - \E[X_1]\E[X_2] \tag{by independence} \\ &&&= \frac{a^2}{3} - \frac{a^2}{4} = \frac{a^2}{12} \end{align*} Therefore \(2V\) is an unbiased estimator of the variance of \(X\). For the final part, \(V < \frac{1}{12}a^2\) if and only if \((X_1-X_2)^2 < \frac{a^2}{3}\), i.e. \(\vert X_1 - X_2 \vert < \frac{a}{\sqrt3}\). Geometrically, the complementary event consists of two right-angled triangles of the square \([0,a]^2\), each with legs of length \(a\left(1 - \frac1{\sqrt3}\right)\), so \[ \P\left(V < \tfrac{1}{12}a^2\right) = 1 - \left(1 - \frac{1}{\sqrt3}\right)^2 = \frac{2}{\sqrt3} - \frac13 \approx 0.8. \]
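The probability in the last part has closed form \(\frac{2}{\sqrt3} - \frac13 \approx 0.82\); it can be checked by integrating the indicator of the event over a fine grid (illustrative Python sketch with \(a = 1\); the grid size is an arbitrary choice):

```python
from math import sqrt

# Midpoint grid on the unit square: count cells where V = (x1-x2)^2/4 < 1/12.
steps = 1000
count = 0
for i in range(steps):
    for j in range(steps):
        x1 = (i + 0.5) / steps
        x2 = (j + 0.5) / steps
        if (x1 - x2) ** 2 / 4 < 1 / 12:
            count += 1

prob = count / steps**2             # about 0.82, i.e. 0.8 to 1 s.f.
exact = 2 / sqrt(3) - 1 / 3         # closed form, for comparison
```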
The random variables \(X_1\), \(X_2\), \(\ldots\) , \(X_{2n+1}\) are independently and uniformly distributed on the interval \(0 \le x \le 1\). The random variable \(Y\) is defined to be the median of \(X_1\), \(X_2\), \(\ldots\) , \(X_{2n+1}\). Given that the probability density function of \(Y\) is \(\g(y)\), where \[ \mathrm{g}(y)=\begin{cases} ky^{n}(1-y)^{n} & \mbox{ if }0\leqslant y\leqslant1\\ 0 & \mbox{ otherwise} \end{cases} \] use the result $$ \int_0^1 {y^{r}}{{(1-y)}^{s}}\,\d y = \frac{r!s!}{(r+s+1)!} $$ to show that \(k={(2n+1)!}/{{(n!)}^2}\), and evaluate \(\E(Y)\) and \({\rm Var}\,(Y)\). Hence show that, for any given positive number \(d\), the inequality $$ {\P\left({\vert {Y - 1/2} \vert} < {d/{\sqrt {n}}} \right)} < {\P\left({\vert {{\bar X} - 1/2} \vert} < {d/{\sqrt {n}}} \right)} $$ holds provided \(n\) is large enough, where \({\bar X}\) is the mean of \(X_1\), \(X_2\), \(\ldots\) , \(X_{2n+1}\). [You may assume that \(Y\) and \(\bar X\) are normally distributed for large \(n\).]
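The stated constant and the required moments can be checked numerically for a small case. Here \(Y\) has a Beta\((n+1,n+1)\) distribution, and the given integral result yields \(\E(Y) = \frac12\) and \(\mathrm{Var}(Y) = \frac{1}{4(2n+3)}\); this illustrative Python sketch (not part of the required working) verifies those values for \(n = 3\):

```python
from math import factorial

n = 3                                # arbitrary small case
k = factorial(2 * n + 1) // factorial(n) ** 2    # (2n+1)!/(n!)^2 = 140

# Midpoint-rule integrals of k y^{n+power} (1-y)^n over [0, 1].
def moment(power, steps=100_000):
    h = 1.0 / steps
    total = 0.0
    for i in range(steps):
        y = (i + 0.5) * h
        total += k * y ** (n + power) * (1 - y) ** n
    return total * h

mass = moment(0)                     # should be 1: g is a density
mean = moment(1)                     # should be E(Y) = 1/2
var = moment(2) - mean ** 2          # should be 1/(4(2n+3)) = 1/36
```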
An experiment produces a random number \(T\) uniformly distributed on \([0,1]\). Let \(X\) be the larger root of the equation \[x^{2}+2x+T=0.\] What is the probability that \(X>-1/3\)? Find \(\mathbb{E}(X)\) and show that \(\mathrm{Var}(X)=1/18\). The experiment is repeated independently 800 times, generating the larger roots \(X_{1}, X_{2}, \dots, X_{800}\). If \[Y=X_{1}+X_{2}+\dots+X_{800},\] find an approximate value for \(K\) such that \[\mathrm{P}(Y\leqslant K)=0.08.\]
Solution: Completing the square, \((x+1)^2 = 1-T\), so the larger root is \(X = -1 + \sqrt{1-T}\). \begin{align*} && \mathbb{P}(X > -1/3) &= \mathbb{P}(-1 + \sqrt{1-T} > -1/3) \\ &&&= \mathbb{P}(\sqrt{1-T} > 2/3)\\ &&&= \mathbb{P}(1-T > 4/9)\\ &&&= \mathbb{P}\left (T < \frac59 \right) = \frac59 \end{align*} Similarly, for \(t \in [-1,0]\) \begin{align*} && \mathbb{P}(X \leq t) &= \mathbb{P}(-1 + \sqrt{1-T} \leq t) \\ &&&= \mathbb{P}(\sqrt{1-T} \leq t+1)\\ &&&= \mathbb{P}(1-T \leq (t+1)^2)\\ &&&= \mathbb{P}\left (T \geq 1-(t+1)^2\right) = (t+1)^2 \\ \Rightarrow && f_X(t) &= 2(t+1) \\ \Rightarrow && \E[X] &= \int_{-1}^0 x \cdot f_X(x) \d x \\ &&&= \int_{-1}^0 x2(x+1) \d x \\ &&&= \left [\frac23x^3+x^2 \right]_{-1}^0 \\ &&&= -\frac13 \\ && \E[X^2] &= \int_{-1}^0 x^2 \cdot f_X(x) \d x \\ &&&= \int_{-1}^0 2x^2(x+1) \d x \\ &&&= \left [ \frac12 x^4 + \frac23x^3\right]_{-1}^0 \\ &&&= \frac16 \\ \Rightarrow && \var[X] &= \E[X^2] - \left (\E[X] \right)^2 \\ &&&= \frac16 - \frac19 = \frac1{18} \end{align*} By the central limit theorem, \(Y \approx N\left(-\tfrac{800}{3},\, \tfrac{800}{18}\right)\), and \(\Phi^{-1}(0.08) \approx -1.4 \approx -\sqrt{2}\). Therefore \[ K \approx -\frac{800}{3} - \sqrt{2}\sqrt{\frac{800}{18}} \approx -267 - 9 = -276. \]
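The final estimate can be checked against an exact normal quantile instead of the \(\Phi^{-1}(0.08) \approx -\sqrt2\) approximation (illustrative Python sketch, not part of the written solution):

```python
from math import sqrt
from statistics import NormalDist

# Under the CLT approximation, Y ~ N(-800/3, 800/18).
mu = 800 * (-1 / 3)
sigma = sqrt(800 / 18)
K = NormalDist(mu, sigma).inv_cdf(0.08)          # about -276
```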
The random variable \(X\) is uniformly distributed on \([0,1]\). A new random variable \(Y\) is defined by the rule \[ Y=\begin{cases} 1/4 & \mbox{ if }X\leqslant1/4,\\ X & \mbox{ if }1/4\leqslant X\leqslant3/4\\ 3/4 & \mbox{ if }X\geqslant3/4. \end{cases} \] Find \({\mathrm E}(Y^{n})\) for all integers \(n\geqslant 1\). Show that \({\mathrm E}(Y)={\mathrm E}(X)\) and that \[{\mathrm E}(X^{2})-{\mathrm E}(Y^{2})=\frac{1}{24}.\] By using the fact that \(4^{n}=(3+1)^{n}\), or otherwise, show that \({\mathrm E}(X^{n}) > {\mathrm E}(Y^{n})\) for \(n\geqslant 2\). Suppose that \(Y_{1}\), \(Y_{2}\), \dots are independent random variables each having the same distribution as \(Y\). Find, to a good approximation, \(K\) such that \[{\rm P}(Y_{1}+Y_{2}+\cdots+Y_{240000} < K)=3/4.\]
Solution: On \(\left(\frac14, \frac34\right)\) the random variable \(Y\) has density 1 (the same as \(X\)), and \(\mathbb{P}\left(Y=\frac14\right)=\mathbb{P}\left(Y=\frac34\right)=\frac14\), so \begin{align*} && \E[Y^n] &= \frac14 \cdot \frac1{4^n} + \frac14 \cdot \frac{3^n}{4^n} + \int_{1/4}^{3/4} y^n \d y \\ &&&= \frac{3^n+1}{4^{n+1}} + \left [ \frac{y^{n+1}}{n+1} \right]_{1/4}^{3/4} \\ &&&= \frac{3^n+1}{4^{n+1}} + \frac{3^{n+1}-1}{(n+1)4^{n+1}} \end{align*} \begin{align*} && \E[Y] &= \frac{3+1}{16} + \frac{9-1}{2 \cdot 16} \\ &&&= \frac{1}{4} + \frac{1}{4} = \frac12 = \E[X] \end{align*} \begin{align*} && \E[X^2] &= \int_0^1 x^2 \d x = \frac13 \\ && \E[Y^2] &= \frac{9+1}{64} + \frac{27-1}{3 \cdot 64} = \frac{56}{3 \cdot 64} = \frac{7}{24} \\ \Rightarrow && \E[X^2] - \E[Y^2] &= \frac13 - \frac{7}{24} = \frac{1}{24} \end{align*} \begin{align*} && \E[X^n] &= \frac{1}{n+1} \\ && \E[Y^n] &= \frac{1}{n+1} \frac{1}{4^{n+1}}\left ( (n+1)(3^n+1)+3^{n+1}-1 \right) \\ &&&= \frac{1}{n+1} \frac{1}{4^{n+1}}\left ( 3^{n+1} + (n+1)3^n +n \right) \end{align*} so it suffices to show that \(4^{n+1} > 3^{n+1} + (n+1)3^n + n\) for \(n \geqslant 2\). Expanding binomially, for \(n \geqslant 2\), \begin{align*} && (3+1)^{n+1} &= 3^{n+1} + (n+1)3^n + \cdots + (n+1) \cdot 3 + 1 \\ &&&> 3^{n+1} + (n+1)3^n + n + 1 \end{align*} since the last two terms alone exceed \(n+1\). Hence \(\E[X^n] > \E[Y^n]\) for \(n \geqslant 2\). Finally, \(\var[Y] = \frac{7}{24} - \frac14 = \frac1{24}\), so by the central limit theorem \[ \sum_{i=1}^{240\,000} Y_i \approx N\left(120\,000,\; \frac{240\,000}{24}\right), \] i.e. mean \(120\,000\) and standard deviation \(\sqrt{10\,000} = 100\). Since \(\Phi^{-1}(3/4) \approx 0.674\), \[ K \approx 120\,000 + 0.674 \cdot 100 \approx 120\,067. \]
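The final estimate can be checked with an exact normal quantile rather than a rounded \(\Phi^{-1}(3/4)\) (illustrative Python sketch, not part of the written solution):

```python
from math import sqrt
from statistics import NormalDist

var_Y = 7 / 24 - (1 / 2) ** 2                    # Var(Y) = 1/24
sd_sum = sqrt(240_000 * var_Y)                   # = 100
K = NormalDist(120_000, sd_sum).inv_cdf(0.75)    # about 120067
```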
Suppose \(X\) is a random variable with probability density \[ \mathrm{f}(x)=Ax^{2}\exp(-x^{2}/2) \] for \(-\infty < x < \infty.\) Find \(A\). You belong to a group of scientists who believe that the outcome of a certain experiment is a random variable with the probability density just given, while other scientists believe that the probability density is the same except with different mean (i.e. the probability density is \(\mathrm{f}(x-\mu)\) with \(\mu\neq0\)). In each of the following two cases decide whether the result given would shake your faith in your hypothesis, and justify your answer.
Solution: Let \(Z \sim N(0,1)\), whose pdf is \(\frac{1}{\sqrt{2\pi}} \exp(-x^2/2)\). Since \(\mathrm{f}\) must integrate to 1, \begin{align*} && 1 &= \int_{-\infty}^\infty Ax^2 \exp(-x^2/2) \d x \\ &&&= A\sqrt{2\pi} \int_{-\infty}^\infty x^2 \frac{1}{\sqrt{2\pi}} \exp(-x^2/2) \d x \\ &&&= A\sqrt{2\pi} \E[Z^2] = A\sqrt{2\pi} \\ \Rightarrow && A &= \frac{1}{\sqrt{2\pi}} \end{align*}
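A quick numerical check (illustrative Python sketch; the limits and step count are arbitrary choices) that this value of \(A\) normalises the density:

```python
from math import exp, pi, sqrt

A = 1 / sqrt(2 * pi)

# Midpoint-rule integral of A x^2 exp(-x^2/2) over [-20, 20]; the tails
# beyond +/-20 are negligible.
steps, lo, hi = 200_000, -20.0, 20.0
h = (hi - lo) / steps
total = 0.0
for i in range(steps):
    x = lo + (i + 0.5) * h
    total += A * x * x * exp(-x * x / 2)
total *= h                          # should be close to 1
```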
Two computers, LEP and VOZ are programmed to add numbers after first approximating each number by an integer. LEP approximates the numbers by rounding: that is, it replaces each number by the nearest integer. VOZ approximates by truncation: that is, it replaces each number by the largest integer less than or equal to the number. The fractional parts of the numbers to be added are uniformly and independently distributed. (The fractional part of a number \(a\) is \(a-\left\lfloor a\right\rfloor ,\) where \(\left\lfloor a\right\rfloor \) is the largest integer less than or equal to \(a\).) Both computers approximate and add 1500 numbers. For each computer, find the probability that the magnitude of error in the answer will exceed 15. How many additions can LEP perform before the probability that the magnitude of error is less than 10 drops below 0.9?
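The required calculations can be sketched numerically (an illustrative Python outline, not a model solution; it assumes rounding errors are uniform on \((-\frac12,\frac12]\), truncation errors are minus a uniform fractional part on \([0,1)\), and uses the normal approximation throughout):

```python
from math import sqrt
from statistics import NormalDist

Z = NormalDist()

# LEP: each rounding error has mean 0 and variance 1/12, so the total error
# of 1500 additions is approximately N(0, 1500/12).
sd = sqrt(1500 / 12)
p_lep = 2 * (1 - Z.cdf(15 / sd))                 # about 0.18

# VOZ: each truncation error has mean -1/2 and variance 1/12, so the total
# is approximately N(-750, 1500/12); |error| > 15 is then essentially certain.
p_voz = Z.cdf((-15 + 750) / sd) + (1 - Z.cdf((15 + 750) / sd))

# Final part: P(|error| < 10) >= 0.9 needs 10/sqrt(n/12) >= Phi^{-1}(0.95),
# i.e. n <= 1200 / Phi^{-1}(0.95)^2, giving n at most about 443.
n_max = int(1200 / Z.inv_cdf(0.95) ** 2)
```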
The average number of pedestrians killed annually in road accidents in Poldavia during the period 1974-1989 was 1080 and the average number killed annually in commercial flight accidents during the same period was 180. Discuss the following newspaper headlines which appeared in 1991. (The percentage figures in square brackets give a rough indication of the weight of marks attached to each discussion.)
Solution:
A fair coin is thrown \(n\) times. On each throw, 1 point is scored for a head and 1 point is lost for a tail. Let \(S_{n}\) be the points total for the series of \(n\) throws, i.e. \(S_{n}=X_{1}+X_{2}+\cdots+X_{n},\) where \[ X_{j}=\begin{cases} 1 & \text{ if the }j \text{ th throw is a head}\\ -1 & \text{ if the }j\text{ th throw is a tail.} \end{cases} \]
Solution: Notice that \(\mathbb{E}(X_i) = 0\) and \(\mathbb{E}(X_i^2) = 1\), so by the independence of the throws \(\mathbb{E}(S_n) = 0\) and \(\textrm{Var}(S_n) = n\).
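These moments can be confirmed by exhaustive enumeration for a small number of throws (illustrative sketch, not part of the solution; \(n = 10\) is an arbitrary choice):

```python
from itertools import product

# Enumerate all 2^n equally likely sequences of +1/-1 scores.
n = 10
totals = [sum(seq) for seq in product((1, -1), repeat=n)]
mean = sum(totals) / 2**n                        # E(S_n) = 0
var = sum(t * t for t in totals) / 2**n          # Var(S_n) = n
```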