Solution:
Each of the independent random variables \(X_1, X_2, \ldots, X_n\) has the probability density function \(\mathrm{f}(x) = \frac{1}{2}\sin x\) for \(0 \leqslant x \leqslant \pi\) (and zero otherwise). Let \(Y\) be the random variable whose value is the maximum of the values of \(X_1, X_2, \ldots, X_n\).
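The question text beyond this setup is not shown, but the key standard fact is that \(\P(Y \le y) = F(y)^n\), where \(F(y) = \tfrac12(1-\cos y)\) is the common cumulative distribution function. A quick numerical sketch of this fact (the seed and parameter values are illustrative only), using inverse-transform sampling since \(F^{-1}(u) = \arccos(1-2u)\):
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000

# Inverse-transform sampling: F(x) = (1 - cos x)/2 on [0, pi],
# so F^{-1}(u) = arccos(1 - 2u).
x = np.arccos(1 - 2 * rng.random((trials, n)))
y = x.max(axis=1)

t = 2.0
print(np.mean(y <= t))             # empirical P(Y <= t)
print(((1 - np.cos(t)) / 2)**n)    # F(t)^n
\end{verbatim}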
The continuous random variable \(X\) has probability density function \[ f(x) = \begin{cases} \lambda e^{-\lambda x} & \text{for } x \geqslant 0, \\ 0 & \text{otherwise,} \end{cases} \] where \(\lambda\) is a positive constant. The random variable \(Y\) is the greatest integer less than or equal to \(X\), and \(Z = X - Y\).
Solution:
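The standard results here, stated without derivation as check targets: \(Y\) is geometric, with \(\P(Y=n) = e^{-\lambda n}(1-e^{-\lambda})\) for \(n = 0, 1, 2, \ldots\), and \(Z\) is a truncated exponential on \([0,1)\), with \(\P(Z \le z) = (1-e^{-\lambda z})/(1-e^{-\lambda})\). A minimal simulation sketch (the choice \(\lambda = 0.7\) is arbitrary):
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)
lam, trials = 0.7, 500_000

x = rng.exponential(scale=1/lam, size=trials)
y = np.floor(x)
z = x - y

# P(Y = n) = e^{-lam n} (1 - e^{-lam})  (geometric)
n = 2
print(np.mean(y == n), np.exp(-lam*n) * (1 - np.exp(-lam)))

# P(Z <= t) = (1 - e^{-lam t}) / (1 - e^{-lam})  (truncated exponential)
t = 0.4
print(np.mean(z <= t), (1 - np.exp(-lam*t)) / (1 - np.exp(-lam)))
\end{verbatim}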
Each of the two independent random variables \(X\) and \(Y\) is uniformly distributed on the interval~\([0,1]\).
Solution:
The lifetime of a fly (measured in hours) is given by the continuous random variable \(T\) with probability density function \(f(t)\) and cumulative distribution function \(F(t)\). The hazard function, \(h(t)\), is defined, for \(F(t) < 1\), by \[ h(t) = \frac{f(t)}{1-F(t)}\,. \]
Solution:
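One identity worth recording (a standard consequence of the definition, and likely the engine of any solution, though not necessarily the route this question takes): since \(f = F'\),
\[ h(t) = \frac{F'(t)}{1-F(t)} = -\frac{\d}{\d t}\ln\bigl(1-F(t)\bigr), \qquad\text{so}\qquad F(t) = 1 - \exp\left(-\int_0^t h(s)\,\d s\right) \]
(using \(F(0)=0\) for a lifetime). For example, the exponential lifetime \(f(t)=\lambda e^{-\lambda t}\) has constant hazard \(h(t)=\lambda\).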
The random variable \(X\) has probability density function \(f(x)\) (which you may assume is differentiable) and cumulative distribution function \(F(x)\) where \(-\infty < x < \infty \). The random variable \(Y\) is defined by \(Y= \e^X\). You may assume throughout this question that \(X\) and \(Y\) have unique modes.
Solution:
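A concrete instance to check any general formula against: by change of variables, \(Y\) has density \(g(y) = f(\ln y)/y\) for \(y>0\). Taking \(X\) standard normal (mode \(0\)), \(Y\) is lognormal with mode \(e^{-1}\), not \(e^0 = 1\); a numerical sketch:
\begin{verbatim}
import numpy as np
from scipy.stats import norm

# Change of variables: Y = e^X has density g(y) = f(ln y)/y.
# For X ~ N(0,1) the mode of Y is e^{-1}, not e^{mode of X} = 1.
y = np.linspace(0.05, 3, 10_000)
g = norm.pdf(np.log(y)) / y
print(y[np.argmax(g)], np.exp(-1))   # both approx 0.368
\end{verbatim}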
The probability density function \(\f(x)\) of the random variable \(X\) is given by $$\f(x) = k\left[{\phi}(x) + {\lambda}\g(x)\right]$$ where \({\phi}(x)\) is the probability density function of a normal variate with mean 0 and variance 1, \(\lambda \) is a positive constant, and \(\g(x)\) is a probability density function defined by \[ \g(x)= \begin{cases} 1/\lambda & \mbox{for \(0 \le x \le {\lambda}\)}\,;\\ 0& \mbox{otherwise} . \end{cases} \] Find \(\mu\), the mean of \(X\), in terms of \(\lambda\), and prove that \(\sigma\), the standard deviation of \(X\), satisfies $$\sigma^2 = \frac{\lambda^4 +4{\lambda}^3+12{\lambda}+12} {12(1 + \lambda )^2}\;.$$ In the case \(\lambda=2\):
Solution: \begin{align*} && 1 &= \int_{-\infty}^{\infty} f(x) \d x \\ &&&= k[1 + \lambda] \tag{\(\int\phi = \int g = 1\)} \\ \Rightarrow && k &= \frac{1}{1+\lambda} \\ \\ && \mu &= \int_{-\infty}^\infty x f(x) \d x \\ &&&= k \int_{-\infty}^\infty x \phi(x) \d x + k \lambda \int_{-\infty}^{\infty} x g(x) \d x \\ &&&= k \cdot 0 + k \lambda \cdot \frac{\lambda}{2} \\ &&&= \frac{\lambda^2}{2(1+\lambda)} \\ \\ && \E[X^2] &= \int_{-\infty}^\infty x^2 f(x) \d x \\ &&&= k \int_{-\infty}^\infty x^2 \phi(x) \d x + k \lambda \int_{-\infty}^{\infty} x^2 g(x) \d x \\ &&&= k \cdot 1 + k \lambda \int_0^{\lambda} \frac{x^2}{\lambda} \d x \\ &&&= k + \frac{k \lambda^3}{3} \\ &&&= \frac{3+\lambda^3}{3(1+\lambda)} \\ && \var[X] &= \frac{3+\lambda^3}{3(1+\lambda)} - \frac{\lambda^4}{4(1+\lambda)^2} \\ &&& = \frac{4(3+\lambda^3)(1+\lambda) - 3\lambda^4}{12(1+\lambda)^2} \\ &&&= \frac{\lambda^4+4\lambda^3+12\lambda + 12}{12(1+\lambda)^2} \end{align*}
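As a simulation sketch for the case \(\lambda = 2\) (where the formulas give \(\mu = \frac23\) and \(\sigma^2 = \frac79\)), sampling from the mixture directly:
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(2)
lam, trials = 2.0, 1_000_000
k = 1 / (1 + lam)

# f = k*phi + k*lam*g is a mixture: N(0,1) with weight k = 1/3,
# U(0, lam) with weight k*lam = 2/3.
use_normal = rng.random(trials) < k
x = np.where(use_normal, rng.standard_normal(trials),
             rng.uniform(0, lam, trials))

print(x.mean(), lam**2 / (2*(1 + lam)))   # mu      = 2/3
print(x.var(), 7/9)                       # sigma^2 = 7/9
\end{verbatim}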
Sketch the graph, for \(x \ge 0\,\), of $$ y = kx\e^{-ax^2} \;, $$ where \(a\) and \(k\) are positive constants. The random variable \(X\) has probability density function \(\f(x)\) given by \begin{equation*} \f(x)= \begin{cases} kx\e^{-ax^2} & \text{for \(0 \le x \le 1\)}\\[3pt] 0 & \text{otherwise}. \end{cases} \end{equation*} Show that \(\displaystyle k=\frac{2a}{1-\e^{-a}}\) and find the mode \(m\) in terms of \(a\,\), distinguishing between the cases \(a < \frac12\) and \(a > \frac12\,\). Find the median \(h\) in terms of \(a\), and show that \(h > m\) if \(a > -\ln\left(2\e^{-1/2} - 1\right).\) Show that \(-\ln\left(2\e^{-1/2}-1\right)> \frac12 \,\). Show also that, if \(a > -\ln\left(2\e^{-1/2} - 1\right) \,\), then $$ P(X > m \;\vert\; X < h) = {{2\e^{-1/2}-\e^{-a}-1} \over 1-\e^{-a}}\;. $$
Solution:
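A numerical sanity check of the quoted results, rather than a derivation: routine integration gives the median as \(h=\sqrt{-\ln\bigl((1+\e^{-a})/2\bigr)/a}\), and the sketch below takes \(a=1.6\), which exceeds \(-\ln\left(2\e^{-1/2}-1\right) \approx 1.546\) (and hence \(\frac12\)), so the mode is interior and \(h > m\) should hold. The parameter choice is illustrative only.
\begin{verbatim}
import numpy as np
from scipy.integrate import quad

a = 1.6
k = 2*a / (1 - np.exp(-a))
f = lambda x: k * x * np.exp(-a * x**2)

print(quad(f, 0, 1)[0])                    # normalisation: should be 1
m = 1 / np.sqrt(2*a)                       # mode (interior since a > 1/2)
h = np.sqrt(-np.log((1 + np.exp(-a))/2) / a)
print(quad(f, 0, h)[0])                    # median check: should be 0.5
print(h > m)                               # True for this a
print(quad(f, m, h)[0] / 0.5,              # P(X > m | X < h) ...
      (2*np.exp(-0.5) - np.exp(-a) - 1) / (1 - np.exp(-a)))
\end{verbatim}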
On \(K\) consecutive days each of \(L\) identical coins is thrown \(M\) times. For each coin, the probability of throwing a head in any one throw is \(p\) (where \(0 < p < 1\)). Show that the probability that on exactly \(k\) of these days more than \(l\) of the coins will each produce fewer than \(m\) heads can be approximated by \[ {K \choose k}q^k(1-q)^{K-k}, \] where \[ q=\Phi\left( \frac{2h-2l-1}{2\sqrt{h} }\right), \ \ \ \ \ \ h=L\Phi\left( \frac{2m-1-2Mp}{2\sqrt{ Mp(1-p)}}\right) \] and \(\Phi(\cdot)\) is the cumulative distribution function of a standard normal variate. Would you expect this approximation to be accurate in the case \(K=7\), \(k=2\), \(L=500\), \(l=4\), \(M=100\), \(m=48\) and \(p=0.6\;\)?
Solution: Let \(H_i\) be the number of heads thrown by the \(i\)th coin on a given day. Then \(H_i \sim B(M,p)\), and the probability that a given coin produces fewer than \(m\) heads is \(p_h = \P(H_i < m)\). Let \(C\) be the number of coins producing fewer than \(m\) heads on a given day; then \(C \sim B(L, p_h)\). The probability that more than \(l\) of the coins produce fewer than \(m\) heads is therefore \(\P(C > l)\). Finally, the probability that on exactly \(k\) days more than \(l\) of the coins will produce fewer than \(m\) heads is: \[ \binom{K}{k} \cdot \P(C > l)^k \cdot (1-\P(C > l))^{K-k} \] Let's start by assuming that all our binomials can be approximated by normal distributions: \(B(M,p) \approx N(Mp, Mp(1-p))\), and so \begin{align*} p_h &= \P(H_i < m) \\ &\approx \P( \sqrt{Mp(1-p)}Z+Mp < m-\frac12) \tag{continuity correction} \\ &= \P \l Z < \frac{2m-2Mp-1}{2\sqrt{Mp(1-p)}} \r \\ &= \Phi\l\frac{2m-2Mp-1}{2\sqrt{Mp(1-p)}} \r \end{align*} Similarly, \(B(L, p_h) \approx B \l L, \P \l Z < \frac{2m-2Mp-1}{2\sqrt{Mp(1-p)}} \r\r = B(L, \frac{h}{L}) \approx N(h, \frac{h(L-h)}{L})\). Therefore \begin{align*} \P(C > l) &= 1-\P(C \leq l) \\ &\approx 1- \P \l \sqrt{\frac{h(L-h)}{L}} Z + h \leq l+\frac12 \r \\ &= 1 - \P \l Z \leq \frac{2l-2h+1}{2\sqrt{\frac{h(L-h)}{L}}}\r \\ &= 1- \Phi\l \frac{2l-2h+1}{2\sqrt{\frac{h(L-h)}{L}}} \r \\ &= \Phi\l \frac{2h-2l-1}{2\sqrt{\frac{h(L-h)}{L}}} \r \end{align*} If we can approximate \(\sqrt{1-\frac{h}{L}}\) by \(1\), then we obtain the approximation in the question. Alternatively, \(B(L, \frac{h}{L}) \approx Po(h)\) and \(Po(h) \approx N(h,h)\), so we obtain: \begin{align*} \P(C > l) &= 1-\P(C \leq l) \\ &\approx 1 - \P(\sqrt{h} Z +h < l + \frac12) \\ &= 1 - \P \l Z < \frac{2l-2h+1}{2\sqrt{h}} \r \\ &= \Phi \l \frac{2h - 2l -1}{2\sqrt{h}}\r \end{align*} as required. [I think this is what the examiners expected.] Considering the case \(K=7\), \(k=2\), \(L=500\), \(l=4\), \(M=100\), \(m=48\) and \(p=0.6\): the first normal approximation depends on \(Mp\) and \(M(1-p)\) being large. They are \(60\) and \(40\) respectively, so this is likely a good approximation. The first approximation gives \begin{align*} h &= 500 \cdot \Phi \l \frac{2 \cdot 48 - 2 \cdot 60 - 1}{2\sqrt{24}} \r \\ &= 500 \cdot \Phi \l \frac{-25}{2 \sqrt{24}} \r \\ &\approx 500 \cdot \Phi (-2.5) \\ &= 500 \cdot 0.0062 \\ &\approx 3.1 \end{align*} The second binomial approximation will be good if \(500 \cdot \frac{3.1}{500} = 3.1\) is large, but this is quite small. Therefore, we shouldn't expect this to be a good approximation. However, since \(m = 48\) is far from the mean (in a normalised sense), we might expect the percentage error to be large. [Alternatively, using what I expect is the intended approach.] The approximation \(B(L, \frac{h}{L}) \approx Po(h)\) is acceptable since \(L > 50\) and \(h < 5\). The approximation \(Po(h) \approx N(h,h)\) is not acceptable since \(h\) is small (in particular \(h < 15\)). Finally, we can compute all these values exactly using a modern calculator. \[ \begin{array}{l|cc} & \text{correct} & \text{approx} \\ \hline p_h & 0.005760\ldots & 0.005362\ldots \\ \P(C > l) & 0.164522\ldots & 0.133319\ldots \\ \text{ans} & 0.231389\ldots & 0.182516\ldots \end{array} \] We can also see how the errors propagate by doing the calculations assuming the previous steps are correct, and also by including the Poisson step.
\[ \begin{array}{lccc} & \text{correct} & \text{approx} & \text{using approx } p_h \\ \hline p_h & 0.005760\ldots & 0.005362\ldots & - \\ \P(C > l)\quad [Po(h)] & 0.164522\ldots & 0.165044\ldots & 0.134293\ldots \\ \P(C > l)\quad [N(h,h)] & 0.164522\ldots & 0.169953\ldots & 0.133319\ldots \\ \P(C > l)\quad [N(h,h(1-\frac{h}{L}))] & 0.164522\ldots & 0.169255\ldots & 0.132677\ldots \\ \text{ans} & 0.231389\ldots & 0.231389\ldots \end{array} \] By doing this, we discover that the largest errors are coming not from the second approximation but from the small absolute (but large relative) error in the first approximation. This is, in fact, a coincidence; we can see this by investigating the specific values being used.
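For reference, the exact values quoted in these tables, and the question's approximation, can be reproduced in a few lines of scipy; a sketch:
\begin{verbatim}
import numpy as np
from scipy.stats import binom, norm

K, k, L, l, M, m, p = 7, 2, 500, 4, 100, 48, 0.6

# Exact chain of binomials.
p_h = binom.cdf(m - 1, M, p)     # P(H_i < m)   = 0.005760...
q   = binom.sf(l, L, p_h)        # P(C > l)     = 0.164522...
print(binom.pmf(k, K, q))        # exact answer = 0.231389...

# The approximation given in the question.
h = L * norm.cdf((2*m - 1 - 2*M*p) / (2*np.sqrt(M*p*(1 - p))))
q_apx = norm.cdf((2*h - 2*l - 1) / (2*np.sqrt(h)))
print(binom.pmf(k, K, q_apx))    # approx answer = 0.182516...
\end{verbatim}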
Let \(\F(x)\) be the cumulative distribution function of a random variable \(X\), which satisfies \(\F(a)=0\) and \(\F(b)=1\), where \(a>0\). Let \[ \G(y) = \frac{\F(y)}{2-\F(y)}\;. \] Show that \(\G(a)=0\,\), \(\G(b)=1\,\) and that \(\G'(y)\ge0\,\). Show also that \[ \frac12 \le \frac2{(2-\F(y))^2} \le 2\;. \] The random variable \(Y\) has cumulative distribution function \(\G(y)\,\). Show that \[ { \tfrac12} \,\E(X) \le \E(Y) \le 2 \E(X) \;, \] and that \[ \var(Y) \le 2\var(X) +\tfrac 74 \big(\E(X)\big)^2\;. \]
Solution: \begin{align*} && G(a) &= \frac{F(a)}{2-F(a)}\\ &&&= 0 \tag{\(F(a)= 0\)}\\ \\ && G(b) &= \frac{F(b)}{2-F(b)} \\ &&&= \frac{1}{2-1} = 1 \tag{\(F(b)=1\)}\\ \\ && G'(y) &= \frac{F'(y)(2-F(y))+F(y)F'(y)}{(2-F(y))^2} \\ &&&= \frac{2F'(y)}{(2-F(y))^2} \geq 0 \tag{\(F'(y) \geq 0\)} \end{align*} \begin{align*} && 0 \leq F(y)\leq1\\ \Leftrightarrow&& 1\leq 2-F(y) \leq 2\\ \Leftrightarrow &&1 \leq (2-F(y))^2 \leq 4\\ \Leftrightarrow && 1 \geq \frac{1}{(2-F(y))^2} \geq \frac14 \\ \Leftrightarrow && 2 \geq \frac{2}{(2-F(y))^2} \geq\frac12 \end{align*} Since \(a > 0\), both \(X\) and \(Y\) take positive values in \([a,b]\), so the integrands below are non-negative and the sandwich \(\frac12 \le \frac{2}{(2-F(y))^2} \le 2\) can be integrated directly: \begin{align*} && \E(Y) &= \int_a^b y G'(y) \d y \\ &&&= \int_a^b y F'(y) \underbrace{\frac{2}{(2-F(y))^2}}_{\in [\frac12, 2]} \d y \end{align*} so \(\tfrac12 \E(X) \le \E(Y) \le 2\E(X)\), and similarly \(\tfrac12 \E[X^2] \le \E[Y^2] \le 2\E[X^2]\). Therefore \begin{align*} && \var[Y] &= \E[Y^2]-\E[Y]^2 \\ &&& \leq 2 \E[X^2] - (\tfrac12\E[X])^2 \tag{\(\E[Y] \geq \tfrac12\E[X] \geq 0\)} \\ &&&= 2 \var[X] + \tfrac74(\E[X])^2 \end{align*}
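A concrete check of both bounds (a sketch with \(X\) uniform on \([1,2]\), so \(F(x)=x-1\), \(a=1>0\), \(b=2\) and \(G'(y)=2/(3-y)^2\); this choice of \(F\) is illustrative only):
\begin{verbatim}
import numpy as np
from scipy.integrate import quad

# X ~ U(1,2): F(y) = y - 1, so G(y) = (y-1)/(3-y), G'(y) = 2/(3-y)^2.
gp = lambda y: 2 / (3 - y)**2

EY  = quad(lambda y: y * gp(y), 1, 2)[0]
EY2 = quad(lambda y: y**2 * gp(y), 1, 2)[0]
EX, varX = 1.5, 1/12

print(0.5*EX <= EY <= 2*EX)                   # True
print(EY2 - EY**2 <= 2*varX + 1.75*EX**2)     # True
\end{verbatim}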
A needle of length two cm is dropped at random onto a large piece of paper ruled with parallel lines two cm apart.
Solution:
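The question text beyond the setup is not shown, but this is Buffon's needle with length equal to the line spacing, for which the classic crossing probability is \(2/\pi\). A simulation sketch of that fact (seed illustrative only):
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(3)
trials = 1_000_000

# Distance from the needle's centre to the nearest line ~ U(0,1)
# (lines 2 cm apart), acute angle to the lines ~ U(0, pi/2).
# The needle (half-length 1 cm) crosses iff distance <= sin(angle).
dist  = rng.uniform(0, 1, trials)
theta = rng.uniform(0, np.pi/2, trials)
print(np.mean(dist <= np.sin(theta)), 2/np.pi)   # both approx 0.6366
\end{verbatim}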
A goat \(G\) lies in a square field \(OABC\) of side \(a\). It wanders randomly round its field, so that at any time the probability of its being in any given region is proportional to the area of this region. Write down the probability that its distance, \(R\), from \(O\) is less than \(r\) if \(0 < r\leqslant a,\) and show that if \(r\geqslant a\) the probability is \[ \left(\frac{r^{2}}{a^{2}}-1\right)^{\frac{1}{2}}+\frac{\pi r^{2}}{4a^{2}}-\frac{r^{2}}{a^{2}}\cos^{-1}\left(\frac{a}{r}\right). \] Find the median of \(R\) and the probability density function of \(R\). The goat is then tethered to the corner \(O\) by a chain of length \(a\). Find the conditional probability that its distance from the fence \(OC\) is more than \(a/2\).
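A Monte Carlo sketch checking the \(r \geqslant a\) formula, with uniform points in the square and the illustrative values \(a=1\), \(r=1.2\):
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(4)
a, r, trials = 1.0, 1.2, 1_000_000

# G uniform on the square [0,a]^2; R is its distance from O.
pts = rng.uniform(0, a, (trials, 2))
R = np.hypot(pts[:, 0], pts[:, 1])

formula = (np.sqrt(r**2/a**2 - 1) + np.pi*r**2/(4*a**2)
           - (r**2/a**2)*np.arccos(a/r))
print(np.mean(R < r), formula)   # both approx 0.951
\end{verbatim}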