In this question, you may assume that \(\displaystyle \int_0^\infty \!\!\! \e^{-x^2/2} \d x = \sqrt{\tfrac12 \pi}\,\). The number of supermarkets situated in any given region can be modelled by a Poisson random variable, where the mean is \(k\) times the area of the given region. Find the probability that there are no supermarkets within a circle of radius \(y\). The random variable \(Y\) denotes the distance between a randomly chosen point in the region and the nearest supermarket. Write down \(\P(Y < y)\) and hence show that the probability density function of \(Y\) is \(\displaystyle 2\pi y k \e^{-\pi k y^2}\) for \(y\ge0\). Find \(\E(Y)\) and show that \(\var(Y) = \dfrac{4-\pi}{4\pi k}\).
Solution: A circle of radius \(y\) contains a number of supermarkets \(X\) where \(X \sim Po(k \pi y^2)\). \[ \mathbb{P}(X = 0) = e^{-k\pi y^2} \frac{(k\pi y^2)^0}{0!} = e^{-k\pi y^2} \] Since \(Y \geq y\) precisely when there is no supermarket within distance \(y\) of the chosen point, \(\mathbb{P}(Y < y) = 1-\mathbb{P}(Y \geq y) = 1-e^{-k\pi y^2}\), and hence \(f_Y(y) = 2k\pi y e^{-k\pi y^2}\) for \(y \ge 0\) (by differentiating). \begin{align*} && \mathbb{E}(Y) &= \int_0^\infty yf_Y(y) \d y \\ &&&= \int_0^\infty 2\pi y^2 k e^{-\pi k y^2} \d y \\ \sigma^2 = \frac{1}{2k\pi}:&&&= \pi k \sqrt{2 \pi}\sigma \int_{-\infty}^\infty \frac{1}{\sqrt{2 \pi} \sigma }y^2 e^{-\frac12 \cdot 2\pi k y^2} \d y \\ &&&=\pi k \sqrt{2 \pi}\sigma \mathbb{E}\left (N(0, \sigma^2)^2 \right) \\ &&&= \pi k \sqrt{2 \pi}\sigma\sigma^2 \\ &&&= \pi k \sqrt{2 \pi} \frac{1}{(2k\pi)^{3/2}} \\ &&&= \frac{1}{2\sqrt{k}} \end{align*} \begin{align*} && \mathbb{E}(Y^2) &= \int_0^\infty y^2f_Y(y) \d y \\ &&&= \int_0^\infty 2\pi y^3 k e^{-\pi k y^2} \d y \\ &&&= \int_0^{\infty}y^2 \cdot 2y \pi k e^{-\pi k y^2} \d y \\ \textrm{by parts}:&&&= \left [-y^2 e^{-\pi k y^2}\right]_0^{\infty}+\int_0^\infty 2ye^{-\pi k y^2} \d y \\ &&&= \left [-\frac{1}{\pi k}e^{-\pi k y^2} \right]_0^{\infty} \\ &&&= \frac{1}{\pi k} \\ \Rightarrow && \textrm{Var}(Y) &= \mathbb{E}(Y^2) - \left [ \mathbb{E}(Y)\right]^2 \\ &&&= \frac{1}{\pi k} - \frac{1}{4k} \\ &&&= \frac{4 - \pi}{4\pi k} \end{align*}
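As a numerical sanity check (not part of the required argument), \(Y\) can be sampled by inverting the CDF \(F_Y(y) = 1 - e^{-\pi k y^2}\), giving \(Y = \sqrt{-\ln(1-U)/(\pi k)}\) for \(U \sim U(0,1)\). The choice \(k=1\) below is an arbitrary illustration.

```python
import math
import random

# Monte Carlo check of E(Y) = 1/(2*sqrt(k)) and Var(Y) = (4 - pi)/(4*pi*k).
# k = 1 is an arbitrary illustrative choice.
random.seed(0)
k = 1.0
n = 200_000

# Inverse-CDF sampling: F(y) = 1 - exp(-pi*k*y^2)  =>  y = sqrt(-ln(1-U)/(pi*k))
samples = [math.sqrt(-math.log(1.0 - random.random()) / (math.pi * k))
           for _ in range(n)]

mean = sum(samples) / n
var = sum((y - mean) ** 2 for y in samples) / n

expected_mean = 1.0 / (2.0 * math.sqrt(k))            # = 0.5
expected_var = (4.0 - math.pi) / (4.0 * math.pi * k)  # ~ 0.068
```

Both sample moments should agree with the derived values to within Monte Carlo error.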
In this question, you may use without proof the following result: \[ \int \sqrt{4-x^2}\, \d x = 2 \arcsin (\tfrac12 x ) + \tfrac 12 x \sqrt{4-x^2} +c\,. \] A random variable \(X\) has probability density function \(\f\) given by \[ \f(x) = \begin{cases} 2k & -a\le x <0 \\[3mm] k\sqrt{4-x^2} & \phantom{-} 0\le x \le 2 \\[3mm] 0 & \phantom{-}\text{otherwise}, \end{cases} \] where \(k\) and \(a\) are positive constants.
Solution: First notice that \begin{align*} && 1 &= \int_{-a}^2 f(x) \d x \\ &&&= 2ka + k\int_0^2 \sqrt{4-x^2}\, \d x \\ &&&= 2ka + k\pi \tag{using the given result: \(2\arcsin 1 = \pi\)} \\ \Rightarrow && k &= (\pi + 2a)^{-1} \end{align*}
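A quick numerical check of the normalisation, with the arbitrary illustrative choice \(a=1\) (so \(k = 1/(\pi+2)\)): a midpoint Riemann sum of \(f\) over \([-a,2]\) should be close to 1.

```python
import math

# Numerical check that f integrates to 1 when k = 1/(pi + 2a).
# a = 1 is an arbitrary illustrative choice.
a = 1.0
k = 1.0 / (math.pi + 2.0 * a)

def f(x):
    if -a <= x < 0.0:
        return 2.0 * k
    if 0.0 <= x <= 2.0:
        return k * math.sqrt(4.0 - x * x)
    return 0.0

# Midpoint rule on [-a, 2]
n = 100_000
h = (2.0 + a) / n
total = sum(f(-a + (i + 0.5) * h) for i in range(n)) * h
```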
What property of a distribution is measured by its skewness?
Solution: Skewness measures the asymmetry of a distribution (the lack of symmetry about the mean): how much probability mass lies on one side of the distribution rather than the other.
A continuous random variable \(X\) has probability density function given by \[ \f(x) = \begin{cases} 0 & \mbox{for } x<0 \\ k\e^{-2 x^2} & \mbox{for } 0\le x< \infty \;,\\ \end{cases} \] where \(k\) is a constant.
Solution:
Solution:
The random variable \(X\) has a continuous probability density function \(\f(x)\) given by \begin{equation*} \f(x) = \begin{cases} 0 & \text{for } x \le 1 \\ \ln x & \text{for } 1\le x \le k\\ \ln k & \text{for } k\le x \le 2k\\ a-bx & \text{for } 2k \le x \le 4k \\ 0 & \text{for } x\ge 4k \end{cases} \end{equation*} where \(k\), \(a\) and \(b\) are constants.
Solution:
Sketch the graph of \[ y= \dfrac1 { x \ln x} \text{ for \(x>0\), \(x\ne1\)}.\] You may assume that \(x\ln x \to 0\) as \(x\to 0\). The continuous random variable \(X\) has probability density function \[ \f(x) = \begin{cases} \dfrac \lambda {x\ln x}& \text{for \(a\le x \le b\)}\;, \\[3mm] \ \ \ 0 & \text{otherwise }, \end{cases} \] where \(a\), \(b\) and \(\lambda\) are suitably chosen constants.
Solution:
The probability density function \(\f(x)\) of the random variable \(X\) is given by $$\f(x) = k\left[{\phi}(x) + {\lambda}\g(x)\right]$$ where \({\phi}(x)\) is the probability density function of a normal variate with mean 0 and variance 1, \(\lambda \) is a positive constant, and \(\g(x)\) is a probability density function defined by \[ \g(x)= \begin{cases} 1/\lambda & \mbox{for \(0 \le x \le {\lambda}\)}\,;\\ 0& \mbox{otherwise} . \end{cases} \] Find \(\mu\), the mean of \(X\), in terms of \(\lambda\), and prove that \(\sigma\), the standard deviation of \(X\), satisfies $$\sigma^2 = \frac{\lambda^4 +4{\lambda}^3+12{\lambda}+12} {12(1 + \lambda )^2}\;.$$ In the case \(\lambda=2\):
Solution: \begin{align*} && 1 &= \int_{-\infty}^{\infty} f(x) \d x \\ &&&= k[1 + \lambda] \\ \Rightarrow && k &= \frac{1}{1+\lambda} \\ \\ && \mu &= \int_{-\infty}^\infty x f(x) \d x \\ &&&= k \int_{-\infty}^\infty x \phi(x) \d x + k \lambda \int_{-\infty}^{\infty} x g(x) \d x \\ &&&= k \cdot 0 + k \lambda \cdot \frac{\lambda}{2} \\ &&&= \frac{\lambda^2}{2(1+\lambda)} \\ \\ && \E[X^2] &= \int_{-\infty}^\infty x^2 f(x) \d x \\ &&&= k \int_{-\infty}^\infty x^2 \phi(x) \d x + k \lambda \int_{-\infty}^{\infty} x^2 g(x) \d x \\ &&&= k \cdot 1 + k \lambda \int_0^{\lambda} \frac{x^2}{\lambda} \d x \\ &&&= k + \frac{k \lambda^3}{3} \\ &&&= \frac{3+\lambda^3}{3(1+\lambda)} \\ && \var[X] &= \frac{3+\lambda^3}{3(1+\lambda)} - \frac{\lambda^4}{4(1+\lambda)^2} \\ &&& = \frac{(3+\lambda^3)4(1+\lambda) - 3\lambda^4}{12(1+\lambda)^2} \\ &&&= \frac{\lambda^4+4\lambda^3+12\lambda + 12}{12(1+\lambda)^2} \end{align*}
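A numerical sanity check: \(f\) is a mixture giving weight \(k = 1/(1+\lambda)\) to \(N(0,1)\) and weight \(k\lambda\) to \(U(0,\lambda)\), so it can be sampled directly. The case \(\lambda = 2\) (as in the question) gives \(\mu = 2/3\) and \(\sigma^2 = 7/9\).

```python
import random

# Monte Carlo check of the mean and variance formulas for the mixture
# f = k*(phi + lam*g): with probability k = 1/(1+lam) draw N(0,1),
# otherwise draw U(0, lam).  lam = 2 as in the question.
random.seed(1)
lam = 2.0
k = 1.0 / (1.0 + lam)
n = 200_000

samples = [random.gauss(0.0, 1.0) if random.random() < k
           else random.uniform(0.0, lam)
           for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

expected_mean = lam ** 2 / (2.0 * (1.0 + lam))                    # = 2/3
expected_var = ((lam ** 4 + 4 * lam ** 3 + 12 * lam + 12)
                / (12.0 * (1.0 + lam) ** 2))                      # = 7/9
```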
In this question, you may use the result \[ \displaystyle \int_0^\infty \frac{t^m}{(t+k)^{n+2}} \; \mathrm{d}t =\frac{m!\, (n-m)!}{(n+1)! \, k^{n-m+1}}\;, \] where \(m\) and \(n\) are positive integers with \(n\ge m\,\), and where \(k>0\,\). The random variable \(V\) has density function \[ \f(x) = \frac{C \, k^{a+1} \, x^a}{(x+k)^{2a+2}} \quad \quad (0 \le x < \infty) \;, \] where \(a\) is a positive integer. Show that \(\displaystyle C = \frac{(2a+1)!}{a! \, a!}\;\). Show, by means of a suitable substitution, that \[ \int_0^v \frac{x^a}{(x+k)^{2a+2}} \; \mathrm{d}x = \int_{\frac{k^2}{v}}^\infty \frac{u^a}{(u+k)^{2a+2}} \; \mathrm{d}u \] and deduce that the median value of \(V\) is \(k\). Find the expected value of \(V\). The random variable \(V\) represents the speed of a randomly chosen gas molecule. The time taken for such a particle to travel a fixed distance \(s\) is given by the random variable \(\ds T=\frac{s}{V}\). Show that \begin{equation} \mathbb{P}( T < t) = \ds \int_{\frac{s}{t}}^\infty \frac{C \, k^{a+1} \, x^a}{(x+k)^{2a+2}}\; \mathrm{d}x \tag{\( *\)} \end{equation} and hence find the density function of \(T\). You may find it helpful to make the substitution \(\ds u = \frac{s}{x}\) in the integral \((*)\). Hence show that the product of the median time and the median speed is equal to the distance \(s\), but that the product of the expected time and the expected speed is greater than \(s\).
Solution: \begin{align*} && f(x) &= \frac{C \, k^{a+1} \, x^a}{(x+k)^{2a+2}} \\ \Rightarrow && 1 &= \int_0^{\infty} f(x) \d x \\ &&&= \int_0^{\infty} \frac{C \, k^{a+1} \, x^a}{(x+k)^{2a+2}} \d x \\ &&&= Ck^{a+1} \int_0^{\infty} \frac{x^a}{(x+k)^{2a+2} }\d x \\ &&&= Ck^{a+1} \frac{a!(2a-a)!}{(2a+1)!k^{2a-a+1}} \\ &&&= C \frac{a!a!}{(2a+1)!} \\ \Rightarrow && C &= \frac{(2a+1)!}{a!a!} \end{align*} \begin{align*} && I &= \int_0^v \frac{x^a}{(x+k)^{2a+2}} \d x\\ u = k^2/x, \d x = -k^2u^{-2} \d u: &&&= \int_{u = +\infty}^{u = k^2/v} \frac{k^{2a}u^{-a}}{(k^2u^{-1} +k)^{2a+2}}(-k^2u^{-2}) \d u \\ &&&= \int_{u = +\infty}^{u = k^2/v} \frac{k^{2a-2a-2}u^{2a+2-a}}{(k +u)^{2a+2}}(-k^2u^{-2}) \d u \\ &&&= \int_{ k^2/v}^{\infty} \frac{u^{a}}{(k +u)^{2a+2}} \d u \\ \end{align*} At the median \(M\) we have \(\int_0^M f(x) \d x = \tfrac12 = \int_M^\infty f(x) \d x\). By the identity just proved, \(\int_0^M f(x) \d x = \int_{k^2/M}^\infty f(x) \d x\), and since \(f > 0\) this forces \(k^2/M = M\), i.e. \(M = k\). \begin{align*} && \mathbb{E}(V) &= \int_0^{\infty} x f(x) \d x \\ &&&= \frac{(2a+1)!k^{a+1}}{a!a!} \int_0^{\infty} \frac{x^{a+1}}{(x+k)^{2a+2}} \d x \\ &&&= \frac{(2a+1)!k^{a+1}}{a!a!} \frac{(a+1)!(2a-(a+1))!}{(2a+1)!k^{2a-(a+1)+1}}\\ &&&= \frac{k^{a+1}}{a!} \frac{(a+1)(a-1)!}{k^{a}} \\ &&&= \frac{k(a+1)}{a} = \frac{a+1}a k \end{align*} \begin{align*} && \mathbb{P}(T < t) &= \mathbb{P}(\frac{s}{V} < t) \\ &&&= \mathbb{P}(V > \frac{s}{t}) \\ &&&= \int_{s/t}^{\infty} f(x) \d x \\ &&&= \int_{s/t}^{\infty} \frac{C \, k^{a+1} \, x^a}{(x+k)^{2a+2}} \d x \\ \\ \Rightarrow && f_T(t) &= \frac{\d}{\d t} \left ( \mathbb{P}(T < t)\right) \\ &&&= \frac{\d}{\d t} \left ( \int_{s/t}^{\infty} \frac{C \, k^{a+1} \, x^a}{(x+k)^{2a+2}} \d x \right) \\ &&&= - \frac{C \, k^{a+1} \, \left ( \frac{s}{t} \right)^a}{(\frac{s}{t}+k)^{2a+2}} \cdot \left (-\frac{s}{t^2} \right) \\ &&&= \frac{Ck^{a+1}s^{a+1}t^{2a+2}}{t^{a+2}(s+kt)^{2a+2}} \\ &&&= \frac{C(ks)^{a+1}t^a}{(s+kt)^{2a+2}} \\ &&&= \frac{C(\frac{s}{k})^{a+1}t^a}{(\frac{s}{k}+t)^{2a+2}} \end{align*} Therefore \(T\) follows the same distribution, but with parameter \(s/k\) rather than \(k\).
In particular it has median \(s/k\) (and the product of the medians is \(s\)). However, the product of the expected time and the expected speed is \(\frac{a+1}{a} k \cdot \frac{a+1}{a} \cdot \frac{s}{k} = \left ( \frac{a+1}{a} \right)^2 s > s\).
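A deterministic numerical check (composite Simpson's rule) with the arbitrary illustrative choices \(a=1\), \(k=2\): the density should integrate to 1, have half its mass below \(k\), and have mean \(\frac{a+1}{a}k = 4\).

```python
import math

# Numerical check (composite Simpson) of the normalisation, the median
# F(k) = 1/2, and E(V) = (a+1)k/a, for the arbitrary choices a = 1, k = 2.
a_par = 1
k = 2.0
C = math.factorial(2 * a_par + 1) / (math.factorial(a_par) ** 2)  # = 6

def f(x):
    return C * k ** (a_par + 1) * x ** a_par / (x + k) ** (2 * a_par + 2)

def simpson(g, lo, hi, n):  # composite Simpson's rule, n even
    h = (hi - lo) / n
    s = g(lo) + g(hi)
    s += 4 * sum(g(lo + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(g(lo + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

total = simpson(f, 0.0, 2000.0, 200_000)                  # ~ 1 (tail is tiny)
median_mass = simpson(f, 0.0, k, 2_000)                   # ~ 1/2
mean = simpson(lambda x: x * f(x), 0.0, 2000.0, 200_000)  # ~ (a+1)k/a = 4
```

The truncation at 2000 loses only the integrable tail, which is why the tolerance on the mean is looser than on the other two checks.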
Sketch the graph, for \(x \ge 0\,\), of $$ y = kx\e^{-ax^2} \;, $$ where \(a\) and \(k\) are positive constants. The random variable \(X\) has probability density function \(\f(x)\) given by \begin{equation*} \f(x)= \begin{cases} kx\e^{-ax^2} & \text{for \(0 \le x \le 1\)}\\[3pt] 0 & \text{otherwise}. \end{cases} \end{equation*} Show that \(\displaystyle k=\frac{2a}{1-\e^{-a}}\) and find the mode \(m\) in terms of \(a\,\), distinguishing between the cases \(a < \frac12\) and \(a > \frac12\,\). Find the median \(h\) in terms of \(a\), and show that \(h > m\) if \(a > -\ln\left(2\e^{-1/2} - 1\right).\) Show that, \(-\ln\left(2\e^{-1/2}-1\right)> \frac12 \,\). Show also that, if \(a > -\ln\left(2\e^{-1/2} - 1\right) \,\), then $$ P(X > m \;\vert\; X < h) = {{2\e^{-1/2}-\e^{-a}-1} \over 1-\e^{-a}}\;. $$
Solution:
Brief interruptions to my work occur on average every ten minutes and the number of interruptions in any given time period has a Poisson distribution. Given that an interruption has just occurred, find the probability that I will have less than \(t\) minutes to work before the next interruption. If the random variable \(T\) is the time I have to work before the next interruption, find the probability density function of \(T\,\). I need an uninterrupted half hour to finish an important paper. Show that the expected number of interruptions before my first uninterrupted period of half an hour or more is \(\e^3-1\). Find also the expected length of time between interruptions that are less than half an hour apart. Hence write down the expected wait before my first uninterrupted period of half an hour or more.
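The claimed expectation \(e^3-1\) can be checked by simulation: gaps between interruptions are independent exponentials with mean 10 minutes, so a gap of at least 30 minutes occurs with probability \(e^{-3}\), and the number of shorter gaps before the first such gap is geometric with mean \((1-e^{-3})/e^{-3} = e^3-1\). A minimal Monte Carlo sketch:

```python
import math
import random

# Monte Carlo check that the expected number of interruptions before the
# first gap of >= 30 minutes is e^3 - 1.  Gaps between interruptions are
# Exponential with mean 10 minutes, so P(gap >= 30) = e^{-3}.
random.seed(2)
rate = 1.0 / 10.0      # interruptions per minute
trials = 200_000

counts = []
for _ in range(trials):
    interruptions = 0
    while random.expovariate(rate) < 30.0:
        interruptions += 1
    counts.append(interruptions)

avg = sum(counts) / trials
expected = math.exp(3.0) - 1.0   # ~ 19.09
```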
A random variable \(X\) is distributed uniformly on \([\, 0\, , \, a\,]\). Show that the variance of \(X\) is \({1 \over 12} a^2\). A sample, \(X_1\) and \(X_2\), of two independent values of the random variable is drawn, and the variance \(V\) of the sample is determined. Show that \(V = {1 \over 4} \l X_1 -X_2 \r ^2\), and hence prove that \(2 V\) is an unbiased estimator of the variance of X. Find an exact expression for the probability that the value of \(V\) is less than \({1 \over 12} a^2\) and estimate the value of this probability correct to one significant figure.
Solution: \begin{align*} && \E[X] &= \frac{a}{2}\tag{by symmetry} \\ &&\E[X^2] &= \int_0^a \frac{1}{a} x^2 \d x \\ &&&= \frac{a^3}{3a} = \frac{a^2}{3} \\ \Rightarrow && \var[X] &= \frac{a^2}{3} - \frac{a^2}{4} = \frac{a^2}{12} \\ \end{align*} \begin{align*} && V &=\frac{1}{2} \left ( \left ( X_1 - \frac{X_1+X_2}{2} \right )^2+\left ( X_2- \frac{X_1+X_2}{2} \right )^2 \right ) \\ &&&= \frac{1}{8} ((X_1 - X_2)^2 + (X_2 - X_1)^2 ) \\ &&&= \frac14 (X_1-X_2)^2 \\ \\ && \E[2V] &= \E \left [ \frac12 (X_1 - X_2)^2 \right] \\ &&&= \frac12 \E[X_1^2] - \E[X_1X_2] + \frac12 \E[X_2^2] \\ &&&= \frac{a^2}{3} - \frac{a^2}{4} = \frac{a^2}{12} \end{align*} Therefore \(2V\) is an unbiased estimator of the variance of \(X\). For the final part, \(V < \frac{a^2}{12}\) exactly when \((X_1-X_2)^2 < \frac{a^2}{3}\), i.e. \(|X_1 - X_2| < \frac{a}{\sqrt3}\). For independent uniforms on \([0,a]\), the region \(|x_1 - x_2| < d\) of the square \([0,a]^2\) has area \(a^2 - (a-d)^2\), so \[ \mathbb{P}\left(V < \tfrac{a^2}{12}\right) = 1 - \left(1 - \tfrac{1}{\sqrt3}\right)^2 = \frac{2\sqrt3 - 1}{3} \approx 0.8\,. \]
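A Monte Carlo sanity check of the unbiasedness of \(2V\) and of the probability \(\mathbb{P}(V < \tfrac{a^2}{12})\) requested in the question, with the arbitrary choice \(a=1\):

```python
import random

# Monte Carlo check that E[2V] = a^2/12 and an estimate of
# P(V < a^2/12) = P(|X1 - X2| < a/sqrt(3)), with a = 1.
random.seed(3)
a = 1.0
n = 200_000

vs = []
for _ in range(n):
    x1, x2 = random.uniform(0.0, a), random.uniform(0.0, a)
    vs.append((x1 - x2) ** 2 / 4.0)   # V = (X1 - X2)^2 / 4

mean_2v = 2.0 * sum(vs) / n                     # ~ a^2/12 ~ 0.0833
p_less = sum(v < a * a / 12.0 for v in vs) / n  # ~ (2*sqrt(3)-1)/3 ~ 0.82
```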
A stick is broken at a point, chosen at random, along its length. Find the probability that the ratio, \(R\), of the length of the shorter piece to the length of the longer piece is less than \(r\). Find the probability density function for \(R\), and calculate the mean and variance of \(R\).
Solution: Take the stick to have unit length, and let \(X \sim U[0, \tfrac12]\) be the length of the shorter piece, so \(R = \frac{X}{1-X}\), and \begin{align*} && \mathbb{P}(R \leq r) &= \mathbb{P}(\tfrac{X}{1-X} \leq r) \\ &&&= \mathbb{P}(X \leq r - rX) \\ &&&= \mathbb{P}((1+r)X \leq r) \\ &&&= \mathbb{P}(X \leq \tfrac{r}{1+r} ) \\ &&&= \begin{cases} 0 & r < 0 \\ \frac{2r}{1+r} & 0 \leq r \leq 1 \\ 1 & r > 1 \end{cases} \\ \\ && f_R(r) &= \begin{cases} \frac{2}{(1+r)^2} & 0 \leq r \leq 1 \\ 0 & \text{otherwise} \end{cases} \end{align*} Let \(Y \sim U[\tfrac12, 1]\) be the length of the longer piece, so \(R = \frac{1-Y}{Y} = Y^{-1} - 1\) and \begin{align*} \E[R] &= \int_{\frac12}^1 (y^{-1}-1) 2 \d y \\ &= 2\left [\ln y - y \right]_{\frac12}^1 \\ &= -2 + 2\ln 2 + 1 \\ &= 2\ln2 -1 \\ \\ \E[R^2] &= \int_{\frac12}^1 (y^{-1}-1)^2 2 \d y\\ &= 2\left [-y^{-1} -2\ln y + y \right]_{\frac12}^1 \\ &= 2 \left ( \tfrac32 - 2\ln 2\right) \\ &= 3-4\ln 2 \\ \var[R] &= 3 - 4 \ln 2 -(2\ln 2-1)^2 \\ &= 2 - 4(\ln 2)^2 \end{align*}
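The stick-breaking setup is easy to simulate directly, which gives a sanity check on both moments (assuming, as above, a unit-length stick with a uniformly chosen break point):

```python
import math
import random

# Monte Carlo check of E(R) = 2 ln 2 - 1 and Var(R) = 2 - 4 (ln 2)^2
# for the stick-breaking ratio R = shorter / longer (unit-length stick).
random.seed(4)
n = 200_000

ratios = []
for _ in range(n):
    u = random.random()                 # break point on a unit stick
    shorter, longer = min(u, 1.0 - u), max(u, 1.0 - u)
    ratios.append(shorter / longer)

mean = sum(ratios) / n
var = sum((r - mean) ** 2 for r in ratios) / n

expected_mean = 2.0 * math.log(2.0) - 1.0        # ~ 0.386
expected_var = 2.0 - 4.0 * math.log(2.0) ** 2    # ~ 0.078
```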
A random variable \(X\) has the probability density function \[ \mathrm{f}(x)=\begin{cases} \lambda\mathrm{e}^{-\lambda x} & x\geqslant0,\\ 0 & x<0. \end{cases} \] Show that $${\rm P}(X>s+t\,\vert X>t) = {\rm P}(X>s).$$ The time it takes an assistant to serve a customer in a certain shop is a random variable with the above distribution and the times for different customers are independent. If, when I enter the shop, the only two assistants are serving one customer each, what is the probability that these customers are both still being served at time \(t\) after I arrive? One of the assistants finishes serving his customer and immediately starts serving me. What is the probability that I am still being served when the other customer has finished being served?
Solution: \begin{align*} && \mathbb{P}(X > t) &= \int_t^{\infty} \lambda e^{-\lambda x} \d x\\ &&&= \left[ -e^{-\lambda x} \right]_t^\infty \\ &&&= e^{-\lambda t}\\ \\ && \mathbb{P}(X > s + t | X > t) &= \frac{\mathbb{P}(X > s + t)}{\mathbb{P}(X > t)} \\ &&&= \frac{e^{-(s+t)\lambda}}{e^{-t\lambda}} \\ &&&= e^{-s\lambda} = \mathbb{P}(X > s) \end{align*} (the first equality uses \(\{X > s+t\} \subseteq \{X > t\}\)). By independence, the probability that both customers are still being served at time \(t\) is \(\mathbb{P}(X > t)^2 = e^{-2\lambda t}\). For the final part, the probability is exactly \(\frac12\): the first part shows the distribution is memoryless, so at the moment I start being served, my service time and the other customer's remaining service time are independent samples from the same distribution, and by symmetry each of us is equally likely to finish first.
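The symmetry argument can be checked by simulating the scenario end to end (a sketch, with an arbitrary service rate \(\lambda = 1\)): draw the two customers' service times, start my service when the first finishes, and see whether I am still being served when the second finishes.

```python
import random

# Monte Carlo check of the final answer: my service starts when the first
# of the two current customers finishes; by the memoryless property I am
# still being served when the other finishes with probability exactly 1/2.
random.seed(5)
lam = 1.0          # arbitrary service rate
n = 200_000

still = 0
for _ in range(n):
    s1 = random.expovariate(lam)    # remaining times of the two customers
    s2 = random.expovariate(lam)
    mine = random.expovariate(lam)  # my full service time
    # I am served from time min(s1, s2); I am still being served when the
    # other customer finishes iff min(s1, s2) + mine > max(s1, s2).
    if min(s1, s2) + mine > max(s1, s2):
        still += 1

p_hat = still / n                   # ~ 0.5
```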