Let \(X\) be a random variable with mean \(\mu\) and standard deviation \(\sigma\). Chebyshev's inequality, which you may use without proof, is \[ \P\left(\vert X-\mu\vert > k\sigma\right) \le \frac 1 {k^2} \,, \] where \(k\) is any positive number.
Solution:
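No solution is recorded for this problem. As a quick empirical sanity check of the stated inequality itself, the sketch below (an illustration only, using a standard normal sample, which is an assumption not made in the problem) compares tail frequencies with the Chebyshev bound \(1/k^2\):

```python
import random
import statistics

# Empirical sanity check of Chebyshev's bound P(|X - mu| > k*sigma) <= 1/k^2.
# A standard normal sample is used purely for illustration.
random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
mu = statistics.fmean(xs)
sigma = statistics.pstdev(xs)

tail_freqs = {}
for k in (1.5, 2.0, 3.0):
    tail_freqs[k] = sum(abs(x - mu) > k * sigma for x in xs) / len(xs)

for k, freq in tail_freqs.items():
    assert freq <= 1 / k**2, (k, freq)  # the Chebyshev bound holds
```

For a normal variate the bound is very loose (the true tail probability at \(k=2\) is about 0.05 against the bound 0.25), which is consistent with Chebyshev holding for *any* distribution with finite variance.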
Let \(x_1\), \(x_2\), \ldots, \(x_n\) and \(x_{n+1}\) be any fixed real numbers. The numbers \(A\) and \(B\) are defined by \[ A = \frac 1 n \sum_{k=1}^n x_k \,, \ \ \ B= \frac 1 n \sum_{k=1}^n (x_k-A)^2 \,, \] and the numbers \(C\) and \(D\) are defined by \[ C = \frac 1 {n+1} \sum_{k=1}^{n+1} x_k \,, \ \ \ D = \frac1{n+1} \sum_{k=1}^{n+1} (x_k-C)^2 \,. \]
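No solution is recorded for this problem. The sketch below computes \(A\), \(B\), \(C\), \(D\) directly from the definitions and numerically verifies the standard one-pass update formulas relating them (these update formulas are an assumption to be checked, not part of the problem text):

```python
import random

# A, B, C, D computed directly from the definitions, plus a numerical check
# of the standard mean/variance update ("Welford"-style) identities:
#   C = (n*A + x_{n+1}) / (n+1)
#   (n+1)*D = n*B + n*(x_{n+1} - A)^2 / (n+1)
random.seed(1)
n = 10
xs = [random.uniform(-5.0, 5.0) for _ in range(n + 1)]

A = sum(xs[:n]) / n
B = sum((x - A) ** 2 for x in xs[:n]) / n
C = sum(xs) / (n + 1)
D = sum((x - C) ** 2 for x in xs) / (n + 1)

C_update = (n * A + xs[n]) / (n + 1)
D_update = (n * B + n * (xs[n] - A) ** 2 / (n + 1)) / (n + 1)

assert abs(C - C_update) < 1e-12
assert abs(D - D_update) < 1e-12
```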
The probability density function \(f(x)\) of the random variable \(X\) is given by $$f(x) = k\left[\phi(x) + \lambda g(x)\right]$$ where \(\phi(x)\) is the probability density function of a normal variate with mean 0 and variance 1, \(\lambda\) is a positive constant, and \(g(x)\) is a probability density function defined by \[ g(x)= \begin{cases} 1/\lambda & \mbox{for \(0 \le x \le \lambda\)}\,;\\ 0& \mbox{otherwise} . \end{cases} \] Find \(\mu\), the mean of \(X\), in terms of \(\lambda\), and prove that \(\sigma\), the standard deviation of \(X\), satisfies $$\sigma^2 = \frac{\lambda^4 +4\lambda^3+12\lambda+12} {12(1 + \lambda )^2}\;.$$ In the case \(\lambda=2\):
Solution: \begin{align*} && 1 &= \int_{-\infty}^{\infty} f(x) \d x \\ &&&= k[1 + \lambda] \\ \Rightarrow && k &= \frac{1}{1+\lambda} \\ \\ && \mu &= \int_{-\infty}^\infty x f(x) \d x \\ &&&= k \int_{-\infty}^\infty x \phi(x) \d x + k \lambda \int_{-\infty}^{\infty} x g(x) \d x \\ &&&= k \cdot 0 + k \lambda \cdot \frac{\lambda}{2} \\ &&&= \frac{\lambda^2}{2(1+\lambda)} \\ \\ && \E[X^2] &= \int_{-\infty}^\infty x^2 f(x) \d x \\ &&&= k \int_{-\infty}^\infty x^2 \phi(x) \d x + k \lambda \int_{-\infty}^{\infty} x^2 g(x) \d x \\ &&&= k \cdot 1 + k \lambda \int_0^{\lambda} \frac{x^2}{\lambda} \d x \\ &&&= k + \frac{k \lambda^3}{3} \\ &&&= \frac{3+\lambda^3}{3(1+\lambda)} \\ \\ && \var[X] &= \frac{3+\lambda^3}{3(1+\lambda)} - \frac{\lambda^4}{4(1+\lambda)^2} \\ &&& = \frac{4(3+\lambda^3)(1+\lambda) - 3\lambda^4}{12(1+\lambda)^2} \\ &&&= \frac{\lambda^4+4\lambda^3+12\lambda + 12}{12(1+\lambda)^2} \end{align*}
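The derived mean and variance can be checked by Monte Carlo: since \(k = 1/(1+\lambda)\), the density \(f\) is a mixture drawing from \(N(0,1)\) with probability \(1/(1+\lambda)\) and from \(U(0,\lambda)\) with probability \(\lambda/(1+\lambda)\). A sketch in the case \(\lambda = 2\) (where \(\mu = 2/3\) and \(\sigma^2 = 7/9\)):

```python
import random
import statistics

# Monte Carlo check of the mean and variance of the mixture density
# f(x) = k[phi(x) + lambda*g(x)], k = 1/(1+lambda), in the case lambda = 2.
random.seed(2)
lam = 2.0
samples = []
for _ in range(200_000):
    # Draw from N(0,1) with probability 1/(1+lam), else from U(0, lam).
    if random.random() < 1.0 / (1.0 + lam):
        samples.append(random.gauss(0.0, 1.0))
    else:
        samples.append(random.uniform(0.0, lam))

mu_hat = statistics.fmean(samples)
var_hat = statistics.pvariance(samples)

mu_exact = lam**2 / (2 * (1 + lam))                                   # 2/3
var_exact = (lam**4 + 4 * lam**3 + 12 * lam + 12) / (12 * (1 + lam)**2)  # 7/9

assert abs(mu_hat - mu_exact) < 0.02
assert abs(var_hat - var_exact) < 0.02
```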
The random variable \(X\) takes the values \(k=1\), \(2\), \(3\), \(\dotsc\), and has probability distribution $$ \P(X=k)= A{{{\lambda}^k\e^{-{\lambda}}} \over {k!}}\,, $$ where \(\lambda \) is a positive constant. Show that \(A = (1-\e^{-\lambda})^{-1}\,\). Find the mean \({\mu}\) in terms of \({\lambda}\) and show that $$ \var(X) = {\mu}(1-{\mu}+{\lambda})\;. $$ Deduce that \({\lambda} < {\mu} < 1+{\lambda}\,\). Use a normal approximation to find the value of \(P(X={\lambda})\) in the case where \({\lambda}=100\,\), giving your answer to 2 decimal places.
Solution: Let \(Y \sim \mathrm{Po}(\lambda)\). \begin{align*} && 1 &= \sum_{k=1}^\infty \mathbb{P}(X = k ) \\ &&&= \sum_{k=1}^\infty A \frac{\lambda^k e^{-\lambda}}{k!}\\ &&&= Ae^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^k}{k!} \\ &&&= Ae^{-\lambda} \left (e^{\lambda}-1 \right) \\ \Rightarrow && A &= (1-e^{-\lambda})^{-1} \\ \\ && \E[X] &= \sum_{k=1}^{\infty} k \cdot \mathbb{P}(X=k) \\ &&&= A\sum_{k=1}^{\infty} k \frac{\lambda^k e^{-\lambda}}{k!} \\ &&&= A\E[Y] = A\lambda = \lambda(1-e^{-\lambda})^{-1} \\ \\ && \var[X] &= \E[X^2] - (\E[X])^2 \\ &&&= A\sum_{k=1}^{\infty} k^2 \frac{\lambda^k e^{-\lambda}}{k!} - \mu^2 \\ &&&= A\E[Y^2] - \mu^2 \\ &&&= A(\var[Y]+\lambda^2) - \mu^2 \\ &&&= A(\lambda + \lambda^2) - \mu^2 \\ &&&= A\lambda(1+\lambda) - \mu^2 \\ &&&= \mu(1+\lambda) - \mu^2 \\ &&&= \mu(1+\lambda - \mu) \end{align*} Since \(A > 1\) we must have \(\mu = A\lambda > \lambda\), and since \(\var[X] > 0\) we must have \(1 + \lambda > \mu\), as required. If \(\lambda = 100\), then \(A \approx 1\), so \(P(X=\lambda) \approx P(Y = \lambda)\), and \(Y\) is approximately \(N(\lambda, \lambda)\); with a continuity correction the value is approximately \(\displaystyle \int_{-\frac12}^{\frac12} \frac{1}{\sqrt{2\pi \lambda}} e^{-\frac{x^2}{2\lambda}} \d x \approx \frac{1}{\sqrt{200\pi}} = \frac{1}{\sqrt{628.3\ldots}} \approx 0.04 \,.\)
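The final approximation can be checked numerically: the exact truncated-Poisson probability at \(k = \lambda = 100\) and the normal estimate \(1/\sqrt{200\pi}\) should agree to 2 decimal places.

```python
import math

# Numerical check of the normal approximation for lambda = 100:
# exact P(X = lambda) = A * e^{-lambda} * lambda^lambda / lambda!
# versus the estimate 1/sqrt(2*pi*lambda).
lam = 100
A = 1 / (1 - math.exp(-lam))  # indistinguishable from 1 at this lambda
# log of the Poisson pmf at k = lam, computed stably via lgamma
log_pmf = -lam + lam * math.log(lam) - math.lgamma(lam + 1)
exact = A * math.exp(log_pmf)
normal_approx = 1 / math.sqrt(2 * math.pi * lam)

assert round(exact, 2) == 0.04
assert round(normal_approx, 2) == 0.04
```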
A stick is broken at a point, chosen at random, along its length. Find the probability that the ratio, \(R\), of the length of the shorter piece to the length of the longer piece is less than \(r\). Find the probability density function for \(R\), and calculate the mean and variance of \(R\).
Solution: Let \(X \sim U[0, \tfrac12]\) be the length of the shorter piece, so \(R = \frac{X}{1-X}\), and \begin{align*} && \mathbb{P}(R \leq r) &= \mathbb{P}(\tfrac{X}{1-X} \leq r) \\ &&&= \mathbb{P}(X \leq r - rX) \\ &&&= \mathbb{P}((1+r)X \leq r) \\ &&&= \mathbb{P}(X \leq \tfrac{r}{1+r} ) \\ &&&= \begin{cases} 0 & r < 0 \\ \frac{2r}{1+r} & 0 \leq r \leq 1 \\ 1 & r > 1 \end{cases} \\ \\ && f_R(r) &= \begin{cases} \frac{2}{(1+r)^2} & 0 \leq r \leq 1 \\ 0 & \text{otherwise} \end{cases} \end{align*} Let \(Y \sim U[\tfrac12, 1]\) be the length of the longer piece, then \(R = \frac{1-Y}{Y} = Y^{-1} - 1\) and \begin{align*} \E[R] &= \int_{\frac12}^1 (y^{-1}-1) \cdot 2 \d y \\ &= 2\left [\ln y - y \right]_{\frac12}^1 \\ &= 2\left(-1 + \ln 2 + \tfrac12\right) \\ &= 2\ln2 -1 \\ \\ \E[R^2] &= \int_{\frac12}^1 (y^{-1}-1)^2 \cdot 2 \d y\\ &= 2\left [-y^{-1} -2\ln y + y \right]_{\frac12}^1 \\ &= 2 \left ( \frac32 - 2\ln 2\right) \\ &= 3-4\ln 2 \\ \\ \var[R] &= 3 - 4 \ln 2 -(2\ln 2-1)^2 \\ &= 2 - 4(\ln 2)^2 \end{align*}
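A simulation sketch confirming \(\E[R] = 2\ln 2 - 1\) and \(\var[R] = 2 - 4(\ln 2)^2\): break the stick at a uniform point and take the ratio of the shorter to the longer piece.

```python
import math
import random
import statistics

# Monte Carlo check of the stick-breaking ratio R = shorter / longer,
# with the break point uniform on a unit stick.
random.seed(3)
rs = []
for _ in range(200_000):
    u = random.random()
    rs.append(min(u, 1 - u) / max(u, 1 - u))

mean_hat = statistics.fmean(rs)
var_hat = statistics.pvariance(rs)

assert abs(mean_hat - (2 * math.log(2) - 1)) < 0.01      # ~0.3863
assert abs(var_hat - (2 - 4 * math.log(2) ** 2)) < 0.01  # ~0.0782
```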
Each of my \(n\) students has to hand in an essay to me. Let \(T_{i}\) be the time at which the \(i\)th essay is handed in and suppose that \(T_{1},T_{2},\ldots,T_{n}\) are independent, each with probability density function \(\lambda\mathrm{e}^{-\lambda t}\) (\(t\geqslant0\)). Let \(T\) be the time I receive the first essay to be handed in and let \(U\) be the time I receive the last one.
Solution:
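No solution is recorded, and the question text appears truncated after defining \(T\) and \(U\). As a sketch, the simulation below checks two standard facts about \(T = \min_i T_i\) and \(U = \max_i T_i\) for i.i.d. \(\mathrm{Exp}(\lambda)\) variables (these facts are assumptions supplied here, not taken from the text): \(T \sim \mathrm{Exp}(n\lambda)\), so \(\E[T] = 1/(n\lambda)\), and \(\E[U] = \frac{1}{\lambda}\left(1 + \frac12 + \cdots + \frac1n\right)\).

```python
import random
import statistics

# Simulate T = min and U = max of n i.i.d. Exp(lambda) hand-in times and
# compare sample means with E[T] = 1/(n*lambda) and E[U] = H_n / lambda.
random.seed(4)
n, lam = 5, 2.0
t_vals, u_vals = [], []
for _ in range(200_000):
    ts = [random.expovariate(lam) for _ in range(n)]
    t_vals.append(min(ts))
    u_vals.append(max(ts))

h_n = sum(1 / k for k in range(1, n + 1))  # harmonic number H_n
assert abs(statistics.fmean(t_vals) - 1 / (n * lam)) < 0.005
assert abs(statistics.fmean(u_vals) - h_n / lam) < 0.01
```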
A fair coin is thrown \(n\) times. On each throw, 1 point is scored for a head and 1 point is lost for a tail. Let \(S_{n}\) be the points total for the series of \(n\) throws, i.e. \(S_{n}=X_{1}+X_{2}+\cdots+X_{n},\) where \[ X_{j}=\begin{cases} 1 & \text{ if the }j \text{ th throw is a head}\\ -1 & \text{ if the }j\text{ th throw is a tail.} \end{cases} \]
Solution: Notice that \(\mathbb{E}(X_i) = 0\) and \(\mathbb{E}(X_i^2) = 1\), so by linearity and the independence of the throws, \(\mathbb{E}(S_n) = 0\) and \(\textrm{Var}(S_n) = n\).
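These two values can be verified exactly (not just by sampling) by enumerating all \(2^n\) equally likely head/tail sequences for a small \(n\):

```python
from itertools import product

# Exact check, by enumeration over all 2^n equally likely sequences of
# +1 (head) / -1 (tail), that E(S_n) = 0 and Var(S_n) = n. Here n = 8.
n = 8
totals = [sum(seq) for seq in product((1, -1), repeat=n)]
mean = sum(totals) / len(totals)
var = sum(s**2 for s in totals) / len(totals) - mean**2

assert mean == 0
assert var == n
```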
A point \(P\) is chosen at random (with uniform distribution) on the circle \(x^{2}+y^{2}=1\). The random variable \(X\) denotes the distance of \(P\) from \((1,0)\). Find the mean and variance of \(X\). Find also the probability that \(X\) is greater than its mean.
Solution: Consider the angle from the origin, then \(P = (\cos \theta, \sin \theta)\) where \(\theta \sim U(0, 2\pi)\), and \(X = \sqrt{(\cos \theta - 1)^2 + \sin^2 \theta}\) \begin{align*} \mathbb{E}[X] &= \int_0^{2\pi} \sqrt{(\cos \theta - 1)^2 + \sin^2 \theta} \frac1{2\pi} \d \theta \\ &= \frac1{2\pi}\int_0^{2\pi} \sqrt{2 - 2\cos \theta} \d \theta \\ &= \frac{1}{2\pi}\int_0^{2\pi} \sqrt{4\sin^2 \frac{\theta}{2}} \d \theta \\ &= \frac{1}{\pi}\int_0^{2\pi} \left |\sin \frac{\theta}{2} \right| \d \theta \\ &= \frac{1}{\pi} \left [ -2\cos \frac{\theta}{2} \right]_0^{2\pi} \\ &= \frac1{\pi} \left(2 + 2\right) \\ &= \frac{4}{\pi} \end{align*} \begin{align*} \mathbb{E}(X^2) &= \frac1{2\pi}\int_0^{2\pi} (\cos \theta - 1)^2 + \sin^2 \theta \d \theta \\ &= \frac1{2\pi}\int_0^{2\pi} 2 - 2 \cos \theta \d \theta \\ &= \frac{4\pi}{2\pi} \\ &= 2 \end{align*} \(\Rightarrow\) \(\mathrm{Var}(X) = \mathbb{E}(X^2) - \mathbb{E}(X)^2 = 2 - \frac{16}{\pi^2} = \frac{2\pi^2 - 16}{\pi^2}\). Finally, for \(0 < \theta < 2\pi\) we have \(X = 2\sin\frac{\theta}{2}\), which exceeds \(\frac4\pi\) exactly when \(\frac{\theta}{2} \in \left(\arcsin\frac2\pi,\ \pi - \arcsin\frac2\pi\right)\), so \[ \mathbb{P}\left(X > \tfrac4\pi\right) = 1 - \frac{2}{\pi}\arcsin\frac{2}{\pi} \approx 0.56 \,. \]
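A simulation sketch confirming all three answers, \(\E[X] = 4/\pi\), \(\mathrm{Var}(X) = 2 - 16/\pi^2\), and \(\mathbb{P}(X > 4/\pi) = 1 - \frac2\pi\arcsin\frac2\pi\):

```python
import math
import random
import statistics

# Monte Carlo check: P = (cos t, sin t) with t uniform on (0, 2*pi);
# X = distance from P to (1, 0).
random.seed(5)
xs = []
for _ in range(200_000):
    t = random.uniform(0.0, 2 * math.pi)
    xs.append(math.hypot(math.cos(t) - 1.0, math.sin(t)))

mean_hat = statistics.fmean(xs)
var_hat = statistics.pvariance(xs)
p_hat = sum(x > 4 / math.pi for x in xs) / len(xs)

assert abs(mean_hat - 4 / math.pi) < 0.01                 # ~1.2732
assert abs(var_hat - (2 - 16 / math.pi**2)) < 0.01        # ~0.3789
assert abs(p_hat - (1 - (2 / math.pi) * math.asin(2 / math.pi))) < 0.01  # ~0.56
```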