In this question, you may use without proof the results \[ \sum_{i=1}^{n} i^2 = \tfrac{1}{6}n(n+1)(2n+1) \quad \text{and} \quad \sum_{i=1}^{n} i^3 = \tfrac{1}{4}n^2(n+1)^2. \] Throughout the question, \(n\) and \(k\) are integers with \(n \geqslant 3\) and \(k \geqslant 2\).
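The two quoted identities are not proved here (the question allows them without proof), but they are easy to sanity-check numerically; a quick sketch:

```python
# Numerical sanity check of the two quoted summation identities:
# sum i^2 = n(n+1)(2n+1)/6 and sum i^3 = n^2(n+1)^2/4.
def sum_sq(n):
    return sum(i**2 for i in range(1, n + 1))

def sum_cube(n):
    return sum(i**3 for i in range(1, n + 1))

ok = all(
    sum_sq(n) == n * (n + 1) * (2 * n + 1) // 6
    and sum_cube(n) == n**2 * (n + 1)**2 // 4
    for n in range(3, 50)
)
```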
Solution:
The continuous random variable \(X\) has probability density function \[ f(x) = \begin{cases} \lambda e^{-\lambda x} & \text{for } x \geqslant 0, \\ 0 & \text{otherwise,} \end{cases} \] where \(\lambda\) is a positive constant. The random variable \(Y\) is the greatest integer less than or equal to \(X\), and \(Z = X - Y\).
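No solution is recorded for this question. One fact a solution would be expected to use is that \(Y\) is geometric on \(\{0,1,2,\dots\}\): \(\P(Y=k)=\int_k^{k+1}\lambda e^{-\lambda x}\,\mathrm{d}x = e^{-\lambda k}(1-e^{-\lambda})\). A quick numerical check of that identity (the value \(\lambda = 0.7\) is an arbitrary choice, not from the question):

```python
import math

lam = 0.7  # arbitrary positive rate for the check

# P(Y = k) = P(k <= X < k+1), computed two ways: by integrating the
# exponential density over [k, k+1), and via the geometric closed form.
p_interval = [math.exp(-lam * k) - math.exp(-lam * (k + 1)) for k in range(10)]
p_geometric = [math.exp(-lam * k) * (1 - math.exp(-lam)) for k in range(10)]
max_err = max(abs(a - b) for a, b in zip(p_interval, p_geometric))
```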
Solution:
Solution:
\(A\) and \(B\) both toss the same biased coin. The probability that the coin shows heads is \(p\), where \(0 < p < 1\), and the probability that it shows tails is \(q = 1 - p\). Let \(X\) be the number of times \(A\) tosses the coin until it shows heads. Let \(Y\) be the number of times \(B\) tosses the coin until it shows heads.
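No solution is recorded. Assuming \(A\)'s and \(B\)'s toss sequences are independent, one quantity that naturally arises in this setup is \(\P(X=Y)=\sum_{k\ge1}(pq^{k-1})^2 = p^2/(1-q^2)=p/(1+q)\); this is a side calculation of mine, not necessarily the question's target. A check of the closed form (the value of \(p\) is arbitrary):

```python
p = 0.3
q = 1 - p

# P(X = Y) = sum over k of P(X = k) P(Y = k) for independent geometric
# variables, versus the closed form p / (1 + q).
series = sum((p * q ** (k - 1)) ** 2 for k in range(1, 500))
closed_form = p / (1 + q)
```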
The discrete random variables \(X\) and \(Y\) can each take the values \(1\), \(\ldots\,\), \(n\) (where \(n\ge2\)). Their joint probability distribution is given by \[ \P(X=x, \ Y=y) = k(x+y) \,, \] where \(k\) is a constant.
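No solution is recorded, but the first step must be to fix \(k\) by normalisation: \(\sum_{x=1}^n\sum_{y=1}^n (x+y) = n^2(n+1)\), so \(k = 1/\bigl(n^2(n+1)\bigr)\). An exact check for one value of \(n\) (the choice \(n=5\) is arbitrary):

```python
from fractions import Fraction

n = 5  # arbitrary n >= 2 for the check
# Sum of (x + y) over the n-by-n grid; k is its reciprocal.
total = sum(Fraction(x + y) for x in range(1, n + 1) for y in range(1, n + 1))
k = 1 / total
```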
Solution:
An internet tester sends \(n\) e-mails simultaneously at time \(t=0\). Their arrival times at their destinations are independent random variables each having probability density function \(\lambda \e^{-\lambda t}\) (\(0\le t<\infty\), \( \lambda >0\)).
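No solution is recorded. A natural quantity for this setup is the expected arrival time of the last e-mail, \(\E[\max] = \frac1\lambda\sum_{i=1}^n\frac1i\); the inclusion–exclusion form \(\frac1\lambda\sum_{k=1}^n(-1)^{k+1}\binom nk\frac1k\) should agree with the harmonic sum. This is my own side calculation, not quoted from a solution; an exact check:

```python
from fractions import Fraction
from math import comb

n = 6  # arbitrary number of e-mails for the check

# lam * E[max of n iid Exp(lam)], computed two ways:
incl_excl = sum(Fraction((-1) ** (k + 1) * comb(n, k), k) for k in range(1, n + 1))
harmonic = sum(Fraction(1, i) for i in range(1, n + 1))
```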
Solution:
The random variable \(U\) has a Poisson distribution with parameter \(\lambda\). The random variables \(X\) and \(Y\) are defined as follows. \begin{align*} X&= \begin{cases} U & \text{ if \(U\) is 1, 3, 5, 7, \(\ldots\,\)} \\ 0 & \text{ otherwise} \end{cases} \\ Y&= \begin{cases} U & \text{ if \(U\) is 2, 4, 6, 8, \(\ldots\,\) } \\ 0 & \text{ otherwise} \end{cases} \end{align*}
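No solution is recorded. Since \(X+Y=U\), the means must satisfy \(\E(X)+\E(Y)=\lambda\); splitting the Poisson series over odd and even \(k\) gives \(\E(X)=\lambda e^{-\lambda}\cosh\lambda\) and \(\E(Y)=\lambda e^{-\lambda}\sinh\lambda\). These closed forms are my own working; a numerical check (\(\lambda = 1.3\) is arbitrary):

```python
import math

lam = 1.3  # arbitrary Poisson parameter for the check

def poisson_pmf(k):
    return math.exp(-lam) * lam**k / math.factorial(k)

EX = sum(k * poisson_pmf(k) for k in range(1, 80, 2))  # odd values of U
EY = sum(k * poisson_pmf(k) for k in range(2, 80, 2))  # even values of U
```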
Solution:
Solution:
Oxtown and Camville are connected by three roads, which are at risk of being blocked by flooding. On two of the three roads there are two sections which may be blocked. On the third road there is only one section which may be blocked. The probability that each section is blocked is \(p\). Each section is blocked independently of the other four sections. Show that the probability that Oxtown is cut off from Camville is \(p^3 \l 2-p \r^2\). I want to travel from Oxtown to Camville. I choose one of the three roads at random and find that my road is not blocked. Find the probability that I would not have reached Camville if I had chosen either of the other two roads. You should factorise your answer as fully as possible. Comment briefly on the value of this probability in the limit \(p\to1\).
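The claimed probability \(p^3\l 2-p \r^2\) can be verified by brute force over all \(2^5\) configurations of the five sections; this exact check is not part of the required argument, and the value of \(p\) is arbitrary:

```python
from fractions import Fraction
from itertools import product

p = Fraction(1, 3)  # arbitrary blocking probability for the check

cut_off = Fraction(0)
# Sections 0,1 lie on road 1; sections 2,3 on road 2; section 4 on road 3.
for state in product([0, 1], repeat=5):  # 1 = section blocked
    prob = Fraction(1)
    for s in state:
        prob *= p if s else 1 - p
    road1 = state[0] or state[1]
    road2 = state[2] or state[3]
    road3 = state[4]
    if road1 and road2 and road3:  # all three roads blocked: towns cut off
        cut_off += prob
```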
For any random variables \(X_1\) and \(X_2\), state the relationship between \(\E(aX_1+bX_2)\) and \(\E(X_1)\) and \(\E(X_2)\), where \(a\) and \(b\) are constants. If \(X_1\) and \(X_2\) are independent, state the relationship between \(\E(X_1X_2)\) and \(\E(X_1)\) and \(\E(X_2)\). An industrial process produces rectangular plates. The length and the breadth of the plates are modelled by independent random variables \(X_1\) and \(X_2\) with non-zero means \(\mu_1\) and \(\mu_2\) and non-zero standard deviations \(\sigma_1\) and \(\sigma_2\), respectively. Using the results in the paragraph above, and without quoting a formula for \(\var(aX_1+bX_2)\), find the means and standard deviations of the perimeter \(P\) and area \(A\) of the plates. Show that \(P\) and \(A\) are not independent. The random variable \(Z\) is defined by \(Z=P-\alpha A\), where \(\alpha \) is a constant. Show that \(Z\) and \(A\) are not independent if \[ \alpha \ne \dfrac{2(\mu_1^{\vphantom2} \sigma_2^2 +\mu_2^{\vphantom2}\sigma_1^2)} { \mu_1^2 \sigma_2^2 +\mu_2^2\sigma_1^2 + \sigma_1^2\sigma_2^2 } \;. \] Given that \(X_1\) and \(X_2\) can each take values 1 and 3 only, and that they each take these values with probability \(\frac 12\), show that \(Z\) and \(A\) are not independent for any value of \(\alpha\).
Solution: \(\E(aX_1+bX_2) = a \E(X_1) + b\E(X_2)\) for any \(X_1, X_2\), and \(\E(X_1X_2)=\E(X_1)\E(X_2)\) if \(X_1, X_2\) are independent. \begin{align*} && \E(P) &= \E(2(X_1+X_2)) = 2(\E[X_1]+\E[X_2]) \\ &&&= 2(\mu_1 + \mu_2) \\ && \var(P) &= \E[\left ( 2(X_1+X_2) \right)^2] - \E[2(X_1+X_2)]^2 \\ &&&= 4\E[X_1^2+2X_1X_2+X_2^2] -4(\mu_1 + \mu_2)^2 \\ &&&= 4(\mu_1^2 + \sigma_1^2 + 2\mu_1\mu_2 + \mu_2^2 + \sigma_2^2) - 4(\mu_1 + \mu_2)^2 \\ &&&= 4(\sigma_1^2+\sigma_2^2) \\ && \textrm{SD}(P) &= 2 \sqrt{\sigma_1^2+\sigma_2^2}\\ \\ && \E(A) &= \E[X_1X_2] = \E[X_1]\E[X_2] \\ &&&= \mu_1\mu_2 \\ && \var(A) &= \E[(X_1X_2)^2] - (\mu_1\mu_2)^2 \\ &&&= (\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) - (\mu_1\mu_2)^2\\ &&&= \mu_1^2 \sigma_2^2 + \mu_2^2 \sigma_1^2 + \sigma_1^2 \sigma_2^2\\ && \textrm{SD}(A) &= \sqrt{\mu_1^2 \sigma_2^2 + \mu_2^2 \sigma_1^2 + \sigma_1^2 \sigma_2^2} \end{align*} \begin{align*} \E[PA] &= \E[2(X_1+X_2)X_1X_2] \\ &= 2\E[X_1^2X_2] + 2\E[X_1X_2^2]\\ &= 2(\mu_1^2 + \sigma_1^2)\mu_2 + 2\mu_1 (\mu_2^2+\sigma_2^2)\\ &= 2(\mu_1 + \mu_2)\mu_1\mu_2 + 2(\sigma_1^2\mu_2 + \sigma_2^2\mu_1) \\ &= \E[P]\E[A] + 2(\sigma_1^2\mu_2 + \sigma_2^2\mu_1) \\ &\neq \E[P]\E[A], \end{align*} since \(\sigma_1^2\mu_2 + \sigma_2^2\mu_1 \neq 0\) (the means are positive lengths and the standard deviations are non-zero), so \(P\) and \(A\) are not independent. \begin{align*} && \E[Z] &= \E[P] - \alpha \E[A] \\ &&&= 2(\mu_1+\mu_2) - \alpha \mu_1 \mu_2 \\ \\ && \E[ZA] &= \E[PA - \alpha A^2] \\ &&&= 2(\mu_1^2 + \sigma_1^2)\mu_2 + 2\mu_1 (\mu_2^2+\sigma_2^2) - \alpha \E[A^2] \\ &&&= 2(\mu_1^2 + \sigma_1^2)\mu_2 + 2\mu_1 (\mu_2^2+\sigma_2^2) - \alpha \E[X_1^2]\E[X_2^2] \\ &&&= 2(\mu_1^2 + \sigma_1^2)\mu_2 + 2\mu_1 (\mu_2^2+\sigma_2^2) - \alpha (\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) \\ \text{if ind.} && \E[Z]\E[A] &= \E[ZA]\\ && (2(\mu_1+\mu_2) - \alpha \mu_1 \mu_2) \mu_1\mu_2 &= 2(\mu_1^2 + \sigma_1^2)\mu_2 + 2\mu_1 (\mu_2^2+\sigma_2^2) - \alpha (\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) \\ \Rightarrow && 2(\mu_1^2\mu_2+\mu_1\mu_2^2) - \alpha \mu_1^2\mu_2^2 &= 2(\mu_1^2\mu_2+\mu_1\mu_2^2) + 2\sigma_1^2\mu_2 + 2\sigma_2^2\mu_1 - \alpha (\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) \\ \Rightarrow && \alpha ((\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) - 
\mu_1^2\mu_2^2) &= 2(\sigma_1^2\mu_2 + \sigma_2^2\mu_1) \\ \Rightarrow && \alpha &= \frac{ 2(\sigma_1^2\mu_2 + \sigma_2^2\mu_1) }{\mu_1^2 \sigma_2^2 + \mu_2^2 \sigma_1^2 + \sigma_1^2 \sigma_2^2} \end{align*} Therefore \(Z\) and \(A\) are not independent if \(\alpha\) differs from this expression. For the final part, the four equally likely outcomes are: \begin{array}{c|c|c|c|c|c} \mathbb{P} & X_1 & X_2 & A & P & Z \\ \hline 0.25 & 1 & 1 & 1 & 4 & 4-\alpha \\ 0.25 & 1 & 3 & 3 & 8 & 8-3\alpha \\ 0.25 & 3 & 1 & 3 & 8 & 8-3\alpha \\ 0.25 & 3 & 3 & 9 & 12 & 12-9\alpha \\ \end{array} If \(\mathbb{P}(A = 1, Z = 4-\alpha) = \mathbb{P}(A = 1)\mathbb{P}(Z = 4-\alpha)\) then, since both \(\mathbb{P}(A = 1, Z = 4-\alpha)\) and \(\mathbb{P}(A = 1)\) equal \(\frac14\), we need \(\mathbb{P}(Z = 4-\alpha) = 1\). That would require \(4-\alpha = 8-3\alpha = 12-9\alpha\), but the first equality gives \(\alpha = 2\) and the second gives \(\alpha = \frac23\), a contradiction. Hence \(Z\) and \(A\) are not independent for any value of \(\alpha\).
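For the two-point example, each \(X_i\) has mean \(2\) and variance \(1\), so the critical value of \(\alpha\) from the formula is \(\frac{2(2+2)}{4+4+1}=\frac89\). An exact check (my own sketch, not part of the required argument) that even at this \(\alpha\), where \(\E[ZA]=\E[Z]\E[A]\), independence still fails:

```python
from fractions import Fraction

# The four equally likely outcomes (X1, X2), each with probability 1/4.
outcomes = [(x1, x2) for x1 in (1, 3) for x2 in (1, 3)]
alpha = Fraction(8, 9)  # critical value with mu1 = mu2 = 2, sigma^2 = 1

def E(f):
    return sum(f(x1, x2) * Fraction(1, 4) for x1, x2 in outcomes)

P = lambda x1, x2: 2 * (x1 + x2)              # perimeter
A = lambda x1, x2: x1 * x2                    # area
Z = lambda x1, x2: P(x1, x2) - alpha * A(x1, x2)

# At this alpha, Z and A are uncorrelated ...
uncorrelated = E(lambda a, b: Z(a, b) * A(a, b)) == E(Z) * E(A)
# ... but not independent: Z = 4 - alpha only on the outcome (1,1), so
# P(A=1, Z=4-alpha) = 1/4 while P(A=1) P(Z=4-alpha) = 1/4 * 1/4.
```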
The twins Anna and Bella share a computer and never sign their e-mails. When I e-mail them, only the twin currently online responds. The probability that it is Anna who is online is \(p\) and she answers each question I ask her truthfully with probability \(a\), independently of all her other answers, even if a question is repeated. The probability that it is Bella who is online is~\(q\), where \(q=1-p\), and she answers each question truthfully with probability \(b\), independently of all her other answers, even if a question is repeated.
A men's endurance competition has an unlimited number of rounds. In each round, a competitor has, independently, a probability \(p\) of making it through the round; otherwise, he fails the round. Once a competitor fails a round, he drops out of the competition; before he drops out, he takes part in every round. The grand prize is awarded to any competitor who makes it through a round which all the other remaining competitors fail; if all the remaining competitors fail at the same round the grand prize is not awarded. If the competition begins with three competitors, find the probability that:
Solution:
A continuous random variable is said to have an exponential distribution with parameter \(\lambda\) if its density function is \(\f(t) = \lambda \e ^{- \lambda t} \; \l 0 \le t < \infty \r\,\). If \(X_1\) and \(X_2\), which are independent random variables, have exponential distributions with parameters \(\lambda_1\) and \(\lambda_2\) respectively, find an expression for the probability that either \(X_1\) or \(X_2\) (or both) is less than \(x\). Prove that if \(X\) is the random variable whose value is the lesser of the values of \(X_1\) and \(X_2\), then \(X\) also has an exponential distribution. Route A and Route B buses run from my house to my college. The time between buses on each route has an exponential distribution and the mean time between buses is 15 minutes for Route A and 30 minutes for Route B. The timings of the buses on the two routes are independent. If I emerge from my house one day to see a Route A bus and a Route B bus just leaving the stop, show that the median wait for the next bus to my college will be approximately 7 minutes.
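The final figure is easy to check numerically: the next bus is the minimum of two independent exponential waits, hence exponential with rate \(\frac1{15}+\frac1{30}=\frac1{10}\) per minute, and the median of an exponential with rate \(\mu\) is \((\ln 2)/\mu\). A sketch:

```python
import math

# Combined rate of the pooled bus process, in buses per minute.
rate = 1 / 15 + 1 / 30
# Median of an Exp(rate) wait: (ln 2) / rate = 10 ln 2, approximately 6.93.
median_wait = math.log(2) / rate
```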
The random variable \(X\) is uniformly distributed on the interval \([-1,1]\). Find \(\E(X^2)\) and \(\var (X^2)\). A second random variable \(Y\), independent of \(X\), is also uniformly distributed on \([-1,1]\), and \(Z=Y-X\). Find \(\E(Z^2)\) and show that \(\var (Z^2) = 7 \var (X^2)\).
Solution: \(X \sim U(-1,1)\) \begin{align*} \E[X^2] &= \int_{-1}^1 \frac12 x^2 \, dx \\ &= \frac{1}{6} \left [ x^3 \right]_{-1}^1 \\ &= \frac{1}{3} \end{align*} \begin{align*} \E[X^4] &= \int_{-1}^1 \frac12 x^4 \, dx \\ &= \frac{1}{10} \left [ x^5 \right]_{-1}^1 \\ &= \frac{1}{5} \end{align*} \begin{align*} \var[X^2] &=\E[X^4] - \E[X^2]^2 \\ &= \frac{1}{5} - \frac{1}{9} \\ &= \frac{4}{45} \end{align*} \begin{align*} \E(Z^2) &= \E(Y^2 - 2XY+X^2) \\ &= \E(Y^2) - 2\E(X)\E(Y)+\E(X^2) \\ &= \frac{1}{3} - 0 + \frac{1}{3} \\ &= \frac{2}{3} \end{align*} \begin{align*} \E[Z^4] &= \E[Y^4 -4Y^3X+6Y^2X^2-4YX^3+X^4] \\ &= \E[Y^4]-4\E[Y^3]\E[X]+6\E[Y^2]\E[X^2]-4\E[Y]\E[X^3]+\E[X^4] \\ &= \frac{1}{5}+6 \frac{1}{3} \frac13 + \frac{1}{5} \\ &= \frac{2}{5} + \frac{2}{3} \\ &= \frac{16}{15} \end{align*} \begin{align*} \var(Z^2) &= \E(Z^4) - \E(Z^2)^2 \\ &= \frac{16}{15} - \frac{4}{9} \\ &= \frac{28}{45} \\ &= 7 \var(X^2) \end{align*}
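The moment bookkeeping above can be replayed exactly using \(\E[X^k] = \frac1{k+1}\) for even \(k\) (and \(0\) for odd \(k\)) when \(X\sim U(-1,1)\); a sketch:

```python
from fractions import Fraction

def moment(k):
    """E[X^k] for X ~ U(-1,1): integral of x^k / 2 over [-1, 1]."""
    return Fraction(1, k + 1) if k % 2 == 0 else Fraction(0)

var_X2 = moment(4) - moment(2) ** 2            # 1/5 - 1/9 = 4/45
EZ2 = 2 * moment(2)                            # E[(Y-X)^2]; cross term vanishes
EZ4 = 2 * moment(4) + 6 * moment(2) ** 2       # binomial expansion, odd moments 0
var_Z2 = EZ4 - EZ2 ** 2
```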
The random variable \(X\) takes only the values \(x_1\) and \(x_2\) (where \( x_1 \not= x_2 \)), and the random variable \(Y\) takes only the values \(y_1\) and \(y_2\) (where \(y_1 \not= y_2\)). Their joint distribution is given by $$ \P ( X = x_1 , Y = y_1 ) = a \ ; \ \ \P ( X = x_1 , Y = y_2 ) = q - a \ ; \ \ \P ( X = x_2 , Y = y_1 ) = p - a \ . $$ Show that if \(\E(X Y) = \E(X)\E(Y)\) then $$ (a - p q ) ( x_1 - x_2 ) ( y_1 - y_2 ) = 0 . $$ Hence show that two random variables each taking only two distinct values are independent if \(\E(X Y) = \E(X) \E(Y)\). Give a joint distribution for two random variables \(A\) and \(B\), each taking the three values \(- 1\), \(0\) and \(1\) with probability \({1 \over 3}\), which have \(\E(A B) = \E( A)\E (B)\), but which are not independent.
Solution: \begin{align*} \mathbb{P}(X = x_1) &= a + q - a = q \\ \mathbb{P}(X = x_2) &= 1 - q \\ \mathbb{P}(Y = y_1) & = a + p - a = p \\ \mathbb{P}(Y = y_2) & = 1 - p \end{align*} \begin{align*} \mathbb{E}(X)\mathbb{E}(Y) &= \l qx_1 + (1-q)x_2 \r \l p y_1 + (1-p)y_2\r \\ &= qpx_1y_1 + q(1-p)x_1y_2 + (1-q)px_2y_1 + (1-q)(1-p)x_2y_2 \\ \mathbb{E}(XY) &= ax_1y_1 + (q-a)x_1y_2 + (p-a)x_2y_1 + (1 + a - p - q)x_2y_2 \end{align*} Therefore \(\mathbb{E}(XY) - \mathbb{E}(X)\mathbb{E}(Y)\) is a polynomial of degree one in each of \(x_1, x_2, y_1, y_2\). Setting \(x_1 = x_2\) gives \begin{align*} \mathbb{E}(X)\mathbb{E}(Y) &=x_1 \l p y_1 + (1-p)y_2\r \\ \mathbb{E}(XY) &= x_1(ay_1 + (q-a)y_2 + (p-a)y_1 + (1 + a - p - q)y_2) \\ &= x_1 (py_1 + (1-p)y_2), \end{align*} so the difference vanishes and \((x_1 - x_2)\) is a factor; by symmetry \((y_1 - y_2)\) is also a factor. Comparing the coefficient of \(x_1y_1\), which is \(a - pq\), completes the factorisation: \(\mathbb{E}(XY) - \mathbb{E}(X)\mathbb{E}(Y) = (a-pq)(x_1-x_2)(y_1-y_2)\), and so \(\E(XY) = \E(X)\E(Y)\) gives \((a-pq)(x_1-x_2)(y_1-y_2) = 0\) as required. Any two random variables each taking two distinct values have a joint distribution of the above form for some \(a, p, q\), and \(X\) and \(Y\) are independent if \(\mathbb{P}(X = x_i, Y = y_j) = \mathbb{P}(X = x_i)\mathbb{P}(Y = y_j)\) for all \(i, j\). Since \(x_1 \neq x_2\) and \(y_1 \neq y_2\), \(\E(XY) = \E(X)\E(Y)\) forces \(a = pq\). But if \(a = pq\), then \(\mathbb{P}(X = x_1, Y = y_1) = \mathbb{P}(X = x_1)\mathbb{P}(Y = y_1)\), and the other three relations follow similarly, so \(X\) and \(Y\) are independent. For the final part, consider \begin{align*} \mathbb{P}(A = -1, B = 1) &= \frac{1}{6} \\ \mathbb{P}(A = -1, B = -1) &= \frac{1}{6} \\ \mathbb{P}(A = 0, B = 0) &= \frac{1}{3} \\ \mathbb{P}(A = 1, B = -1) &= \frac{1}{6} \\ \mathbb{P}(A = 1, B = 1) &= \frac{1}{6} \end{align*} Each of \(A\) and \(B\) takes the values \(-1\), \(0\), \(1\) with probability \(\frac13\), and \(\E(AB) = -\frac16+\frac16-\frac16+\frac16 = 0 = \E(A)\E(B)\), but \(\mathbb{P}(A = 0, B = 0) = \frac13 \neq \frac19 = \mathbb{P}(A = 0)\mathbb{P}(B = 0)\), so \(A\) and \(B\) are not independent.
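The counterexample can be verified exactly; here the joint distribution is taken with \(\P(A=1,B=1)=\frac16\) as the fifth entry (the remaining mass), and the check confirms uniform marginals, \(\E(AB)=\E(A)\E(B)\), and failure of independence at the cell \((0,0)\):

```python
from fractions import Fraction

sixth, third = Fraction(1, 6), Fraction(1, 3)
joint = {(-1, 1): sixth, (-1, -1): sixth, (0, 0): third,
         (1, -1): sixth, (1, 1): sixth}

# Marginal distributions of A and B.
pA = {a: sum(pr for (x, _), pr in joint.items() if x == a) for a in (-1, 0, 1)}
pB = {b: sum(pr for (_, y), pr in joint.items() if y == b) for b in (-1, 0, 1)}

EA = sum(a * pr for a, pr in pA.items())
EB = sum(b * pr for b, pr in pB.items())
EAB = sum(a * b * pr for (a, b), pr in joint.items())

uncorrelated = EAB == EA * EB          # both sides are 0
indep_at_00 = joint[(0, 0)] == pA[0] * pB[0]   # 1/3 vs 1/9: fails
```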