Problems


6 problems found

2020 Paper 3 Q12
D: 1500.0 B: 1500.0

\(A\) and \(B\) both toss the same biased coin. The probability that the coin shows heads is \(p\), where \(0 < p < 1\), and the probability that it shows tails is \(q = 1 - p\). Let \(X\) be the number of times \(A\) tosses the coin until it shows heads. Let \(Y\) be the number of times \(B\) tosses the coin until it shows heads.

  1. The random variable \(S\) is defined by \(S = X + Y\) and the random variable \(T\) is the maximum of \(X\) and \(Y\). Find an expression for \(\mathrm{P}(S = s)\) and show that \[ \mathrm{P}(T = t) = pq^{t-1}(2 - q^{t-1} - q^t). \]
  2. The random variable \(U\) is defined by \(U = |X - Y|\), and the random variable \(W\) is the minimum of \(X\) and \(Y\). Find expressions for \(\mathrm{P}(U = u)\) and \(\mathrm{P}(W = w)\).
  3. Show that \(\mathrm{P}(S = 2 \text{ and } T = 3) \neq \mathrm{P}(S = 2) \times \mathrm{P}(T = 3)\).
  4. Show that \(U\) and \(W\) are independent, and show that no other pair of the four variables \(S\), \(T\), \(U\) and \(W\) are independent.
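No worked solution is given for this question, but the claimed formula for \(\mathrm{P}(T = t)\) in part 1 can be sanity-checked against the CDF of the maximum of two independent geometric variables. The sketch below is illustrative only; the bias value \(p = 0.3\) is a hypothetical choice, not part of the question.

```python
# Check P(T = t) = p q^(t-1) (2 - q^(t-1) - q^t) against the CDF identity
# P(T = t) = P(X <= t) P(Y <= t) - P(X <= t-1) P(Y <= t-1),
# valid since T = max(X, Y) for independent X, Y.
import math

p = 0.3          # hypothetical bias; any 0 < p < 1 works
q = 1 - p

def cdf(t):
    """P(X <= t) for a geometric variable counting tosses up to the first head."""
    return 1 - q**t

for t in range(1, 21):
    claimed = p * q**(t - 1) * (2 - q**(t - 1) - q**t)
    exact = cdf(t)**2 - cdf(t - 1)**2   # distribution of the maximum
    assert math.isclose(claimed, exact)
```

The same CDF trick (with the minimum in place of the maximum) gives an independent check on \(\mathrm{P}(W = w)\) in part 2.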

2017 Paper 3 Q12
D: 1700.0 B: 1500.2

The discrete random variables \(X\) and \(Y\) can each take the values \(1\), \(\ldots\,\), \(n\) (where \(n\ge2\)). Their joint probability distribution is given by \[ \P(X=x, \ Y=y) = k(x+y) \,, \] where \(k\) is a constant.

  1. Show that \[ \P(X=x) = \dfrac{n+1+2x}{2n(n+1)}\,. \] Hence determine whether \(X\) and \(Y\) are independent.
  2. Show that the covariance of \(X\) and \(Y\) is negative.


Solution:

  1. \(\,\) \begin{align*} && \mathbb{P}(X = x) &= \sum_{y=1}^n \mathbb{P}(X=x,Y=y) \\ &&&= \sum_{y=1}^n k(x+y) \\ &&&= nkx + k\frac{n(n+1)}2 \\ \\ && 1 &= \sum_{x=1}^n \mathbb{P}(X=x) \\ &&&= nk\frac{n(n+1)}{2} + kn\frac{n(n+1)}2 \\ &&&= kn^2(n+1) \\ \Rightarrow && k &= \frac{1}{n^2(n+1)} \\ \Rightarrow && \mathbb{P}(X = x) &= \frac{nx}{n^2(n+1)} + \frac{n(n+1)}{2n^2(n+1)} \\ &&&= \frac{n+1+2x}{2n(n+1)} \\ \\ && \mathbb{P}(X=x)\mathbb{P}(Y=y) &= \frac{(n+1)^2+2(n+1)(x+y)+4xy}{4n^2(n+1)^2} \\ &&&\neq \frac{x+y}{n^2(n+1)} \end{align*} (For example, at \(x = y = 1\) the two sides are \(\frac{(n+3)^2}{4n^2(n+1)^2}\) and \(\frac{2}{n^2(n+1)}\), which are equal only if \((n+3)^2 = 8(n+1)\), i.e. \(n = 1\), contradicting \(n \ge 2\).) Therefore \(X\) and \(Y\) are not independent.
  2. \(\,\) \begin{align*} && \mathbb{E}[X] &= \sum_{x=1}^n x \mathbb{P}(X=x) \\ &&&= \sum_{x=1}^n x \frac{n+1+2x}{2n(n+1)} \\ &&&= \frac{1}{2n(n+1)} \left ( (n+1) \sum x + 2\sum x^2\right)\\ &&&= \frac{1}{2n(n+1)} \left ( \frac{n(n+1)^2}{2} + \frac{n(n+1)(2n+1)}{3} \right) \\ &&&= \frac{1}{2} \left ( \frac{n+1}{2} + \frac{2n+1}{3} \right)\\ &&&= \frac{7n+5}{12} \\ \\ && \textrm{Cov}(X,Y) &= \mathbb{E}\left[XY\right] - \mathbb{E}[X] \mathbb{E}[Y] \\ &&&= \sum_{x=1}^n \sum_{y=1}^n xy \frac{x+y}{n^2(n+1)} - \mathbb{E}[X]^2 \\ &&&= \frac{1}{n^2(n+1)} \sum \sum (x^2 y+xy^2) - \mathbb{E}[X]^2 \\ &&&= \frac{2}{n^2(n+1)} \left (\sum x \right )\left (\sum x^2\right ) - \mathbb{E}[X]^2 \\ &&&= \frac{2}{n^2(n+1)} \cdot \frac{n(n+1)}{2} \cdot \frac{n(n+1)(2n+1)}{6} - \left ( \frac{7n+5}{12}\right)^2 \\ &&&=\frac{(n+1)(2n+1)}{6} - \left ( \frac{7n+5}{12}\right)^2 \\ &&&= \frac1{144} \left (24(2n^2+3n+1) - (49n^2+70n+25) \right)\\ &&&= -\frac{(n-1)^2}{144} \\ &&& < 0 \end{align*} since \(n \ge 2\). (Here \(\mathbb{E}[Y] = \mathbb{E}[X]\) by the symmetry of the joint distribution, and \(\sum\sum(x^2y + xy^2) = (\sum x^2)(\sum y) + (\sum x)(\sum y^2) = 2(\sum x)(\sum x^2)\).)
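The closed forms can be checked by brute-force enumeration for small \(n\), using exact rational arithmetic. This is an illustrative sketch, not part of the required argument; it confirms \(\mathbb{E}[X] = \frac{7n+5}{12}\) and that the covariance is exactly \(-\frac{(n-1)^2}{144}\), hence negative for \(n \ge 2\).

```python
# Enumerate the joint distribution P(X=x, Y=y) = k(x+y) with
# k = 1/(n^2 (n+1)) and compute E[X] and Cov(X, Y) exactly.
from fractions import Fraction

def moments(n):
    k = Fraction(1, n**2 * (n + 1))
    pairs = [(x, y) for x in range(1, n + 1) for y in range(1, n + 1)]
    ex = sum(x * k * (x + y) for x, y in pairs)        # E[X] (= E[Y] by symmetry)
    exy = sum(x * y * k * (x + y) for x, y in pairs)   # E[XY]
    return ex, exy - ex * ex                           # (E[X], Cov(X, Y))

for n in range(2, 10):
    ex, cov = moments(n)
    assert ex == Fraction(7 * n + 5, 12)       # matches the derived mean
    assert cov == Fraction(-(n - 1)**2, 144)   # exactly -(n-1)^2 / 144
    assert cov < 0                             # covariance is indeed negative
```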

2000 Paper 3 Q14
D: 1700.0 B: 1500.0

The random variable \(X\) takes only the values \(x_1\) and \(x_2\) (where \( x_1 \not= x_2 \)), and the random variable \(Y\) takes only the values \(y_1\) and \(y_2\) (where \(y_1 \not= y_2\)). Their joint distribution is given by $$ \P ( X = x_1 , Y = y_1 ) = a \ ; \ \ \P ( X = x_1 , Y = y_2 ) = q - a \ ; \ \ \P ( X = x_2 , Y = y_1 ) = p - a \ . $$ Show that if \(\E(X Y) = \E(X)\E(Y)\) then $$ (a - p q ) ( x_1 - x_2 ) ( y_1 - y_2 ) = 0 . $$ Hence show that two random variables each taking only two distinct values are independent if \(\E(X Y) = \E(X) \E(Y)\). Give a joint distribution for two random variables \(A\) and \(B\), each taking the three values \(- 1\), \(0\) and \(1\) with probability \({1 \over 3}\), which have \(\E(A B) = \E( A)\E (B)\), but which are not independent.


Solution: \begin{align*} \mathbb{P}(X = x_1) &= a + q - a = q \\ \mathbb{P}(X = x_2) &= 1 - q \\ \mathbb{P}(Y = y_1) & = a + p - a = p \\ \mathbb{P}(Y = y_2) & = 1 - p \end{align*} Note also that \(\mathbb{P}(X = x_2, Y = y_2) = 1 - a - (q - a) - (p - a) = 1 + a - p - q\). Then \begin{align*} \mathbb{E}(X)\mathbb{E}(Y) &= \left( qx_1 + (1-q)x_2 \right) \left( p y_1 + (1-p)y_2\right) \\ &= qpx_1y_1 + q(1-p)x_1y_2 + (1-q)px_2y_1 + (1-q)(1-p)x_2y_2 \\ \mathbb{E}(XY) &= ax_1y_1 + (q-a)x_1y_2 + (p-a)x_2y_1 + (1 + a - p - q)x_2y_2 \end{align*} Subtracting term by term, the coefficient of each of \(x_1y_1\), \(x_1y_2\), \(x_2y_1\) and \(x_2y_2\) is \(\pm(a - pq)\): \begin{align*} \mathbb{E}(XY) - \mathbb{E}(X)\mathbb{E}(Y) &= (a - pq)(x_1y_1 - x_1y_2 - x_2y_1 + x_2y_2) \\ &= (a - pq)(x_1 - x_2)(y_1 - y_2) \end{align*} so \(\mathbb{E}(XY) = \mathbb{E}(X)\mathbb{E}(Y)\) gives \((a - pq)(x_1 - x_2)(y_1 - y_2) = 0\), as required. For any two random variables each taking only two distinct values, we can define \(a\), \(p\) and \(q\) as above. Since \(x_1 \neq x_2\) and \(y_1 \neq y_2\), the condition \(\mathbb{E}(XY) = \mathbb{E}(X)\mathbb{E}(Y)\) forces \(a = pq\). But if \(a = pq\), then \(\mathbb{P}(X = x_1, Y = y_1) = \mathbb{P}(X = x_1)\mathbb{P}(Y = y_1)\), and the other three relations follow similarly (e.g. \(\mathbb{P}(X = x_1, Y = y_2) = q - pq = q(1-p)\)); hence \(X\) and \(Y\) are independent. For the final part, consider \begin{align*} \mathbb{P}(A = -1, B = 1) &= \frac{1}{6} \\ \mathbb{P}(A = -1, B = -1) &= \frac{1}{6} \\ \mathbb{P}(A = 0, B = 0) &= \frac{1}{3} \\ \mathbb{P}(A = 1, B = -1) &= \frac{1}{6} \\ \mathbb{P}(A = 1, B = 1) &= \frac{1}{6} \end{align*} Each marginal is uniform on \(\{-1, 0, 1\}\) and \(\mathbb{E}(AB) = \frac16(-1 + 1 - 1 + 1) = 0 = \mathbb{E}(A)\mathbb{E}(B)\), but \(\mathbb{P}(A = 0, B = 0) = \frac13 \neq \frac19 = \mathbb{P}(A = 0)\mathbb{P}(B = 0)\), so \(A\) and \(B\) are not independent.
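A counterexample of this type, with the four corner cells \((\pm1, \pm1)\) each carrying probability \(\frac16\) and all mass at \(B = 0\) concentrated on \((0,0)\), can be verified mechanically. The sketch below uses exact fractions and checks the three required properties: uniform marginals, \(\mathbb{E}(AB) = \mathbb{E}(A)\mathbb{E}(B)\), and failure of independence.

```python
# Verify the counterexample: uniform marginals on {-1, 0, 1},
# E(AB) = E(A) E(B) = 0, yet A and B are not independent.
from fractions import Fraction

third, sixth = Fraction(1, 3), Fraction(1, 6)
joint = {(-1, 1): sixth, (-1, -1): sixth, (0, 0): third,
         (1, -1): sixth, (1, 1): sixth}

assert sum(joint.values()) == 1                       # a valid distribution

pA = {a: sum(p for (x, y), p in joint.items() if x == a) for a in (-1, 0, 1)}
pB = {b: sum(p for (x, y), p in joint.items() if y == b) for b in (-1, 0, 1)}
e_ab = sum(a * b * p for (a, b), p in joint.items())
e_a = sum(a * p for a, p in pA.items())
e_b = sum(b * p for b, p in pB.items())

assert all(p == third for p in pA.values())           # A uniform
assert all(p == third for p in pB.values())           # B uniform
assert e_ab == e_a * e_b == 0                         # E(AB) = E(A) E(B)
assert joint[(0, 0)] != pA[0] * pB[0]                 # ... but not independent
```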

1999 Paper 3 Q12
D: 1700.0 B: 1500.0

In the game of endless cricket the scores \(X\) and \(Y\) of the two sides are such that \[ \P (X=j,\ Y=k)=\e^{-1}\frac{(j+k)\lambda^{j+k}}{j!k!},\] for some positive constant \(\lambda\), where \(j,k = 0\), \(1\), \(2\), \(\ldots\).

  1. Find \(\P(X+Y=n)\) for each \(n>0\).
  2. Show that \(2\lambda \e^{2\lambda-1}=1\).
  3. Show that \(2x \e^{2x-1}\) is an increasing function of \(x\) for \(x>0\) and deduce that the equation in part 2 has at most one solution and hence determine \(\lambda\).
  4. Calculate the expectation \(\E(2^{X+Y})\).


Solution:

  1. \begin{align*} && \mathbb{P}(X+Y = n) &= \sum_{i = 0}^n \mathbb{P}(X = i, Y = n-i) \\ &&&= \sum_{i = 0}^n e^{-1} \frac{n \lambda^n}{i! (n-i)!} \\ &&&=e^{-1} n \lambda^n \sum_{i = 0}^n\frac{1}{i! (n-i)!} \\ &&&=\frac{e^{-1} n}{n!} \lambda^n \sum_{i = 0}^n\frac{n!}{i! (n-i)!} \\ &&&= \frac{n\lambda^n}{e n!} 2^n \\ &&&= \frac{n (2 \lambda)^n}{e \cdot n!} \end{align*}
  2. \begin{align*} && 1 &= \sum_{n = 0}^{\infty} \mathbb{P}(X+Y = n) \\ &&&= \sum_{n = 0}^{\infty}\frac{n (2 \lambda)^n}{e \cdot n!} \\ &&&= \sum_{n = 1}^\infty \frac{ (2 \lambda)^n}{e \cdot (n-1)!} \\ &&&= \frac{2 \lambda}{e}\sum_{n = 0}^\infty \frac{ (2 \lambda)^n}{n!} \\ &&&= \frac{2 \lambda}{e} e^{2\lambda} \\ &&&= 2 \lambda e^{2\lambda - 1} \end{align*}
  3. Consider \(f(x) = 2xe^{2x-1}\), then \begin{align*} && f'(x) &= 2e^{2x-1} + 2xe^{2x-1} \cdot 2 \\ &&&= e^{2x-1} (2 + 4x) > 0 \end{align*} Therefore \(f(x)\) is an increasing function of \(x\) for \(x > 0\), so the equation \(f(x) = 1\) has at most one solution. Since \(f(\frac12) = 2 \cdot \frac12 \cdot e^{0} = 1\), the unique solution is \(\lambda = \frac12\).
  4. \begin{align*} \mathbb{E}(2^{X+Y}) &= \sum_{n = 0}^\infty \mathbb{P}(X+Y = n) 2^n \\ &= \sum_{n = 1}^\infty \frac{n (2\lambda)^n}{e \cdot n!} 2^{n} \\ &= \sum_{n = 1}^\infty \frac{2^n}{e(n-1)!} \\ &= \frac{2}{e} \sum_{n=0}^\infty \frac{2^n}{n!} \\ &= \frac{2}{e} e^2 \\ &= 2e \end{align*} using \(\lambda = \frac12\), so that \((2\lambda)^n = 1\).
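All four parts can be sanity-checked numerically. The sketch below is illustrative only: with \(\lambda = \frac12\) the probabilities \(\mathrm{P}(X+Y=n) = \frac{n(2\lambda)^n}{e \cdot n!}\) should sum to \(1\), and the weighted sum \(\sum 2^n \,\mathrm{P}(X+Y=n)\) should be \(2e\). The series are truncated at 60 terms, well past the point where they are negligible.

```python
# Numerical check: the distribution of X + Y sums to 1 when lambda = 1/2,
# and E(2^(X+Y)) = 2e.
import math

lam = 0.5

def p_sum(n):
    """P(X + Y = n) = n (2*lam)^n / (e * n!)."""
    return n * (2 * lam)**n / (math.e * math.factorial(n))

total = sum(p_sum(n) for n in range(60))
expectation = sum(2**n * p_sum(n) for n in range(60))

assert math.isclose(total, 1.0)
assert math.isclose(expectation, 2 * math.e)
```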

1991 Paper 3 Q16
D: 1700.0 B: 1504.3

The random variables \(X\) and \(Y\) take integer values \(x\) and \(y\) respectively which are restricted by \(x\geqslant1,\) \(y\geqslant1\) and \(2x+y\leqslant2a\) where \(a\) is an integer greater than 1. The joint probability is given by \[ \mathrm{P}(X=x,Y=y)=c(2x+y), \] where \(c\) is a positive constant, within this region and zero elsewhere. Obtain, in terms of \(x,c\) and \(a,\) the marginal probability \(\mathrm{P}(X=x)\) and show that \[ c=\frac{6}{a(a-1)(8a+5)}. \] Show that when \(y\) is an even number the marginal probability \(\mathrm{P}(Y=y)\) is \[ \frac{3(2a-y)(2a+2+y)}{2a(a-1)(8a+5)} \] and find the corresponding expression when \(y\) is odd. Evaluate \(\mathrm{E}(Y)\) in terms of \(a\).
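No worked solution is given for this question, but the stated value of \(c\) and the even-\(y\) marginal can be confirmed by enumerating the finite region for small \(a\). The sketch below is an illustrative consistency check with exact fractions, not a proof.

```python
# Enumerate the region 1 <= x, 1 <= y, 2x + y <= 2a and confirm that the
# stated c normalises the distribution and that the even-y marginal matches.
from fractions import Fraction

def c_value(a):
    return Fraction(6, a * (a - 1) * (8 * a + 5))

def region(a):
    # x ranges over 1..a-1 (since 2x <= 2a - 1); y over 1..2a - 2x.
    return [(x, y) for x in range(1, a) for y in range(1, 2 * a - 2 * x + 1)]

for a in range(2, 8):
    c, cells = c_value(a), region(a)
    assert sum(c * (2 * x + y) for x, y in cells) == 1   # probabilities sum to 1
    for y0 in range(2, 2 * a - 1, 2):                    # even values of y
        marg = sum(c * (2 * x + y) for x, y in cells if y == y0)
        assert marg == Fraction(3 * (2 * a - y0) * (2 * a + 2 + y0),
                                2 * a * (a - 1) * (8 * a + 5))
```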

1990 Paper 3 Q15
D: 1700.0 B: 1482.6

An unbiased twelve-sided die has its faces marked \(A,A,A,B,B,B,B,B,B,B,B,B.\) In a series of throws of the die the first \(M\) throws show \(A,\) the next \(N\) throws show \(B\) and the \((M+N+1)\)th throw shows \(A\). Write down the probability that \(M=m\) and \(N=n\), where \(m\geqslant0\) and \(n\geqslant1.\) Find

  1. the marginal distributions of \(M\) and \(N\),
  2. the mean values of \(M\) and \(N\).
Investigate whether \(M\) and \(N\) are independent. Find the probability that \(N\) is greater than a given integer \(k\), where \(k\geqslant1,\) and find \(\mathrm{P}(N > M).\) Find also \(\mathrm{P}(N=M)\) and show that \(\mathrm{P}(N < M)=\frac{1}{52}.\)


Solution: \begin{align*} \mathbb{P}(M = m, N = n) &= \left ( \frac{3}{12} \right)^m \left ( \frac{9}{12} \right)^n \frac{3}{12} \\ &= \frac{3^n}{4^{m+n+1}} \end{align*}

  1. \begin{align*} \mathbb{P}(M = m) &= \sum_{n = 1}^{\infty} \mathbb{P}(M=m,N=n) \\ &= \sum_{n = 1}^{\infty} \frac{3^n}{4^{m+n+1}} \\ &= \frac{1}{4^{m+1}} \sum_{n = 1}^{\infty} \left ( \frac34\right)^n \\ &= \frac{1}{4^{m+1}} \frac{3/4}{1/4} \\ &= \frac{3}{4^{m+1}} \\ \\ \mathbb{P}(N = n) &= \sum_{m = 0}^{\infty} \mathbb{P}(M=m,N=n) \\ &= \sum_{m = 0}^{\infty} \frac{3^n}{4^{m+n+1}} \\ &= \frac{3^n}{4^{n+1}} \sum_{m = 0}^{\infty} \left ( \frac14\right)^m \\ &= \frac{3^n}{4^{n+1}} \frac{1}{3/4} \\ &= \frac{3^{n-1}}{4^{n}} \end{align*}
  2. \(M+1 \sim Geo(\frac34) \Rightarrow \mathbb{E}(M) = \frac43 -1 = \frac13\) and \(N \sim Geo(\frac14) \Rightarrow \mathbb{E}(N) = 4\).
\(M,N\) are independent since \(\mathbb{P}(M = m, N =n ) = \mathbb{P}(M=m)\mathbb{P}(N=n)\) \begin{align*} \mathbb{P}(N > k) &= \sum_{n=k+1}^{\infty} \mathbb{P}(N = n) \\ &= \sum_{n=k+1}^{\infty} \frac{3^{n-1}}{4^{n}} \\ &= \frac{3^k}{4^{k+1}} \sum_{n = 0}^{\infty} \left ( \frac34\right)^n \\ &= \frac{3^k}{4^{k+1}} \frac{1}{1/4} \\ &= \frac{3^k}{4^k} \end{align*} Then, using independence, \begin{align*} \mathbb{P}(N > M) &= \sum_{m=0}^{\infty} \mathbb{P}(N > m) \mathbb{P}(M = m) \\ &= \sum_{m=0}^{\infty} \left (\frac34 \right)^m \frac{3}{4^{m+1}}\\ &=\sum_{m=0}^{\infty} \frac{3^{m+1}}{4^{2m+1}}\\ &= \frac{3}{4} \frac{1}{13/16} \\ &= \frac{12}{13} \\ \\ \mathbb{P}(N=M) &= \sum_{m=1}^{\infty} \mathbb{P}(N=m, M=m) \\ &= \sum_{m=1}^{\infty} \frac{3^m}{4^{2m+1}} \\ &= \frac{3}{64} \sum_{m=0}^{\infty} \left ( \frac{3}{16} \right)^m \\ &= \frac{3}{64} \frac{1}{13/16} \\ &= \frac{3}{52}\\ \\ \mathbb{P}(N < M) &= 1 - \frac{12}{13} - \frac3{52} \\ &= 1 - \frac{48}{52} - \frac{3}{52} \\ &= 1 - \frac{51}{52} \\ &= \frac{1}{52} \end{align*}
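The three tail probabilities \(\mathrm{P}(N > M) = \frac{12}{13}\), \(\mathrm{P}(N = M) = \frac{3}{52}\) and \(\mathrm{P}(N < M) = \frac{1}{52}\) can be double-checked by summing the joint distribution over a large finite grid. An illustrative sketch; the truncated geometric tails are far below the comparison tolerance.

```python
# Sum P(M = m, N = n) = 3^n / 4^(m+n+1) over n > m, n = m and n < m.
import math

def joint(m, n):
    return 3**n / 4**(m + n + 1)

LIM = 100   # truncation point; the geometric tails beyond this are negligible
p_gt = sum(joint(m, n) for m in range(LIM) for n in range(1, LIM) if n > m)
p_eq = sum(joint(m, m) for m in range(1, LIM))
p_lt = sum(joint(m, n) for m in range(LIM) for n in range(1, LIM) if n < m)

assert math.isclose(p_gt, 12 / 13)
assert math.isclose(p_eq, 3 / 52)
assert math.isclose(p_lt, 1 / 52)
```

Note the three values sum to \(1\), as they must, since exactly one of \(N > M\), \(N = M\), \(N < M\) occurs.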