2018 Paper 3 Q11

A particle is attached to one end of a light inextensible string of length \(b\). The other end of the string is attached to a fixed point \(O\). Initially the particle hangs vertically below \(O\). The particle then receives a horizontal impulse. The particle moves in a circular arc with the string taut until the acute angle between the string and the upward vertical is \(\alpha\), at which time it becomes slack. Express \(V\), the speed of the particle when the string becomes slack, in terms of \( b\), \(g\) and \(\alpha\). Show that the string becomes taut again a time \(T\) later, where \[ gT = 4V \sin\alpha \,,\] and that just before this time the trajectory of the particle makes an angle \(\beta \) with the horizontal where \(\tan\beta = 3\tan \alpha \,\). When the string becomes taut, the momentum of the particle in the direction of the string is destroyed. Show that the particle comes instantaneously to rest at this time if and only if \[ \sin^2\alpha = \dfrac {1+\sqrt3}4 \,. \]


Solution:

[Diagram: the particle on the taut string, the string making angle \(\alpha\) with the upward vertical; tension \(T\) acts along the string towards \(O\) and the weight \(mg\) acts downwards.]
\begin{align*} \text{N2}(\swarrow): &&T +mg \cos \alpha &= m \frac{V^2}{b} \\ \end{align*} The string becomes slack when the tension \(T\) vanishes, i.e. when \(bg\cos \alpha = V^2 \Rightarrow V = \sqrt{bg \cos \alpha}\). Once the string goes slack, the particle moves as a projectile. Its initial velocity is \(V\binom{-\cos \alpha}{\sin \alpha}\) and its initial position is \(\binom{b\sin \alpha}{b\cos \alpha}\): \begin{align*} && \mathbf{s} &= \binom{b\sin \alpha}{b\cos \alpha}+Vt \binom{-\cos \alpha}{\sin \alpha} + \frac12 gt^2 \binom{0}{-1} \\ &&&= \binom{b\sin \alpha - Vt \cos \alpha}{b\cos \alpha + Vt \sin \alpha - \frac12 gt^2} \\ |\mathbf{s}|^2 = b^2 \Rightarrow && b^2 &= \left ( \binom{b\sin \alpha}{b\cos \alpha}+Vt \binom{-\cos \alpha}{\sin \alpha} + \frac12 gt^2 \binom{0}{-1} \right)^2 \\ &&&= b^2 + V^2t^2+\frac14 g^2 t^4 -gb\cos \alpha t^2-V\sin \alpha gt^3 \\ \Rightarrow && 0 &= V^2t^2 + \frac14 g^2 t^4 - V^2 t^2- V \sin \alpha g t^3 \\ &&&= \frac14 g^2 t^4 - V \sin \alpha gt^3 \\ \Rightarrow && gT &= 4V \sin \alpha \end{align*} (taking the non-zero root \(t = T\)). Just before this time the particle has velocity \(\displaystyle \binom{-V \cos \alpha}{V \sin \alpha - gT} = \binom{-V \cos \alpha}{-3V \sin \alpha}\), so the angle \(\beta\) satisfies \(\tan \beta = \frac{3V \sin \alpha}{V \cos \alpha} = 3 \tan \alpha\). The particle comes instantaneously to rest if all its momentum is destroyed, i.e. if it is travelling parallel to the string at the moment the string becomes taut.
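As a numerical sanity check of the two results above (a sketch only, not part of the required solution; \(b\), \(g\) and \(\alpha\) are arbitrary test values):

```python
import math

# Numerical sanity check (a sketch; b, g, alpha are arbitrary test values).
b, g = 2.0, 9.8
alpha = 0.6  # acute angle with the upward vertical, in radians

V = math.sqrt(b * g * math.cos(alpha))   # speed when the string goes slack
T = 4 * V * math.sin(alpha) / g          # claimed time until it is taut again

# Projectile position relative to O at time T (x horizontal, y vertical).
x = b * math.sin(alpha) - V * T * math.cos(alpha)
y = b * math.cos(alpha) + V * T * math.sin(alpha) - 0.5 * g * T**2
assert math.isclose(math.hypot(x, y), b, rel_tol=1e-9)  # distance is b again

# Velocity just before the string is taut: gradient should be 3 tan(alpha).
vx, vy = -V * math.cos(alpha), V * math.sin(alpha) - g * T
assert math.isclose(abs(vy / vx), 3 * math.tan(alpha), rel_tol=1e-9)
```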
The velocity just before the string becomes taut has gradient \(3\tan\alpha\), so it is parallel to the string exactly when this equals the gradient of the position vector: \begin{align*} && 3 \tan \alpha &= \frac{b\cos \alpha + Vt \sin \alpha - \frac12 gt^2}{b\sin \alpha - Vt \cos \alpha} \\ &&&= \frac{\frac{V^2}{g}+\frac{4V^2\sin^2\alpha}{g} - \frac{8V^2\sin^2 \alpha}{g}}{\frac{V^2\sin \alpha}{g \cos \alpha} - \frac{4V^2 \sin \alpha \cos \alpha}{g}} \\ &&&= \frac{1 -4\sin^2 \alpha}{\tan \alpha(1 - 4\cos^2 \alpha)} \\ \Leftrightarrow&& 3 \frac{\sin^2 \alpha}{1-\sin^2 \alpha} &= \frac{1- 4 \sin^2 \alpha}{-3+4\sin^2 \alpha} \\ \Leftrightarrow && -9 \sin^2 \alpha + 12 \sin^4 \alpha &= 1 - 5 \sin^2 \alpha + 4 \sin^4 \alpha \\ \Leftrightarrow && 0 &= 1+4 \sin^2 \alpha - 8\sin^4 \alpha \\ \Leftrightarrow && \sin^2 \alpha &= \frac{1 + \sqrt{3}}4 \end{align*} (taking the positive root: the quadratic \(8s^2 - 4s - 1 = 0\) in \(s = \sin^2\alpha\) has roots \(\frac{1 \pm \sqrt 3}{4}\), and \(\frac{1-\sqrt3}{4} < 0\)).
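A numerical sketch of the "if" direction (taking \(b = g = 1\) as an arbitrary normalisation): at \(\sin^2\alpha = \frac{1+\sqrt3}{4}\) the velocity just before the jerk should indeed be parallel to the string.

```python
import math

# Sketch: at sin^2(alpha) = (1 + sqrt(3))/4 the velocity just before the
# jerk should be parallel to the string, so destroying the radial momentum
# stops the particle. Taking b = g = 1 is an arbitrary normalisation.
b = g = 1.0
alpha = math.asin(math.sqrt((1 + math.sqrt(3)) / 4))

V = math.sqrt(b * g * math.cos(alpha))
T = 4 * V * math.sin(alpha) / g
x = b * math.sin(alpha) - V * T * math.cos(alpha)
y = b * math.cos(alpha) + V * T * math.sin(alpha) - 0.5 * g * T**2
vx, vy = -V * math.cos(alpha), V * math.sin(alpha) - g * T

# Position and velocity are parallel iff their cross product vanishes.
assert abs(x * vy - y * vx) < 1e-9
```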

2018 Paper 3 Q12

A random process generates, independently, \(n\) numbers each of which is drawn from a uniform (rectangular) distribution on the interval 0 to 1. The random variable \(Y_k\) is defined to be the \(k\)th smallest number (so there are \(k-1\) smaller numbers).

  1. Show that, for \(0\le y\le1\,\), \[ {\rm P}\big(Y_k\le y) =\sum^{n}_{m=k}\binom{n}{m}y^{m}\left(1-y\right)^{n-m} . \tag{\(*\)} \]
  2. Show that \[ m\binom n m = n \binom {n-1}{m-1} \] and obtain a similar expression for \(\displaystyle (n-m) \, \binom n m\,\). Starting from \((*)\), show that the probability density function of \(Y_k\) is \[ n\binom{ n-1}{k-1} y^{k-1}\left(1-y\right)^{ n-k} \,.\] Deduce an expression for \(\displaystyle \int_0^1 y^{k-1}(1-y)^{n-k} \, \d y \,\).
  3. Find \(\E(Y_k) \) in terms of \(n\) and \(k\).


Solution:

  1. Each of the \(n\) numbers is, independently, less than \(y\) with probability \(y\), and \(Y_k \leq y\) exactly when at least \(k\) of them are less than \(y\), so \begin{align*} && \mathbb{P}(Y_k \leq y) &= \sum_{m=k}^n\mathbb{P}(\text{exactly }m \text{ values less than }y) \\ &&&= \sum_{m=k}^n \binom{n}{m} y^m(1-y)^{n-m} \end{align*}
  2. Consider the number of ways to choose, from \(n\) people, a committee of \(m\) people, one of whom is to chair it. This can be counted in two ways. First: choose the committee in \(\binom{n}{m}\) ways and then choose the chair in \(m\) ways, giving \(m \binom{n}{m}\). Alternatively: choose the chair in \(n\) ways and then choose the remaining \(m-1\) committee members in \(\binom{n-1}{m-1}\) ways. Therefore \(m \binom{n}{m} = n \binom{n-1}{m-1}\). Similarly, \begin{align*} (n-m) \binom{n}{m} &= (n-m) \binom{n}{n-m} \\ &= n \binom{n-1}{n-m-1} \\ &= n \binom{n-1}{m} \end{align*} \begin{align*} f_{Y_k}(y) &= \frac{\d }{\d y} \l \sum^{n}_{m=k}\binom{n}{m}y^{m}\left(1-y\right)^{n-m} \r \\ &= \sum^{n}_{m=k} \l \binom{n}{m}my^{m-1}\left(1-y\right)^{n-m} -\binom{n}{m}(n-m)y^{m}\left(1-y\right)^{n-m-1} \r \\ &= \sum^{n}_{m=k} \l n \binom{n-1}{m-1}y^{m-1}\left(1-y\right)^{n-m} -n \binom{n-1}{m} y^{m}\left(1-y\right)^{n-m-1} \r \\ &= n\sum^{n}_{m=k} \binom{n-1}{m-1}y^{m-1}\left(1-y\right)^{n-m} -n\sum^{n+1}_{m=k+1} \binom{n-1}{m-1} y^{m-1}\left(1-y\right)^{n-m} \\ &= n \binom{n-1}{k-1} y^{k-1}(1-y)^{n-k} \end{align*} (the two sums cancel term by term, leaving only the \(m=k\) term of the first; the \(m=n+1\) term of the second vanishes since \(\binom{n-1}{n} = 0\)). \begin{align*} &&1 &= \int_0^1 f_{Y_k}(y) \d y \\ &&&= \int_0^1 n \binom{n-1}{k-1} y^{k-1}(1-y)^{n-k} \d y \\ &&&= n \binom{n-1}{k-1} \int_0^1 y^{k-1}(1-y)^{n-k} \d y \\ \Rightarrow && \frac{1}{n \binom{n-1}{k-1}} &= \int_0^1 y^{k-1}(1-y)^{n-k} \d y \\ \end{align*}
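The deduced integral can be spot-checked exactly (a sketch, not a proof) against the standard Beta-function value \(\int_0^1 y^a(1-y)^b \, \d y = \frac{a! \, b!}{(a+b+1)!}\):

```python
from fractions import Fraction
from math import comb, factorial

# Exact spot-check (not a proof) of the deduced integral
#     int_0^1 y^(k-1) (1-y)^(n-k) dy = 1 / (n * C(n-1, k-1)),
# using the Beta-function value int_0^1 y^a (1-y)^b dy = a! b! / (a+b+1)!.
for n in range(1, 12):
    for k in range(1, n + 1):
        integral = Fraction(factorial(k - 1) * factorial(n - k), factorial(n))
        assert integral == Fraction(1, n * comb(n - 1, k - 1))
```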
  3. \begin{align*} && \mathbb{E}(Y_k) &= \int_0^1 y f_{Y_k}(y) \d y \\ &&&= \int_0^1 n \binom{n-1}{k-1} y^{k}(1-y)^{n-k} \d y \\ &&&= n \binom{n-1}{k-1}\int_0^1 y^{k+1-1}(1-y)^{n+1-(k+1)} \d y \\ &&&= n \binom{n-1}{k-1} \frac{1}{(n+1) \binom{n}{k}}\\ &&&= \frac{n}{n+1} \cdot \frac{k}{n} \\ &&&= \frac{k}{n+1} \end{align*}
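A Monte Carlo sketch of this answer (the values of \(n\), \(k\), the seed and the sample size are arbitrary choices for illustration):

```python
import random
import statistics

# Monte Carlo sketch of E(Y_k) = k/(n+1): draw n uniforms, keep the k-th
# smallest, and average. n, k, the seed and sample size are arbitrary choices.
random.seed(0)
n, k = 5, 2
samples = [sorted(random.random() for _ in range(n))[k - 1] for _ in range(200_000)]
est = statistics.fmean(samples)
assert abs(est - k / (n + 1)) < 0.005   # k/(n+1) = 1/3 here
```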

2018 Paper 3 Q13

The random variable \(X\) takes only non-negative integer values and has probability generating function \(\G(t)\). Show that \[ \P(X = 0 \text{ or } 2 \text{ or } 4 \text { or } 6 \ \ldots ) = \frac{1}{2}\big(\G\left(1\right)+\G\left(-1\right)\big). \] You are now given that \(X\) has a Poisson distribution with mean \(\lambda\). Show that \[ \G(t) = \e^{-\lambda(1-t)} \,. \]

  1. The random variable \(Y\) is defined by \[ \P(Y=r)= \begin{cases} k\P(X=r) & \text{if \(r=0, \ 2, \ 4, \ 6, \ \ldots\) \ }, \\[2mm] 0& \text{otherwise}, \end{cases} \] where \(k\) is an appropriate constant. Show that the probability generating function of \(Y\) is \(\dfrac{\cosh\lambda t}{\cosh\lambda}\,\). Deduce that \(\E(Y) < \lambda\) for \(\lambda > 0\,\).
  2. The random variable \(Z\) is defined by \[\P(Z=r)= \begin{cases} c \P(X=r) & \text{if \(r = 0, \ 4, \ 8, \ 12, \ \ldots \ \)}, \\[2mm] 0& \text{otherwise,} \end{cases} \] where \(c\) is an appropriate constant. Is \(\E(Z) < \lambda\) for all positive values of \(\lambda\,\)?


Solution: \begin{align*} &&G_X(t) &= \mathbb{E}(t^X) \\ &&&= \sum_{k=0}^{\infty} \mathbb{P}(X = k) t^k \\ \Rightarrow && G_X(1) &= \sum_{k=0}^{\infty} \mathbb{P}(X = k) \\ \Rightarrow && G_X(-1) &= \sum_{k=0}^{\infty} (-1)^k\mathbb{P}(X = k) \\ \Rightarrow && \frac12 \l G_X(1) + G_X(-1) \r &= \sum_{k=0}^{\infty} \frac12 (1 + (-1)^k) \mathbb{P}(X = k) \\ &&&= \sum_{k=0}^{\infty} \mathbb{P}(X =2k) \end{align*} since \(\frac12(1+(-1)^k)\) is \(1\) for even \(k\) and \(0\) for odd \(k\). If \(X\) has a Poisson distribution with mean \(\lambda\), then \begin{align*} G_X(t) &= \sum_{k=0}^{\infty} \frac{e^{-\lambda}\lambda^k}{k!}t^k = e^{-\lambda}\sum_{k=0}^{\infty} \frac{(\lambda t)^k}{k!} = e^{-\lambda}e^{\lambda t} = e^{-\lambda(1-t)} \end{align*}

  1. \begin{align*} 1 &= \sum_r \mathbb{P}(Y = r) \\ &= \sum_{j=0}^\infty k \cdot \mathbb{P}(X = 2j) \\ &= k \cdot \frac12 \l e^{-\lambda(1-1) } + e^{-\lambda(1+1) }\r \\ &= \frac{k}{2}(1+e^{-2\lambda}) \end{align*} Therefore \(k = \frac{2}{1+e^{-2\lambda}} = \frac{e^{\lambda}}{\cosh \lambda}\). \begin{align*} && G_X(t) + G_X(-t) &= \sum_{j=0}^\infty \mathbb{P}(X = j)t^j(1^j + (-1)^j) \\ &&&= 2\sum_{j=0}^\infty \mathbb{P}(X = 2j)t^{2j} \\ &&&= 2\sum_{j=0}^\infty \frac{1}{k}\mathbb{P}(Y = 2j)t^{2j} \\ &&&= \frac{2}{k}G_Y(t) \\ \Rightarrow && G_Y(t) &= k \cdot \frac{G_X(t) + G_X(-t)}{2} \\ &&&= k\frac{e^{-\lambda(1-t)} + e^{-\lambda(1+t)}}{2} \\ &&&= \frac{e^\lambda}{\cosh \lambda} \cdot \frac{e^{-\lambda} (e^{\lambda t}+e^{-\lambda t}) }{2} \\ &&&= \frac{\cosh \lambda t}{\cosh \lambda} \end{align*} Since \(\mathbb{E}(Y) = G_Y'(1)\): \begin{align*} && G_Y'(t) &= \frac{\lambda \sinh \lambda t}{\cosh \lambda} \\ \Rightarrow && G_Y'(1) &= \lambda \tanh \lambda \\ &&&< \lambda \end{align*} using \(\tanh x < 1\) for all \(x\), so \(\mathbb{E}(Y) < \lambda\) for \(\lambda > 0\).
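This conclusion can be sketched numerically (\(\lambda\) is an arbitrary positive test value): build \(\mathbb{P}(Y=r)\) directly by conditioning the Poisson pmf on even outcomes and compare the mean with \(\lambda\tanh\lambda\).

```python
import math

# Arbitrary test mean (for illustration only, not from the question).
lam = 2.3

# Poisson pmf, truncated far enough into the tail for double precision.
px = [math.exp(-lam) * lam**r / math.factorial(r) for r in range(120)]

# P(Y = r) = k * P(X = r) on even r, so E(Y) is the even-conditioned mean.
norm = sum(px[r] for r in range(0, 120, 2))        # = 1/k = P(X even)
ey = sum(r * px[r] for r in range(0, 120, 2)) / norm

assert math.isclose(ey, lam * math.tanh(lam), rel_tol=1e-10)
assert ey < lam   # E(Y) < lambda
```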
  2. \begin{align*} && \frac14 \l G_X(t) + G_X(it) +G_X(-t) + G_X(-it) \r &= \sum_{k=0}^\infty \mathbb{P}(X=k)t^k \cdot \frac{1 + i^k + (-1)^k + (-i)^k}{4} \\ &&&= \sum_{k=0}^\infty \mathbb{P}(X = 4k)t^{4k} \\ &&&= \frac{G_Z(t)}{c} \end{align*} (the bracket \(1 + i^k + (-1)^k + (-i)^k\) equals \(4\) when \(4 \mid k\) and \(0\) otherwise). Since \(G_Z(1) = 1\) we must have \(c = \frac1{\frac14 \l G_X(1) + G_X(i) +G_X(-1) + G_X(-i) \r}\): \begin{align*} && c &= \frac{4e^{\lambda}}{e^{\lambda} + e^{-\lambda} + e^{i\lambda} + e^{-i\lambda}} \\ &&&= \frac{2e^{\lambda}}{\cosh \lambda + \cos \lambda} \\ && G_Z(t) &= c \cdot \frac14 \l e^{-\lambda(1-t)}+e^{-\lambda(1-it)}+e^{-\lambda(1+t)}+e^{-\lambda(1+it)} \r \\ &&&= \frac{ce^{-\lambda}}{4} \l 2\cosh \lambda t + 2 \cos \lambda t\r \\ &&&= \frac{\cosh \lambda t + \cos \lambda t}{\cosh \lambda + \cos \lambda} \end{align*} We are interested in \(G_Z'(1)\), so: \begin{align*} && G_Z'(t) &= \frac{\lambda (\sinh \lambda t - \sin \lambda t)}{\cosh \lambda + \cos \lambda } \end{align*} A promising value to try is \(\lambda = \pi\), since then \(\sin \lambda = 0\) and \(\cos \lambda = -1\), which makes the denominator as small as possible. Then: \begin{align*} G'_Z(1) &= \frac{\pi (\sinh \pi-0)}{\cosh \pi-1} \\ &= \frac{\pi}{\tanh \frac{\pi}{2}} > \pi \end{align*} using \(\sinh x = 2 \sinh \frac x2 \cosh \frac x2\) and \(\cosh x - 1 = 2\sinh^2 \frac x2\). So \(\mathbb{E}(Z) > \lambda\) when \(\lambda = \pi\), and the answer is no: \(\mathbb{E}(Z) < \lambda\) does not hold for all positive \(\lambda\).
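A one-line numeric sketch of the counterexample at \(\lambda = \pi\):

```python
import math

# Sketch: E(Z) = G_Z'(1) = lam*(sinh lam - sin lam)/(cosh lam + cos lam).
# At lam = pi this is pi/tanh(pi/2) > pi, so E(Z) < lam fails there.
lam = math.pi
ez = lam * (math.sinh(lam) - math.sin(lam)) / (math.cosh(lam) + math.cos(lam))

assert math.isclose(ez, math.pi / math.tanh(math.pi / 2), rel_tol=1e-9)
assert ez > lam
```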