A particle \(P_1\) is projected with speed \(V\) at an angle of elevation \(\alpha\,\,(\alpha > 45^{\circ})\) from a point in a horizontal plane. Find \(T_1\), the flight time of \(P_1\), in terms of \(\alpha\), \(V\) and \(g\). Show that the time after projection at which the direction of motion of \(P_1\) first makes an angle of \(45^{\circ}\) with the horizontal is \(\frac12 (1-\cot \alpha)T_1\,\). A particle \(P_2\) is projected under the same conditions. When the direction of the motion of \(P_2\) first makes an angle of \(45^{\circ}\) with the horizontal, the speed of \(P_2\) is instantaneously doubled. If \(T_2\) is the total flight time of \(P_2\), show that $$ \frac{2T_2}{T_1} = 1+\cot{\alpha} +\sqrt{1+3\cot^2{\alpha}} \;. $$
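No worked solution is given for this question, but the stated identity can be checked numerically by following the kinematics directly: find the instant at which the velocity direction first reaches \(45^{\circ}\), double the speed there, and solve the remaining parabolic flight. The function names below are illustrative, and standard projectile motion (constant \(g\), no resistance) is assumed.

```python
import math

def flight_ratio(alpha, V=10.0, g=9.81):
    """Compute 2*T2/T1 by direct kinematics for elevation alpha (radians)."""
    s, c = math.sin(alpha), math.cos(alpha)
    T1 = 2 * V * s / g                     # standard flight time of P1
    t_star = V * (s - c) / g               # vy = vx here, i.e. direction is 45 degrees
    y_star = V * s * t_star - 0.5 * g * t_star**2   # height at that instant
    # speed doubled along the 45-degree direction: both components become 2*V*cos(alpha)
    vy2 = 2 * V * c
    # remaining time until landing: solve y_star + vy2*t - (g/2)*t^2 = 0 for t > 0
    t_rem = (vy2 + math.sqrt(vy2**2 + 2 * g * y_star)) / g
    return 2 * (t_star + t_rem) / T1

def closed_form(alpha):
    """The claimed value 1 + cot(alpha) + sqrt(1 + 3*cot(alpha)^2)."""
    cot = 1 / math.tan(alpha)
    return 1 + cot + math.sqrt(1 + 3 * cot**2)

for deg in (50, 60, 75, 89):
    a = math.radians(deg)
    assert abs(flight_ratio(a) - closed_form(a)) < 1e-9
```

The check also confirms in passing that \(t^\ast = V(\sin\alpha - \cos\alpha)/g = \frac12(1-\cot\alpha)T_1\), since the ratio agrees for every \(\alpha > 45^{\circ}\) tried.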
The life of a certain species of elementary particles can be described as follows. Each particle has a life time of \(T\) seconds, after which it disintegrates into \(X\) particles of the same species, where \(X\) is a random variable with binomial distribution \(\mathrm{B}(2,p)\,\). A population of these particles starts with the creation of a single such particle at \(t=0\,\). Let \(X_n\) be the number of particles in existence in the time interval \(nT < t < (n+1)T\,\), where \(n=1\,\), \(2\,\), \(\ldots\). Show that \(\P(X_1=2 \mbox { and } X_2=2) = 6p^4q^2\;\), where \(q=1-p\,\). Find the possible values of \(p\) if it is known that \(\P(X_1=2 \vert X_2=2) =9/25\,\). Explain briefly why \(\E(X_n) =2p\E(X_{n-1})\) and hence determine \(\E(X_n)\) in terms of \(p\). Show that for one of the values of \(p\) found above \(\lim_{n \to \infty}\E(X_n) = 0\) and that for the other \(\lim_{n \to \infty}\E(X_n) = + \infty\,\).
Solution: Conditional on \(X_{n-1}\), the next generation satisfies \(X_n \sim B(2X_{n-1},p)\), since a binomial is a sum of independent Bernoullis and each particle contributes two Bernoulli trials. \begin{align*} && \mathbb{P}(X_1=2 \mbox { and } X_2=2) &= \underbrace{p^2}_{\text{two generated in first iteration}} \cdot \underbrace{\binom{4}{2}p^2q^2}_{\text{two generated from the first two}} \\ &&&= 6p^4q^2 \end{align*} For the denominator below, condition on \(X_1\): the event \(X_2=2\) requires either \(X_1=2\) (probability \(6p^4q^2\), as above) or \(X_1=1\) with both of that particle's offspring created (probability \(2pq \cdot p^2\)). \begin{align*} && \mathbb{P}(X_1 = 2 \,|\, X_2 = 2) &= \frac{ \mathbb{P}(X_1=2 \mbox { and } X_2=2) }{ \mathbb{P}( X_2=2) } \\ &&&= \frac{6p^4q^2}{6p^4q^2+2pq \cdot p^2} \\ &&&= \frac{3pq}{3pq+1} \\ \Rightarrow && \frac{9}{25} &= \frac{3pq}{3pq+1} \\ \Rightarrow && 27pq + 9 &= 75pq \\ \Rightarrow && 9 &= 48pq \\ \Rightarrow && pq &= \frac{3}{16} \\ \Rightarrow && 0 &= p^2 - p + \frac3{16} \\ \Rightarrow && p &= \frac14 \ \text{or} \ \frac34 \end{align*} By the same reasoning about the Bernoullis, \(\E[X_n] = \E[\E[X_n \,|\, X_{n-1}]] = \E[2pX_{n-1}] = 2p \E[X_{n-1}]\), and therefore \(\E[X_n] = (2p)^n\). If \(p = \frac14\) then \(\E[X_n] = \frac1{2^n} \to 0\); if \(p = \frac34\) then \(\E[X_n] = \left(\frac32 \right)^n \to \infty\).
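The branching computation above can be verified exactly by summing binomial probabilities: \(X_1 \sim B(2,p)\), and given \(X_1 = j\), \(X_2 \sim B(2j,p)\). A minimal sketch (function names are illustrative) confirming both the joint probability and that each root \(p = \frac14, \frac34\) gives a conditional probability of \(9/25\):

```python
from math import comb

def binom_pmf(k, n, p):
    """P(B(n, p) = k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def p_x1_and_x2_equal_2(p):
    """P(X1 = 2 and X2 = 2): X1 ~ B(2,p); given X1 = 2, X2 ~ B(4,p)."""
    return binom_pmf(2, 2, p) * binom_pmf(2, 4, p)

def p_x2_equal_2(p):
    # law of total probability over X1 in {1, 2} (X1 = 0 forces X2 = 0)
    return sum(binom_pmf(j, 2, p) * binom_pmf(2, 2 * j, p) for j in (1, 2))

for p in (0.25, 0.75):
    q = 1 - p
    assert abs(p_x1_and_x2_equal_2(p) - 6 * p**4 * q**2) < 1e-12
    assert abs(p_x1_and_x2_equal_2(p) / p_x2_equal_2(p) - 9 / 25) < 1e-12
```

Both roots pass because the conditional probability depends on \(p\) only through the product \(pq = \frac{3}{16}\), which is symmetric under \(p \leftrightarrow q\).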
The random variable \(X\) takes the values \(k=1\), \(2\), \(3\), \(\dotsc\), and has probability distribution $$ \P(X=k)= A{{{\lambda}^k\e^{-{\lambda}}} \over {k!}}\,, $$ where \(\lambda \) is a positive constant. Show that \(A = (1-\e^{-\lambda})^{-1}\,\). Find the mean \({\mu}\) in terms of \({\lambda}\) and show that $$ \var(X) = {\mu}(1-{\mu}+{\lambda})\;. $$ Deduce that \({\lambda} < {\mu} < 1+{\lambda}\,\). Use a normal approximation to find the value of \(P(X={\lambda})\) in the case where \({\lambda}=100\,\), giving your answer to 2 decimal places.
Solution: Let \(Y \sim Po(\lambda)\). \begin{align*} && 1 &= \sum_{k=1}^\infty \mathbb{P}(X = k ) \\ &&&= \sum_{k=1}^\infty A \frac{\lambda^k e^{-\lambda}}{k!}\\ &&&= Ae^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^k}{k!} \\ &&&= Ae^{-\lambda} \left (e^{\lambda}-1 \right) \\ \Rightarrow && A &= (1-e^{-\lambda})^{-1} \\ \\ && \E[X] &= \sum_{k=1}^{\infty} k \cdot \mathbb{P}(X=k) \\ &&&= A\sum_{k=1}^{\infty} k \frac{\lambda^k e^{-\lambda}}{k!} \\ &&&= A\E[Y] = A\lambda = \lambda(1-e^{-\lambda})^{-1} \\ \\ && \var[X] &= \E[X^2] - (\E[X])^2 \\ &&&= A\sum_{k=1}^{\infty} k^2 \frac{\lambda^k e^{-\lambda}}{k!} - \mu^2 \\ &&&= A\E[Y^2] - \mu^2 \\ &&&= A(\var[Y]+\lambda^2) - \mu^2 \\ &&&= A(\lambda + \lambda^2) - \mu^2 \\ &&&= A\lambda(1+\lambda) - \mu^2 \\ &&&= \mu(1+\lambda - \mu) \end{align*} Since \(A > 1\) we must have \(\mu > \lambda\), and since \(\var[X] > 0\) we must have \(1 + \lambda > \mu\), as required. If \(\lambda = 100\) then \(A = (1-e^{-100})^{-1} \approx 1\), so \(\mathbb{P}(X=\lambda) \approx \mathbb{P}(Y = \lambda)\); approximating \(Y\) by \(N(\lambda, \lambda)\) with a continuity correction gives $$ \mathbb{P}(Y=\lambda) \approx \int_{\lambda - \frac12}^{\lambda + \frac12} \frac{1}{\sqrt{2\pi \lambda}} e^{-\frac{(x-\lambda)^2}{2\lambda}} \d x \approx \frac{1}{\sqrt{2\pi\lambda}} = \frac{1}{\sqrt{200\pi}} = \frac{1}{\sqrt{628.3\ldots}} \approx 0.04 \;. $$
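The normal approximation can be compared against the exact zero-truncated Poisson probability, computed in log space to avoid overflow in \(100^{100}/100!\). A small sketch (names are illustrative), assuming only the standard library:

```python
import math

def zero_truncated_poisson_pmf(k, lam):
    """P(X = k) = lam^k e^{-lam} / (k! (1 - e^{-lam})), evaluated via logs."""
    log_p = k * math.log(lam) - lam - math.lgamma(k + 1)
    return math.exp(log_p) / (1 - math.exp(-lam))

lam = 100
exact = zero_truncated_poisson_pmf(lam, lam)
approx = 1 / math.sqrt(2 * math.pi * lam)  # normal density at the mean, width-1 strip
assert round(exact, 2) == 0.04
assert round(approx, 2) == 0.04
```

Both the exact value (about 0.0399) and the approximation agree to 2 decimal places, confirming the answer \(0.04\).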
The probability of throwing a 6 with a biased die is \(p\,\). It is known that \(p\) is equal to one or other of the numbers \(A\) and \(B\) where \(0 < A < B < 1 \,\). Accordingly the following statistical test of the hypothesis \(H_0: \,p=B\) against the alternative hypothesis \(H_1: \,p=A\) is performed. The die is thrown repeatedly until a 6 is obtained. Then if \(X\) is the total number of throws, \(H_0\) is accepted if \(X \le M\,\), where \(M\) is a given positive integer; otherwise \(H_1\) is accepted. Let \({\alpha}\) be the probability that \(H_1\) is accepted if \(H_0\) is true, and let \({\beta}\) be the probability that \(H_0\) is accepted if \(H_1\) is true. Show that \({\beta} = 1- {\alpha}^K,\) where \(K\) is independent of \(M\) and is to be determined in terms of \(A\) and \(B\,\). Sketch the graph of \({\beta}\) against \({\alpha}\,\).
Solution: \(X \sim Geo(p)\), and \(X > M\) exactly when the first \(M\) throws all fail to show a 6. \(\alpha = \mathbb{P}(X > M | p = B) = (1-B)^{M}\) \(\beta = \mathbb{P}(X \leq M | p = A) = 1 - \mathbb{P}(X > M | p = A) = 1 - (1-A)^{M}\) \begin{align*} \ln \alpha &= M \ln(1-B) \\ \ln (1-\beta) &= M \ln(1-A) \\ \frac{\ln \alpha}{\ln (1-\beta)} &= \frac{\ln(1-B)}{\ln(1-A)} \\ \ln(1-\beta) &= \ln \alpha \frac{\ln (1-A)}{\ln(1-B)} \\ \beta &= 1- \alpha^{ \frac{\ln (1-A)}{\ln(1-B)} } \end{align*} and \(K = \frac{\ln (1-A)}{\ln(1-B)} \), which does not depend on \(M\). Since \(0 < A < B < 1\) we must have \(0 < 1 - B < 1-A < 1\) and \(\ln(1-B) < \ln(1-A) < 0\), so \(0 < K < 1\). For the sketch: with \(0 < K < 1\), the curve \(\beta = 1 - \alpha^K\) decreases from \((0,1)\) to \((1,0)\), is convex, has a vertical tangent at \(\alpha = 0\), and lies on or below the line \(\beta = 1 - \alpha\) because \(\alpha^K \geq \alpha\) on \([0,1]\).
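The relation \(\beta = 1 - \alpha^K\) can be sanity-checked numerically for a range of decision thresholds \(M\), with \(K\) computed once from \(A\) and \(B\). A minimal sketch, assuming illustrative values \(A = 0.1\), \(B = 0.4\):

```python
import math

def error_probs(A, B, M):
    """alpha = P(accept H1 | H0 true), beta = P(accept H0 | H1 true)."""
    alpha = (1 - B) ** M       # first M throws all miss under p = B
    beta = 1 - (1 - A) ** M    # a six within the first M throws under p = A
    return alpha, beta

A, B = 0.1, 0.4
K = math.log(1 - A) / math.log(1 - B)
assert 0 < K < 1               # guaranteed by 0 < A < B < 1
for M in range(1, 30):
    alpha, beta = error_probs(A, B, M)
    assert abs(beta - (1 - alpha ** K)) < 1e-12
```

Every choice of \(M\) lands on the same curve \(\beta = 1 - \alpha^K\), which is exactly the point of the result: the trade-off between the two error probabilities is fixed by \(A\) and \(B\) alone.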