Problems

2016 Paper 3 Q5
D: 1700.0 B: 1500.0

  1. By considering the binomial expansion of \((1+x)^{2m+1}\), prove that \[ \binom{ 2m \! +\! 1}{ m} < 2^{2m}\,, \] for any positive integer \(m\).
  2. For any positive integers \(r\) and \(s\) with \(r< s\), \(P_{r,s}\) is defined as follows: \(P_{r,s}\) is the product of all the prime numbers greater than \(r\) and less than or equal to \(s\), if there are any such prime numbers; if there are no such prime numbers, then \(P_{r,s}=1\,\). For example, \(P_{3,7}=35\), \(P_{7,10}=1\) and \(P_{14,18}=17\). Show that, for any positive integer \(m\), \(P_{m+1\,,\, 2m+1} \) divides \(\displaystyle \binom{ 2m \! +\! 1}{ m} \,,\) and deduce that \[ P_{m+1\,,\, 2m+1} < 2^{2m} \,. \]
  3. Show that, if \(P_{1,k} < 4^k\) for \(k = 2\), \(3\), \(\ldots\), \(2m\), then \( P_{1,2m+1} < 4^{2m+1}\,\).
  4. Prove that \(P_{1,n} < 4^n\) for \(n\ge2\).


Solution:

  1. Notice that \((1+x)^{2m+1} = 1+\binom{2m+1}{1}x+\cdots + \binom{2m+1}{m}x^{m} + \binom{2m+1}{m+1}x^{m+1} + \cdots\). Notice also that \(\binom{2m+1}{m} = \binom{2m+1}{m+1}\). Therefore, evaluating at \(x = 1\) (where every term is positive), we see \(2^{2m+1} > \binom{2m+1}{m} + \binom{2m+1}{m+1} = 2 \binom{2m+1}{m} \Rightarrow \binom{2m+1}{m} < 2^{2m}\).
  2. Each prime \(p\) dividing \(P_{m+1, 2m+1}\) satisfies \(m+1 < p \leq 2m+1\), so it divides the numerator of \(\binom{2m+1}{m} = \frac{(2m+1)!}{m!\,(m+1)!}\), since it appears as a factor of \((2m+1)!\); it does not divide the denominator, since it is too large to appear as a factor of \(m!\) or \((m+1)!\) and, being prime, cannot arise as a product of smaller factors. Hence each such prime divides \(\binom{2m+1}{m}\), and since these primes are distinct, their product \(P_{m+1,2m+1}\) must divide the binomial coefficient. Since \(a \mid b\) with \(b > 0\) implies \(a \leq b\), we must have \(P_{m+1, 2m+1} \leq \binom{2m+1}{m} < 2^{2m}\).
  3. Since \begin{align*} P_{1,2m+1} &= P_{1,m+1}P_{m+1, 2m+1} \tag{split into primes at most \(m+1\) and above} \\ &< 4^{m+1}P_{m+1,2m+1} \tag{condition from the question, valid as \(m+1\le 2m\)}\\ &<4^{m+1}\cdot 2^{2m} \tag{inequality from part 2} \\ &= 4^{2m+1} \end{align*}
  4. We proceed by strong induction. Base case (\(n = 2\)): \(P_{1,2} = 2 < 4^2 = 16\). Inductive step: suppose the result holds for all \(k=2,3,\ldots,2m\); then it holds for \(k=2m+1\) by the previous part. It also holds for \(k=2m+2\), since \(2m+2\) is an even number greater than \(2\) and hence not prime, so \(P_{1,2m+2} = P_{1,2m+1} < 4^{2m+1} < 4^{2m+2}\). Therefore, by the principle of mathematical induction, the result holds for all \(n \ge 2\).
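
The claims in parts 1 and 2 lend themselves to a quick computational sanity check. Below is a minimal Python sketch (not part of the proof); the helper names `is_prime` and `P` are our own, with `P(r, s)` implementing the definition of \(P_{r,s}\) from the question.

```python
# Sanity check (illustrative, not a proof): for small m, verify that
#   C(2m+1, m) < 2^(2m)   and   P_{m+1,2m+1} divides C(2m+1, m),
# where P(r, s) is the product of all primes p with r < p <= s.
from math import comb, prod

def is_prime(p):
    return p >= 2 and all(p % d for d in range(2, int(p**0.5) + 1))

def P(r, s):
    """Product of all primes p with r < p <= s (1 if there are none)."""
    return prod((p for p in range(r + 1, s + 1) if is_prime(p)), start=1)

for m in range(1, 20):
    c = comb(2 * m + 1, m)
    assert c < 2 ** (2 * m)
    assert c % P(m + 1, 2 * m + 1) == 0
```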

2016 Paper 3 Q6
D: 1700.0 B: 1484.0

Show, by finding \(R\) and \(\gamma\), that \(A \sinh x + B\cosh x \) can be written in the form \(R\cosh (x+\gamma)\) if \(B>A>0\). Determine the corresponding forms in the other cases that arise, for \(A>0\), according to the value of \(B\). Two curves have equations \(y = \sech x\) and \(y = a\tanh x + b\,\), where \(a>0\).

  1. In the case \(b>a\), show that if the curves intersect then the \(x\)-coordinates of the points of intersection can be written in the form \[ \pm\arcosh \left( \frac 1 {\sqrt{b^2-a^2}}\right) - {\rm artanh \,} \frac a b .\]
  2. Find the corresponding result in the case \(a>b>0\,\).
  3. Find necessary and sufficient conditions on \(a\) and \(b\) for the curves to intersect at two distinct points.
  4. Find necessary and sufficient conditions on \(a\) and \(b\) for the curves to touch and, given that they touch, express the \(y\)-coordinate of the point of contact in terms of \(a\).

2016 Paper 3 Q7
D: 1700.0 B: 1516.0

Let \(\omega = \e^{2\pi {\rm i}/n}\), where \(n\) is a positive integer. Show that, for any complex number \(z\), \[ (z-1)(z-\omega) \cdots (z - \omega^{n-1}) = z^n -1\,. \] The points \(X_0\), \(X_1\), \ldots\,, \(X_{n-1}\) lie on a circle with centre \(O\) and radius 1, and are the vertices of a regular polygon.

  1. The point \(P\) is equidistant from \(X_0\) and \(X_1\). Show that, if \(n\) is even, \[ |PX_0| \times |PX_1 |\times \,\cdots\, \times |PX_{n-1}| = |OP|^n +1\, ,\] where \(|PX_ k|\) denotes the distance from \(P\) to \(X_k\). Give the corresponding result when \(n\) is odd. (There are two cases to consider.)
  2. Show that \[ |X_0 X_1|\times |X_0 X_2|\times \,\cdots\, \times |X_0 X_{n-1}| =n\,. \]

2016 Paper 3 Q8
D: 1700.0 B: 1484.0

  1. The function f satisfies, for all \(x\), the equation \[ \f(x) + (1- x)\f(-x) = x^2\, . \] Show that \(\f(-x) + (1 + x)\f(x) = x^2\,\). Hence find \(\f(x)\) in terms of \(x\). You should verify that your function satisfies the original equation.
  2. The function \({\rm K}\) is defined, for \(x\ne 1\), by \[{\rm K}(x) = \dfrac{x+1}{x-1}\,.\] Show that, for \(x\ne1\), \({\rm K}({\rm K}(x)) =x\,\). The function g satisfies the equation \[ \g(x)+ x\, \g\Big(\frac{ x+1 }{x-1}\Big) = x \ \ \ \ \ \ \ \ \ \ \ ( x\ne 1) \,. \] Show that, for \(x\ne1\), \(\g(x)= \dfrac{2x}{x^2+1}\,\).
  3. Find \(\h(x)\), for \(x\ne0\), \(x\ne1\), given that \[ \h(x)+ \h\Big(\frac 1 {1-x}\Big)= 1-x -\frac1{1-x} \ \ \ \ \ \ ( x\ne0, \ \ x\ne1 ) \,. \]

2016 Paper 3 Q9
D: 1700.0 B: 1475.6

Three pegs \(P\), \(Q\) and \(R\) are fixed on a smooth horizontal table in such a way that they form the vertices of an equilateral triangle of side \(2a\). A particle \(X\) of mass \(m\) lies on the table. It is attached to the pegs by three springs, \(PX\), \(QX\) and \(RX\), each of modulus of elasticity \(\lambda\) and natural length \(l\), where \(l < \frac{ \ 2 }{\sqrt3}\, a\). Initially the particle is in equilibrium. Show that the extension in each spring is \(\frac{\ 2}{\sqrt3}\,a -l\,\). The particle is then pulled a small distance directly towards \(P\) and released. Show that the tension \(T\) in the spring \(RX\) is given by \[ T= \frac {\lambda} l \left( \sqrt{\frac {4a^2}3 + \frac{2ax}{\sqrt3} +x^2\; }\; -l\right) , \] where \(x\) is the displacement of \(X\) from its equilibrium position. Show further that the particle performs approximate simple harmonic motion with period \[ 2\pi \sqrt{ \frac{4mla}{3 (4a-\sqrt3 \, l)\lambda } \; }\,. \]
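
The stated period can be cross-checked numerically. The sketch below is illustrative, not a derivation: the sample values of \(a\), \(l\), \(\lambda\) and \(m\) are arbitrary choices (subject to \(l < \frac{2}{\sqrt3}a\)), and the net restoring force on \(X\) is linearised by a central finite difference before comparing with the formula above.

```python
# Numeric cross-check of the approximate SHM period: linearise the net
# force on X about equilibrium and compare with the stated formula.
from math import sqrt, pi

a, l, lam, m = 1.0, 0.9, 2.0, 1.5   # arbitrary sample values; l < 2a/sqrt(3)

def net_force(x):
    """Net force on X (positive towards P) at displacement x towards P."""
    L0 = 2 * a / sqrt(3)                      # centre-to-vertex distance
    T_P = (lam / l) * (L0 - x - l)            # tension in PX, pulls towards P
    L = sqrt(4 * a**2 / 3 + 2 * a * x / sqrt(3) + x**2)  # length of QX and RX
    T = (lam / l) * (L - l)                   # tension in QX and RX (as stated)
    cos_theta = (x + a / sqrt(3)) / L         # component of QX, RX along XP
    return T_P - 2 * T * cos_theta

h = 1e-6
k_eff = -(net_force(h) - net_force(-h)) / (2 * h)   # effective stiffness
period_numeric = 2 * pi * sqrt(m / k_eff)
period_stated = 2 * pi * sqrt(4 * m * l * a / (3 * (4 * a - sqrt(3) * l) * lam))
assert abs(period_numeric - period_stated) < 1e-4 * period_stated
```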

2016 Paper 3 Q10
D: 1700.0 B: 1484.0

A smooth plane is inclined at an angle \(\alpha\) to the horizontal. A particle \(P\) of mass \(m\) is attached to a fixed point \(A\) above the plane by a light inextensible string of length \(a\). The particle rests in equilibrium on the plane, and the string makes an angle \(\beta\) with the plane. The particle is given a horizontal impulse parallel to the plane so that it has an initial speed of \(u\). Show that the particle will not immediately leave the plane if \(ag\cos(\alpha + \beta)> u^2 \tan\beta\). Show further that a necessary condition for the particle to perform a complete circle whilst in contact with the plane is \(6\tan\alpha \tan \beta < 1\).

2016 Paper 3 Q11
D: 1700.0 B: 1484.0

A car of mass \(m\) travels along a straight horizontal road with its engine working at a constant rate \(P\). The resistance to its motion is such that the acceleration of the car is zero when it is moving with speed \(4U\).

  1. Given that the resistance is proportional to the car's speed, show that the distance \(X_1\) travelled by the car while it accelerates from speed \(U\) to speed \(2U\) is given by \[ \lambda X_1 = 2\ln \tfrac 9 5 - 1 \,, \] where \(\lambda= P/(16mU^3)\).
  2. Given instead that the resistance is proportional to the square of the car's speed, show that the distance \(X_2\) travelled by the car while it accelerates from speed \(U\) to speed \(2U\) is given by \[ \lambda X_2 = \tfrac43 \ln \tfrac 98 \,. \]
  3. Given that \(3.17<\ln 24 < 3.18\) and \(1.60<\ln 5 < 1.61\), determine which is the larger of \(X_1\) and \(X_2\).
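
Since \(\lambda = P/(16mU^3)\) is the same positive constant in both parts, comparing \(\lambda X_1\) with \(\lambda X_2\) settles part 3. A quick numeric check (this does not replace the required argument from the given logarithm bounds):

```python
# Compare lambda*X1 = 2 ln(9/5) - 1 with lambda*X2 = (4/3) ln(9/8);
# the same positive lambda multiplies both, so the comparison carries over.
from math import log

lam_X1 = 2 * log(9 / 5) - 1       # part 1
lam_X2 = (4 / 3) * log(9 / 8)     # part 2
larger = "X1" if lam_X1 > lam_X2 else "X2"
```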

2016 Paper 3 Q12
D: 1700.0 B: 1516.0

Let \(X\) be a random variable with mean \(\mu\) and standard deviation \(\sigma\). Chebyshev's inequality, which you may use without proof, is \[ \P\left(\vert X-\mu\vert > k\sigma\right) \le \frac 1 {k^2} \,, \] where \(k\) is any positive number.

  1. The probability of a biased coin landing heads up is \(0.2\). It is thrown \(100n\) times, where \(n\) is an integer greater than 1. Let \(\alpha \) be the probability that the coin lands heads up \(N\) times, where \(16n \le N \le 24n\). Use Chebyshev's inequality to show that \[ \alpha \ge 1-\frac 1n \,. \]
  2. Use Chebyshev's inequality to show that \[ 1+ n + \frac{n^2}{ 2!} + \cdots + \frac {n^{2n}}{(2n)!} \ge \left(1-\frac1n\right) \e^n \,. \]


Solution:

  1. Let \(N\) be the number of times the coin lands heads up, i.e. \(N \sim \mathrm{Binomial}(100n, 0.2)\); then \(\mathbb{E}(N) = \mu = 20n\) and \(\mathrm{Var}(N) = \sigma^2 = 100n \cdot 0.2 \cdot 0.8 = 16n \Rightarrow \sigma = 4\sqrt{n}\). Taking \(k = \sqrt n\) in Chebyshev's inequality, \begin{align*} && \mathbb{P}(|N - \mu| > k\sigma) &\leq \frac{1}{k^2} \\ \Rightarrow && 1 - \mathbb{P}(|N - \mu| \leq k\sigma) &\leq \frac1{k^2} \\ \Rightarrow && 1 - \mathbb{P}(|N - 20n| \leq \sqrt{n} \cdot 4\sqrt{n}) &\leq \frac1{(\sqrt{n})^2} \\ \Rightarrow && 1 - \mathbb{P}(16n \leq N \leq 24n) &\leq \frac{1}{n} \\ \Rightarrow && \alpha &\geq 1 - \frac1n \end{align*}
  2. Let \(X \sim \mathrm{Poisson}(n)\), so that \(\mathbb{E}(X) = n\) and \(\mathrm{Var}(X) = n\). Taking \(k = \sqrt{n}\), and noting that \(|X - n| \leq n \iff 0 \leq X \leq 2n\) (with \(X\) non-negative), we have \begin{align*} && \mathbb{P}(|X - \mu| > k\sigma) &\leq \frac{1}{k^2} \\ \Rightarrow && 1-\mathbb{P}(|X - n| \leq \sqrt{n} \cdot \sqrt{n}) &\leq \frac{1}{(\sqrt{n})^2} \\ \Rightarrow && 1 - \sum_{i=0}^{2n} \mathbb{P}(X = i) &\leq \frac{1}{n} \\ \Rightarrow && \sum_{i=0}^{2n} e^{-n} \frac{n^i}{i!} &\geq 1 - \frac{1}{n} \\ \Rightarrow && \sum_{i=0}^{2n} \frac{n^i}{i!} &\geq \left( 1 - \frac1n \right)e^n \end{align*}
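
The deduced inequality can be spot-checked numerically for a range of \(n\). A minimal Python sketch (the helper name `partial_exp_sum` is our own):

```python
# Spot-check sum_{i=0}^{2n} n^i/i! >= (1 - 1/n) e^n for several n.
from math import exp

def partial_exp_sum(n):
    """sum_{i=0}^{2n} n^i / i!, accumulated term by term."""
    total, term = 1.0, 1.0
    for i in range(1, 2 * n + 1):
        term *= n / i
        total += term
    return total

for n in range(2, 30):
    assert partial_exp_sum(n) >= (1 - 1 / n) * exp(n)
```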

2016 Paper 3 Q13
D: 1700.0 B: 1500.0

Given a random variable \(X\) with mean \(\mu\) and standard deviation \(\sigma\), we define the kurtosis, \(\kappa\), of \(X\) by \[ \kappa = \frac{ \E\big((X-\mu)^4\big)}{\sigma^4} -3 \,. \] Show that the random variable \(X-a\), where \(a\) is a constant, has the same kurtosis as \(X\).

  1. Show by integration that a random variable which is Normally distributed with mean 0 has kurtosis 0.
  2. Let \(Y_1, Y_2, \ldots, Y_n\) be \(n\) independent, identically distributed, random variables with mean 0, and let \(T = \sum\limits_{r=1}^n Y_r\). Show that \[ \E(T^4) = \sum_{r=1}^n \E(Y_r^4) + 6 \sum_{r=1}^{n-1} \sum_{s=r+1}^{n} \E(Y^2_s) \E(Y^2_r) \,. \]
  3. Let \(X_1\), \(X_2\), \(\ldots\)\,, \(X_n\) be \(n\) independent, identically distributed, random variables each with kurtosis \(\kappa\). Show that the kurtosis of their sum is \(\dfrac\kappa n\,\).


Solution: Note that \(\mathbb{E}(X-a) = \mu - a\) and \(\sigma_{X-a} = \sigma_X\), since shifting by a constant does not change the variance. Hence \begin{align*} &&\kappa_{X-a} &= \frac{\mathbb{E}\left(\left(X-a-(\mu-a)\right)^4\right)}{\sigma_{X-a}^4}-3 \\ &&&= \frac{\mathbb{E}\left(\left(X-\mu\right)^4\right)}{\sigma_X^4}-3\\ &&&= \kappa_X \end{align*}

  1. \(\,\) \begin{align*} && \kappa &= \frac{\mathbb{E}((X-\mu)^4)}{\sigma^4} - 3 \\ &&&= \frac{\mathbb{E}((\mu+\sigma Z-\mu)^4)}{\sigma^4} - 3 \\ &&&= \frac{\mathbb{E}((\sigma Z)^4)}{\sigma^4} - 3 \\ &&&= \mathbb{E}(Z^4)-3\\ &&&= \int_{-\infty}^{\infty} x^4\frac{1}{\sqrt{2\pi}} \exp \left ( - \frac12x^2 \right)\d x -3 \\ &&&= \left [\frac{1}{\sqrt{2\pi}}x^{3} \cdot \left ( -\exp \left ( - \frac12x^2 \right)\right) \right]_{-\infty}^{\infty} + \frac{1}{\sqrt{2\pi}} \int_{-\infty}^\infty 3x^2 \exp \left ( - \frac12x^2 \right) \d x - 3 \\ &&&= 0 + 3 \textrm{Var}(Z) - 3 =0 \end{align*}
  2. \(\,\) Expand by the multinomial theorem and group terms by exponent pattern, with \(i\), \(j\), \(k\), \(l\) distinct throughout: \begin{align*} && \mathbb{E}(T^4) &= \mathbb{E} \left[\left( \sum\limits_{r=1}^n Y_r\right)^4\right] \\ &&&= \mathbb{E} \left[ \sum_{r=1}^n Y_r^4+4\sum_{i\neq j} Y_iY_j^3+6\sum_{i<j} Y_i^2Y_j^2+12\sum_{i<j,\,k\neq i,j} Y_iY_jY_k^2 +24\sum_{i<j<k<l} Y_iY_jY_kY_l\right] \\ &&&= \sum_{r=1}^n \mathbb{E}\left[Y_r^4\right]+4\sum_{i\neq j} \mathbb{E}[Y_i]\,\mathbb{E}[Y_j^3]+6\sum_{i<j} \mathbb{E}[Y_i^2]\,\mathbb{E}[Y_j^2]+12\sum_{i<j,\,k\neq i,j} \mathbb{E}[Y_i]\,\mathbb{E}[Y_j]\,\mathbb{E}[Y_k^2] +24\sum_{i<j<k<l} \mathbb{E}[Y_i]\,\mathbb{E}[Y_j]\,\mathbb{E}[Y_k]\,\mathbb{E}[Y_l] \\ &&&= \sum_{r=1}^n \mathbb{E}\left[Y_r^4\right]+6\sum_{i<j} \mathbb{E}\left[Y_i^2\right]\mathbb{E}\left[Y_j^2\right] \end{align*} where the middle step uses independence and the last step uses \(\mathbb{E}(Y_i) = 0\); the remaining sum over \(i<j\) is exactly \(\sum_{r=1}^{n-1}\sum_{s=r+1}^{n}\).
  3. Without loss of generality, we may assume the \(X_i\) all have mean zero, since subtracting the mean changes neither the kurtosis (shown above) nor the kurtosis of the sum. We can therefore apply the previous part with \(Y_i = X_i\). Note that \(\mathbb{E}(X_i^4) = \sigma^4(\kappa + 3)\) and \(\textrm{Var}(T) = n \sigma^2\). \begin{align*} && \kappa_T &= \frac{\mathbb{E}(T^4)}{(\textrm{Var}(T))^2} - 3 \\ &&&= \frac{\sum_{r=1}^n \mathbb{E} \left[ X_r^4 \right]+6\sum_{i<j} \mathbb{E} \left[ X_i^2\right]\mathbb{E}\left[X_j^2\right]}{n^2\sigma^4}-3 \\ &&&= \frac{n\sigma^4(\kappa+3)+6\binom{n}{2}\sigma^4}{n^2\sigma^4} -3\\ &&&= \frac{\kappa}{n} + \frac{3n + \frac{6n(n-1)}{2}}{n^2} - 3 \\ &&&= \frac{\kappa}{n} + \frac{3n^2}{n^2}-3 \\ &&&= \frac{\kappa}{n} \end{align*}
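
Parts 2 and 3 admit an exact check on a concrete distribution. The sketch below is illustrative; the choice of Rademacher variables \(Y_i = \pm 1\) (probability \(\frac12\) each, so mean \(0\), \(\sigma = 1\), and \(\kappa = \mathbb{E}(Y^4) - 3 = -2\)) is ours, and rational arithmetic keeps the verification exact.

```python
# Exact check with Rademacher Y_i = ±1: the part-2 identity gives
# E(T^4) = n·E(Y^4) + 6·C(n,2)·E(Y^2)^2 = n + 3n(n-1), and part 3
# predicts kurtosis(T) = kappa/n = -2/n.
from fractions import Fraction
from math import comb

def fourth_moment_of_sum(n):
    """Exact E(T^4) for T = Y_1 + ... + Y_n: T = 2K - n, K ~ Binomial(n, 1/2)."""
    return sum(Fraction(comb(n, k), 2**n) * (2 * k - n) ** 4
               for k in range(n + 1))

for n in (1, 2, 5, 10):
    ET4 = fourth_moment_of_sum(n)
    assert ET4 == n + 3 * n * (n - 1)            # part 2 identity
    assert ET4 / Fraction(n**2) - 3 == Fraction(-2, n)  # Var(T) = n, part 3
```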