A car of mass \(m\) travels along a straight horizontal road with
its engine working at a
constant rate \(P\). The resistance to its motion
is such that
the acceleration of the car is zero
when it is moving with speed
\(4U\).
Given that the resistance is proportional to the car's speed,
show that
the distance~\(X_1\) travelled by the car
while
it accelerates from speed \(U\) to speed \(2U\),
is given by
\[
\lambda X_1 = 2\ln \tfrac 9 5 - 1
\,,
\]
where \(\lambda= P/(16mU^3)\).
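One possible route (a sketch; the resistance constant \(k\) below is introduced for working): with resistance \(kv\), zero acceleration at speed \(4U\) gives \(4Uk = P/(4U)\), so \(k = P/(16U^2)\). Writing the acceleration as \(v\,\frac{dv}{dx}\), Newton's second law gives
\[
mv\frac{dv}{dx} = \frac Pv - \frac{Pv}{16U^2}\,,
\]
so
\[
X_1 = \frac mP\int_U^{2U} \frac{16U^2 v^2}{16U^2 - v^2}\,dv
= \frac{16mU^2}P\int_U^{2U} \left(\frac{2U}{4U-v} + \frac{2U}{4U+v} - 1\right)dv\,,
\]
and evaluating the integral gives \(X_1 = \dfrac{16mU^3}P\left(2\ln\tfrac 95 - 1\right)\), as required.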
Given instead that the resistance is proportional to the square
of the car's speed, show that
the distance \(X_2\) travelled by the car while it accelerates from
speed \(U\) to speed \(2U\)
is given by
\[
\lambda X_2 = \tfrac43 \ln \tfrac 98
\,.
\]
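A similar sketch for the second model: with resistance \(kv^2\), zero acceleration at speed \(4U\) gives \(16U^2k = P/(4U)\), so \(k = P/(64U^3)\). Then
\[
mv\frac{dv}{dx} = \frac Pv - \frac{Pv^2}{64U^3}\,,
\qquad
X_2 = \frac{64mU^3}P \int_U^{2U} \frac{v^2}{64U^3 - v^3}\,dv
= \frac{64mU^3}{3P}\Big[-\ln\big(64U^3 - v^3\big)\Big]_U^{2U}
= \frac{64mU^3}{3P}\ln\frac{63}{56}\,,
\]
and since \(\frac{63}{56} = \frac 98\), this gives \(\lambda X_2 = \frac43\ln\frac 98\).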
Given that \(3.17<\ln 24 < 3.18\) and \(1.60<\ln 5 < 1.61\),
determine which is the larger of
\(X_1\) and \(X_2\).
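One way to use the given bounds (a sketch): since
\[
\lambda(X_1 - X_2) = 2\ln\tfrac 95 - 1 - \tfrac43\ln\tfrac 98
= \tfrac43\ln 3 + 4\ln 2 - 2\ln 5 - 1
= \tfrac43\ln 24 - 2\ln 5 - 1\,,
\]
the bounds give \(\lambda(X_1 - X_2) > \tfrac43(3.17) - 2(1.61) - 1 = \tfrac{12.68}{3} - 4.22 > 0\), so \(X_1\) is the larger.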
Let \(X\) be a random variable with mean \(\mu\) and standard deviation
\(\sigma\). Chebyshev's inequality, which you may use without proof,
is
\[
\P\left(\vert X-\mu\vert > k\sigma\right) \le \frac 1 {k^2}
\,,
\]
where \(k\) is any positive number.
The probability of a biased coin landing heads up is \(0.2\). It is thrown \(100n\) times, where \(n\) is an integer greater than 1. Let \(\alpha \) be the probability that the coin lands heads up \(N\) times, where \(16n \le N \le 24n\).
Use Chebyshev's inequality to show that
\[
\alpha \ge 1-\frac 1n
\,.
\]
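A sketch: \(N\) has the binomial distribution \(\mathrm B(100n,\,0.2)\), so \(\mu = 20n\) and \(\sigma^2 = 100n \times 0.2 \times 0.8 = 16n\), giving \(\sigma = 4\sqrt n\). The event \(16n \le N \le 24n\) is exactly \(\vert N - 20n\vert \le 4n = \sqrt n\,\sigma\), so taking \(k = \sqrt n\) in Chebyshev's inequality,
\[
\alpha = 1 - \P\left(\vert N - 20n\vert > \sqrt n\,\sigma\right) \ge 1 - \frac 1n\,.
\]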
Use Chebyshev's inequality to show that
\[
1+ n + \frac{n^2}{ 2!} + \cdots + \frac {n^{2n}}{(2n)!} \ge
\left(1-\frac1n\right) \e^n
\,.
\]
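A sketch, using the Poisson distribution: let \(X\) have the Poisson distribution with mean \(n\), so that \(\mu = n\) and \(\sigma = \sqrt n\). Taking \(k = \sqrt n\) gives \(\P(\vert X - n\vert > n) \le \frac1n\), and since \(X \ge 0\) the event \(X \le 2n\) is exactly \(\vert X - n\vert \le n\), so \(\P(X \le 2n) \ge 1 - \frac1n\). But
\[
\P(X \le 2n) = \e^{-n}\left(1 + n + \frac{n^2}{2!} + \cdots + \frac{n^{2n}}{(2n)!}\right),
\]
and multiplying through by \(\e^n\) gives the result.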
Given a random variable \(X\) with mean \(\mu\) and standard deviation \(\sigma\), we define the kurtosis, \(\kappa\), of \(X\) by
\[
\kappa = \frac{ \E\big((X-\mu)^4\big)}{\sigma^4} -3 \,.
\]
Show that the random variable \(X-a\), where \(a\) is a constant, has the same kurtosis as \(X\).
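A sketch: \(X - a\) has mean \(\mu - a\) and the same standard deviation \(\sigma\), so its kurtosis is
\[
\frac{\E\big(((X-a)-(\mu-a))^4\big)}{\sigma^4} - 3 = \frac{\E\big((X-\mu)^4\big)}{\sigma^4} - 3 = \kappa\,.
\]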
Show by integration that a random variable which
is Normally distributed with mean 0 has kurtosis 0.
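A sketch of the integration: if \(X\) is Normal with mean 0 and variance \(\sigma^2\), then integrating by parts (with parts \(x^3\) and \(x\e^{-x^2/(2\sigma^2)}\)),
\[
\E(X^4) = \frac1{\sigma\sqrt{2\pi}}\int_{-\infty}^{\infty} x^4 \e^{-x^2/(2\sigma^2)}\,dx
= \frac{3\sigma^2}{\sigma\sqrt{2\pi}}\int_{-\infty}^{\infty} x^2 \e^{-x^2/(2\sigma^2)}\,dx
= 3\sigma^2\,\E(X^2) = 3\sigma^4\,,
\]
so \(\kappa = 3\sigma^4/\sigma^4 - 3 = 0\).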
Let \(Y_1, Y_2, \ldots, Y_n\) be \(n\) independent, identically distributed, random variables with mean 0, and let \(T = \sum\limits_{r=1}^n Y_r\). Show that
\[
\E(T^4) = \sum_{r=1}^n \E(Y_r^4) +
6 \sum_{r=1}^{n-1} \sum_{s=r+1}^{n} \E(Y_r^2)\E(Y_s^2)
\,.
\]
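A sketch: expanding \(T^4 = \big(\sum_r Y_r\big)^4\) and taking expectations, independence and \(\E(Y_r) = 0\) kill every term in which some index appears exactly once: for distinct \(r\), \(s\), \(t\), \(u\),
\[
\E(Y_r^3 Y_s) = \E(Y_r^3)\E(Y_s) = 0\,,\qquad
\E(Y_r^2 Y_s Y_t) = 0\,,\qquad
\E(Y_r Y_s Y_t Y_u) = 0\,.
\]
The surviving terms are the \(\E(Y_r^4)\) and, for each pair \(r < s\), the terms in \(Y_r^2 Y_s^2\), which carry the multinomial coefficient \(\frac{4!}{2!\,2!} = 6\); this gives the stated formula.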
Let \(X_1\), \(X_2\), \(\ldots\)\,, \(X_n\) be \(n\) independent, identically distributed, random variables each with kurtosis \(\kappa\). Show that the kurtosis of their sum is \(\dfrac\kappa n\,\).
By the first part, subtracting a constant does not change the kurtosis, so without loss of generality we may assume the \(X_i\) all have mean zero. We can then apply the previous result with \(Y_r = X_r\) and \(T = X_1 + X_2 + \cdots + X_n\). Note that \(\E(Y_r^4) = \sigma^4(\kappa + 3)\) and, by independence, \(\textrm{Var}(T) = n\sigma^2\). Hence
\begin{align*}
\kappa_T &= \frac{\E(T^4)}{\big(\textrm{Var}(T)\big)^2} - 3 \\
&= \frac{\sum_{r=1}^n \E\big(Y_r^4\big) + 6\sum_{r<s} \E\big(Y_r^2\big)\E\big(Y_s^2\big)}{n^2\sigma^4} - 3 \\
&= \frac{n\sigma^4(\kappa+3) + 6\binom n2 \sigma^4}{n^2\sigma^4} - 3 \\
&= \frac\kappa n + \frac{3n + 3n(n-1)}{n^2} - 3 \\
&= \frac\kappa n + \frac{3n^2}{n^2} - 3 \\
&= \frac\kappa n\,.
\end{align*}