Let \(X\) and \(Y\) be independent standard normal random variables: the probability density function, \(\mathrm{f}\), of each is therefore given by \[ \mathrm{f}(x)=\left(2\pi\right)^{-\frac{1}{2}}\mathrm{e}^{-\frac{1}{2}x^{2}}. \]
Solution:
An industrial process produces rectangular plates of mean length \(\mu_{1}\) and mean breadth \(\mu_{2}\). The length and breadth vary independently with non-zero standard deviations \(\sigma_{1}\) and \(\sigma_{2}\) respectively. Find the means and standard deviations of the perimeter and of the area of the plates. Show that the perimeter and area are not independent.
Solution: Let \(L\) and \(B\) denote the length and breadth of a plate; they are independent, with \(\mathbb{E}[L]=\mu_1\), \(\mathrm{Var}[L]=\sigma_1^2\), \(\mathbb{E}[B]=\mu_2\), \(\mathrm{Var}[B]=\sigma_2^2\) (no assumption about their distributions is needed). The perimeter is \(P=2(L+B)\) and the area is \(A=LB\).
\begin{align*}
\mathbb{E}[P] &= \mathbb{E}[2(L+B)] = 2\mathbb{E}[L]+2\mathbb{E}[B] = 2(\mu_1+\mu_2) \\
\mathrm{Var}[P] &= \mathbb{E}\!\left[(2(L+B))^2\right] - \left(\mathbb{E}[2(L+B)]\right)^2 \\
&= 4\,\mathbb{E}[L^2+2LB+B^2] - 4(\mu_1+\mu_2)^2 \\
&= 4(\sigma_1^2+\mu_1^2+2\mu_1\mu_2+\sigma_2^2+\mu_2^2) - 4(\mu_1+\mu_2)^2 \\
&= 4(\sigma_1^2+\sigma_2^2) \\
\mathrm{sd}[P] &= 2\sqrt{\sigma_1^2+\sigma_2^2} \\[1ex]
\mathbb{E}[A] &= \mathbb{E}[LB] = \mathbb{E}[L]\,\mathbb{E}[B] = \mu_1\mu_2 \\
\mathrm{Var}[A] &= \mathbb{E}[(LB)^2] - \left(\mathbb{E}[LB]\right)^2 \\
&= \mathbb{E}[L^2]\,\mathbb{E}[B^2] - \mu_1^2\mu_2^2 \\
&= (\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2) - \mu_1^2\mu_2^2 \\
&= \sigma_1^2\mu_2^2 + \sigma_2^2\mu_1^2 + \sigma_1^2\sigma_2^2 \\
\mathrm{sd}[A] &= \sqrt{\sigma_1^2\mu_2^2 + \sigma_2^2\mu_1^2 + \sigma_1^2\sigma_2^2} \\[1ex]
\mathbb{E}[PA] &= \mathbb{E}[2(L+B)LB] = 2\mathbb{E}[L^2]\,\mathbb{E}[B] + 2\mathbb{E}[L]\,\mathbb{E}[B^2] \\
&= 2(\sigma_1^2+\mu_1^2)\mu_2 + 2(\sigma_2^2+\mu_2^2)\mu_1 \\
\mathbb{E}[P]\,\mathbb{E}[A] &= 2(\mu_1+\mu_2)\mu_1\mu_2
\end{align*}
Hence \(\mathbb{E}[PA]-\mathbb{E}[P]\,\mathbb{E}[A] = 2\sigma_1^2\mu_2 + 2\sigma_2^2\mu_1 > 0\), since \(\sigma_1,\sigma_2\) are non-zero and \(\mu_1,\mu_2\) are positive. If the perimeter and area were independent we would have \(\mathbb{E}[PA]=\mathbb{E}[P]\,\mathbb{E}[A]\), so they are not independent. [See also STEP 2006 Paper 3 Q14]
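As a quick sanity check of these formulas (not part of the original solution), here is a short Monte Carlo sketch; the normal distributions and parameter values are assumptions chosen purely for illustration, since the derivation uses only the means, variances and independence.
\begin{verbatim}
import numpy as np

# Monte Carlo check of the perimeter/area formulas.
# Normal distributions and parameter values are illustrative assumptions only.
rng = np.random.default_rng(0)
mu1, sigma1 = 3.0, 0.5   # mean and sd of length (assumed)
mu2, sigma2 = 2.0, 0.3   # mean and sd of breadth (assumed)

L = rng.normal(mu1, sigma1, size=1_000_000)
B = rng.normal(mu2, sigma2, size=1_000_000)
P, A = 2 * (L + B), L * B

print(P.mean(), 2 * (mu1 + mu2))                     # E[perimeter]
print(P.std(),  2 * np.sqrt(sigma1**2 + sigma2**2))  # sd[perimeter]
print(A.mean(), mu1 * mu2)                           # E[area]
print(A.std(),  np.sqrt(sigma1**2 * mu2**2 + sigma2**2 * mu1**2
                        + sigma1**2 * sigma2**2))    # sd[area]
# Cov(P, A) is approximately 2*sigma1^2*mu2 + 2*sigma2^2*mu1 > 0,
# consistent with perimeter and area not being independent.
print(np.cov(P, A)[0, 1], 2 * sigma1**2 * mu2 + 2 * sigma2**2 * mu1)
\end{verbatim}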
I have a Penny Black stamp which I want to sell to my friend Jim, but we cannot agree a price. So I put the stamp under one of two cups, jumble them up, and let Jim guess which one it is under. If he guesses correctly, I add a third cup, jumble them up, and let Jim guess again, adding another cup each time he guesses correctly. The price he pays for the stamp is \(\pounds N,\) where \(N\) is the number of cups present when Jim first fails to guess correctly. Find \(\mathrm{P}(N=k)\). Show that \(\mathrm{E}(N)=\mathrm{e}\) and calculate \(\mathrm{Var}(N).\)
Solution:
\begin{align*}
\mathbb{P}(N = k) &= \mathbb{P}(\text{correct with }2,3,\ldots,k-1\text{ cups, then wrong with }k\text{ cups}) \\
&= \frac12 \cdot \frac{1}{3} \cdots \frac{1}{k-1} \cdot \frac{k-1}{k} \\
&= \frac{k-1}{k!} \qquad (k \geq 2) \\
\mathbb{E}(N) &= \sum_{k=2}^\infty k \, \mathbb{P}(N=k) = \sum_{k=2}^{\infty} \frac{k(k-1)}{k!} = \sum_{k=2}^{\infty} \frac{1}{(k-2)!} = \sum_{j=0}^{\infty} \frac{1}{j!} = e \\
\mathbb{E}(N^2) &= \sum_{k=2}^{\infty} k^2 \, \mathbb{P}(N=k) = \sum_{k=2}^{\infty} \frac{k^2(k-1)}{k!} = \sum_{k=2}^{\infty} \frac{k}{(k-2)!} = \sum_{j=0}^{\infty} \frac{j+2}{j!} \\
&= \sum_{j=1}^{\infty} \frac{1}{(j-1)!} + 2\sum_{j=0}^{\infty} \frac{1}{j!} = e + 2e = 3e \\
\mathrm{Var}(N) &= \mathbb{E}(N^2) - \mathbb{E}(N)^2 = 3e - e^2
\end{align*}
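A brief simulation of the cup game (my addition, not part of the solution) can be used to check \(\mathbb{E}(N)\approx e\) and \(\mathrm{Var}(N)\approx 3e-e^2\); the number of trials is an arbitrary choice.
\begin{verbatim}
import math
import random

# Simulate the cup game: start with 2 cups, add one after each correct guess,
# and record the number of cups present at the first wrong guess.
random.seed(0)

def play_once():
    cups = 2
    while random.randrange(cups) == 0:  # correct guess with probability 1/cups
        cups += 1
    return cups

samples = [play_once() for _ in range(1_000_000)]
mean = sum(samples) / len(samples)
var = sum(n * n for n in samples) / len(samples) - mean ** 2
print(mean, math.e)                   # expect roughly 2.718
print(var, 3 * math.e - math.e ** 2)  # expect roughly 0.766
\end{verbatim}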
Let \(X\) be a random variable which takes only the finite number of different possible real values \(x_{1},x_{2},\ldots,x_{n}.\) Define the expectation \(\mathbb{E}(X)\) and the variance \(\mathrm{Var}(X)\) of \(X\). Show that, if \(a\) and \(b\) are real numbers, then \(\mathbb{E}(aX+b)=a\mathbb{E}(X)+b\) and express \(\mathrm{Var}(aX+b)\) similarly in terms of \(\mathrm{Var}(X)\). Let \(\lambda\) be a positive real number. By considering the contribution to \(\mathrm{Var}(X)\) of those \(x_{i}\) for which \(\left|x_{i}-\mathbb{E}(X)\right|\geqslant\lambda,\) or otherwise, show that \[ \mathrm{P}\left(\left|X-\mathbb{E}(X)\right|\geqslant\lambda\right)\leqslant\frac{\mathrm{Var}(X)}{\lambda^{2}}\,. \] Let \(k\) be a real number satisfying \(k\geqslant\lambda.\) If \(\left|x_{i}-\mathbb{E}(X)\right|\leqslant k\) for all \(i\), show that \[ \mathrm{P}\left(\left|X-\mathbb{E}(X)\right|\geqslant\lambda\right)\geqslant\frac{\mathrm{Var}(X)-\lambda^{2}}{k^{2}-\lambda^{2}}\,. \]
Solution: Definition: \(\displaystyle \mathbb{E}(X) = \sum_{i=1}^n x_i \mathbb{P}(X = x_i)\). Definition: \(\displaystyle \mathrm{Var}(X) = \sum_{i=1}^n (x_i-\mathbb{E}(X))^2 \mathbb{P}(X = x_i)\).

Claim: \(\mathbb{E}(aX+b) = a\mathbb{E}(X)+b\). Proof:
\begin{align*}
\mathbb{E}(aX+b) &= \sum_{i=1}^n (ax_i+b)\, \mathbb{P}(X = x_i) \\
&= a\sum_{i=1}^n x_i\, \mathbb{P}(X = x_i) + b\sum_{i=1}^n \mathbb{P}(X = x_i) \\
&= a\,\mathbb{E}(X) + b
\end{align*}

Claim: \(\mathrm{Var}(aX+b) = a^2\,\mathrm{Var}(X)\). Proof:
\begin{align*}
\mathrm{Var}(aX+b) &= \sum_{i=1}^n \bigl(ax_i+b-\mathbb{E}(aX+b)\bigr)^2\, \mathbb{P}(X = x_i) \\
&= \sum_{i=1}^n a^2\bigl(x_i-\mathbb{E}(X)\bigr)^2\, \mathbb{P}(X = x_i) = a^2\,\mathrm{Var}(X)
\end{align*}

Claim: \(\mathrm{P}\left(\left|X-\mathbb{E}(X)\right|\geqslant\lambda\right)\leqslant\dfrac{\mathrm{Var}(X)}{\lambda^{2}}\). Proof:
\begin{align*}
\mathrm{Var}(X) &= \sum_{i=1}^n (x_i-\mathbb{E}(X))^2\, \mathbb{P}(X = x_i) \\
&\geq \sum_{|x_i - \mathbb{E}(X)| \geq \lambda} (x_i-\mathbb{E}(X))^2\, \mathbb{P}(X = x_i) \\
&\geq \sum_{|x_i - \mathbb{E}(X)| \geq \lambda} \lambda^2\, \mathbb{P}(X = x_i) \\
&= \lambda^2 \sum_{|x_i - \mathbb{E}(X)| \geq \lambda} \mathbb{P}(X = x_i) \\
&= \lambda^2\, \mathrm{P}\left(\left|X-\mathbb{E}(X)\right|\geqslant\lambda\right)
\end{align*}

Claim: \[ \mathrm{P}\left(\left|X-\mathbb{E}(X)\right|\geqslant\lambda\right)\geqslant\frac{\mathrm{Var}(X)-\lambda^{2}}{k^{2}-\lambda^{2}}\,. \] Proof: Using \(|x_i-\mathbb{E}(X)|\leq k\) for all \(i\),
\begin{align*}
\mathrm{Var}(X) &= \sum_{i=1}^n (x_i-\mathbb{E}(X))^2\, \mathbb{P}(X = x_i) \\
&= \sum_{|x_i - \mathbb{E}(X)| \geq \lambda} (x_i-\mathbb{E}(X))^2\, \mathbb{P}(X = x_i) + \sum_{|x_i - \mathbb{E}(X)| < \lambda} (x_i-\mathbb{E}(X))^2\, \mathbb{P}(X = x_i) \\
&\leq \sum_{|x_i - \mathbb{E}(X)| \geq \lambda} k^2\, \mathbb{P}(X = x_i) + \sum_{|x_i - \mathbb{E}(X)| < \lambda} \lambda^2\, \mathbb{P}(X = x_i) \\
&= k^2\, \mathbb{P}\left(\left|X-\mathbb{E}(X)\right|\geqslant\lambda\right) + \lambda^2\, \mathbb{P}\left(\left|X-\mathbb{E}(X)\right| < \lambda\right) \\
&= k^2\, \mathbb{P}\left(\left|X-\mathbb{E}(X)\right|\geqslant\lambda\right) + \lambda^2\bigl(1- \mathbb{P}\left(\left|X-\mathbb{E}(X)\right| \geq \lambda\right)\bigr) \\
&= (k^2 - \lambda^2)\, \mathbb{P}\left(\left|X-\mathbb{E}(X)\right|\geqslant\lambda\right) + \lambda^2
\end{align*}
Rearranging gives \(\displaystyle \mathbb{P}\left(\left|X-\mathbb{E}(X)\right|\geqslant\lambda\right) \geq \frac{\mathrm{Var}(X)-\lambda^2}{k^2 - \lambda^2}\). [Note: The first of these bounds is known as Chebyshev's inequality, and it is an important starting point for understanding the tail behaviour of random variables.]
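The two bounds can also be illustrated numerically (an addition of mine, using an arbitrary finite distribution; note that whichever bound is non-trivial depends on whether \(\mathrm{Var}(X)\) is smaller or larger than \(\lambda^2\)).
\begin{verbatim}
# Numerical illustration of both bounds for an arbitrary finite distribution
# (the values xs, ps, lam below are assumptions chosen for illustration).
xs = [-2.0, -1.0, 0.0, 1.0, 3.0]
ps = [0.1, 0.2, 0.4, 0.2, 0.1]
lam = 1.5

mean = sum(x * p for x, p in zip(xs, ps))
var = sum((x - mean) ** 2 * p for x, p in zip(xs, ps))
k = max(abs(x - mean) for x in xs)   # so |x_i - E(X)| <= k for all i

tail = sum(p for x, p in zip(xs, ps) if abs(x - mean) >= lam)
print((var - lam ** 2) / (k ** 2 - lam ** 2))  # lower bound
print(tail)                                    # P(|X - E(X)| >= lambda)
print(var / lam ** 2)                          # Chebyshev upper bound
\end{verbatim}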
Suppose \(X\) is a random variable with probability density \[ \mathrm{f}(x)=Ax^{2}\exp(-x^{2}/2) \] for \(-\infty < x < \infty.\) Find \(A\). You belong to a group of scientists who believe that the outcome of a certain experiment is a random variable with the probability density just given, while other scientists believe that the probability density is the same except with different mean (i.e. the probability density is \(\mathrm{f}(x-\mu)\) with \(\mu\neq0\)). In each of the following two cases decide whether the result given would shake your faith in your hypothesis, and justify your answer.
Solution: Let \(Z \sim N(0,1)\), whose pdf is \(\phi(x) = \frac{1}{\sqrt{2\pi}} \exp(-x^2/2)\). Then
\begin{align*}
1 &= \int_{-\infty}^\infty Ax^2 \exp(-x^2/2)\, \mathrm{d}x \\
&= A\sqrt{2\pi} \int_{-\infty}^\infty x^2\, \phi(x)\, \mathrm{d}x \\
&= A\sqrt{2\pi}\, \mathbb{E}[Z^2] = A\sqrt{2\pi} \\
\Rightarrow\quad A &= \frac{1}{\sqrt{2\pi}}
\end{align*}
since \(\mathbb{E}[Z^2] = \mathrm{Var}(Z) + \mathbb{E}[Z]^2 = 1\).
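A quick numerical check (my addition) that this value of \(A\) normalises the density; the integration range and step count are arbitrary choices, the tails beyond \([-10,10]\) being negligible.
\begin{verbatim}
import math

# Midpoint Riemann sum of A * x^2 * exp(-x^2 / 2) over [-10, 10].
A = 1 / math.sqrt(2 * math.pi)
n, lo, hi = 200_000, -10.0, 10.0
dx = (hi - lo) / n
total = 0.0
for i in range(n):
    x = lo + (i + 0.5) * dx
    total += A * x * x * math.exp(-x * x / 2) * dx
print(total)  # approximately 1.0
\end{verbatim}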
When Septimus Moneybags throws darts at a dart board, they are certain to land on the board (a disc of radius \(a\)) but, it must be admitted, are otherwise uniformly distributed over the board.
Solution:
The continuous random variable \(X\) is uniformly distributed over the interval \([-c,c].\) Write down expressions for the probabilities that:
Solution:
Balls are chosen at random without replacement from an urn originally containing \(m\) red balls and \(M-m\) green balls. Find the probability that exactly \(k\) red balls will be chosen in \(n\) choices \((0\leqslant k\leqslant m,0\leqslant n\leqslant M).\) The random variables \(X_{i}\) \((i=1,2,\ldots,n)\) are defined for \(n\leqslant M\) by \[ X_{i}=\begin{cases} 0 & \mbox{ if the \(i\)th ball chosen is green}\\ 1 & \mbox{ if the \(i\)th ball chosen is red. } \end{cases} \] Show that
Solution: There are \(\displaystyle \binom{m}{k} \binom{M-m}{n-k}\) ways to choose \(k\) red and \(n-k\) green balls, out of a total of \(\displaystyle \binom{M}{n}\) ways to choose \(n\) balls. Therefore the probability is: \[ \mathbb{P}(\text{exactly }k\text{ red balls in }n\text{ choices}) = \frac{\binom{m}{k} \binom{M-m}{n-k}}{ \binom{M}{n}}\]
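This is the hypergeometric probability; a short Monte Carlo comparison (my addition, with arbitrary values of \(M\), \(m\), \(n\), \(k\)) is sketched below.
\begin{verbatim}
import math
import random

# Compare the formula with a simulation of drawing n balls without replacement.
random.seed(0)
M, m, n, k = 20, 8, 6, 3             # arbitrary illustrative values
urn = ["red"] * m + ["green"] * (M - m)

trials = 200_000
hits = sum(random.sample(urn, n).count("red") == k for _ in range(trials))

formula = math.comb(m, k) * math.comb(M - m, n - k) / math.comb(M, n)
print(hits / trials, formula)        # the two values should be close
\end{verbatim}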