Problems


2018 Paper 3 Q13
D: 1700.0 B: 1484.0

The random variable \(X\) takes only non-negative integer values and has probability generating function \(\G(t)\). Show that \[ \P(X = 0 \text{ or } 2 \text{ or } 4 \text{ or } 6 \ \ldots ) = \frac{1}{2}\big(\G\left(1\right)+\G\left(-1\right)\big). \] You are now given that \(X\) has a Poisson distribution with mean \(\lambda\). Show that \[ \G(t) = \e^{-\lambda(1-t)} \,. \]

  1. The random variable \(Y\) is defined by \[ \P(Y=r)= \begin{cases} k\P(X=r) & \text{if \(r=0, \ 2, \ 4, \ 6, \ \ldots\) \ }, \\[2mm] 0& \text{otherwise}, \end{cases} \] where \(k\) is an appropriate constant. Show that the probability generating function of \(Y\) is \(\dfrac{\cosh\lambda t}{\cosh\lambda}\,\). Deduce that \(\E(Y) < \lambda\) for \(\lambda > 0\,\).
  2. The random variable \(Z\) is defined by \[\P(Z=r)= \begin{cases} c \P(X=r) & \text{if \(r = 0, \ 4, \ 8, \ 12, \ \ldots \ \)}, \\[2mm] 0& \text{otherwise,} \end{cases} \] where \(c\) is an appropriate constant. Is \(\E(Z) < \lambda\) for all positive values of \(\lambda\,\)?


Solution: \begin{align*} && G_X(t) &= \mathbb{E}(t^X) \\ &&&= \sum_{k=0}^{\infty} \mathbb{P}(X = k) t^k \\ \Rightarrow && G_X(1) &= \sum_{k=0}^{\infty} \mathbb{P}(X = k) \\ \Rightarrow && G_X(-1) &= \sum_{k=0}^{\infty} (-1)^k\mathbb{P}(X = k) \\ \Rightarrow && \frac12 \big(G_X(1) + G_X(-1)\big) &= \sum_{k=0}^{\infty} \frac12 \big(1 + (-1)^k\big) \mathbb{P}(X = k) \\ &&&= \sum_{k=0}^{\infty} \mathbb{P}(X =2k) \end{align*} which is exactly \(\P(X = 0 \text{ or } 2 \text{ or } 4 \text{ or } \ldots)\). For the Poisson distribution, \(G_X(t) = \sum_{k=0}^{\infty} e^{-\lambda}\frac{\lambda^k}{k!}\,t^k = e^{-\lambda}e^{\lambda t} = e^{-\lambda(1-t)}\).

  1. \begin{align*} 1 &= \sum_r \mathbb{P}(Y = r) \\ &= k\sum_{j=0}^\infty \mathbb{P}(X = 2j) \\ &= k \cdot \frac12 \left( e^{-\lambda(1-1)} + e^{-\lambda(1+1)}\right) \\ &= \frac{k}{2}\left(1+e^{-2\lambda}\right) \end{align*} Therefore \(k = \frac{2}{1+e^{-2\lambda}} = \frac{2e^{\lambda}}{e^{\lambda}+e^{-\lambda}} = \frac{e^{\lambda}}{\cosh \lambda}\). Now \begin{align*} && G_X(t) + G_X(-t) &= \sum_{j=0}^\infty \mathbb{P}(X = j)\,t^j\big(1 + (-1)^j\big) \\ &&&= 2\sum_{j=0}^\infty \mathbb{P}(X = 2j)\,t^{2j} \\ &&&= \frac{2}{k}\sum_{j=0}^\infty \mathbb{P}(Y = 2j)\,t^{2j} \\ &&&= \frac{2}{k}\,G_Y(t) \\ \Rightarrow && G_Y(t) &= k \cdot \frac{G_X(t) + G_X(-t)}{2} \\ &&&= k\,\frac{e^{-\lambda(1-t)} + e^{-\lambda(1+t)}}{2} \\ &&&= \frac{e^\lambda}{\cosh \lambda} \cdot \frac{e^{-\lambda} \left(e^{\lambda t}+e^{-\lambda t}\right) }{2} \\ &&&= \frac{\cosh \lambda t}{\cosh \lambda} \end{align*} Since \(\mathbb{E}(Y) = G_Y'(1)\) and \begin{align*} && G_Y'(t) &= \frac{\lambda \sinh \lambda t}{\cosh \lambda} \\ \Rightarrow && G_Y'(1) &= \lambda \tanh \lambda \\ &&&< \lambda \end{align*} since \(\tanh x < 1\) for all \(x > 0\), we have \(\E(Y) < \lambda\) for \(\lambda > 0\).
  2. \begin{align*} && \frac14 \left( G_X(t) + G_X(it) +G_X(-t) + G_X(-it) \right) &= \sum_{j=0}^\infty \mathbb{P}(X=j)\,t^j \cdot \tfrac14 \big(1 + i^j + (-1)^j + (-i)^j\big) \\ &&&= \sum_{j=0}^\infty \mathbb{P}(X = 4j)\,t^{4j} \\ &&&= \frac{G_Z(t)}{c} \end{align*} since \(\tfrac14\big(1 + i^j + (-1)^j + (-i)^j\big)\) equals \(1\) when \(4 \mid j\) and \(0\) otherwise. Since \(G_Z(1) = 1\) we must have \(c = \dfrac{1}{\frac14 \left( G_X(1) + G_X(i) +G_X(-1) + G_X(-i) \right)}\), so \begin{align*} && c &= \frac{4e^{\lambda}}{e^{\lambda} + e^{-\lambda} + e^{i\lambda} + e^{-i\lambda}} \\ &&&= \frac{2e^{\lambda}}{\cosh \lambda + \cos \lambda} \\ && G_Z(t) &= c \cdot \frac14 \left( e^{-\lambda(1-t)}+e^{-\lambda(1-it)}+e^{-\lambda(1+t)}+e^{-\lambda(1+it)} \right) \\ &&&= \frac{ce^{-\lambda}}{4} \left( 2\cosh \lambda t + 2 \cos \lambda t\right) \\ &&&= \frac{\cosh \lambda t + \cos \lambda t}{\cosh \lambda + \cos \lambda} \end{align*} We are interested in \(G_Z'(1)\), so: \begin{align*} && G_Z'(t) &= \frac{\lambda (\sinh \lambda t - \sin \lambda t)}{\cosh \lambda + \cos \lambda } \end{align*} Considering various values of \(\lambda\), it makes sense to look at \(\lambda = \pi\) (since \(\cos \pi = -1\) makes the denominator small). Then, using \(\sinh \pi = 2\sinh\frac{\pi}{2}\cosh\frac{\pi}{2}\) and \(\cosh \pi - 1 = 2\sinh^2\frac{\pi}{2}\): \begin{align*} G'_Z(1) &= \frac{\pi (\sinh \pi-0)}{\cosh \pi-1} \\ &= \frac{\pi}{\tanh \frac{\pi}{2}} > \pi \end{align*} So \(\E(Z) > \lambda\) when \(\lambda = \pi\), and the answer is no: \(\E(Z) < \lambda\) fails for some positive values of \(\lambda\).
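As a numerical sanity check (not part of the required argument), the closed forms \(\E(Y) = \lambda\tanh\lambda\) and \(\E(Z) = G_Z'(1)\) can be compared against expectations computed directly from the Poisson pmf; a Python sketch, with an arbitrary truncation point for the tail:

```python
import math

lam = math.pi  # the interesting case from part 2
# Poisson(lam) pmf, truncated where the tail is negligible
pois = [math.exp(-lam) * lam**r / math.factorial(r) for r in range(120)]

# Y conditions X on even values; Z conditions X on multiples of 4
py = [p if r % 2 == 0 else 0.0 for r, p in enumerate(pois)]
pz = [p if r % 4 == 0 else 0.0 for r, p in enumerate(pois)]
EY = sum(r * p for r, p in enumerate(py)) / sum(py)
EZ = sum(r * p for r, p in enumerate(pz)) / sum(pz)

# closed forms from the solution above
EY_closed = lam * math.tanh(lam)
EZ_closed = lam * (math.sinh(lam) - math.sin(lam)) / (math.cosh(lam) + math.cos(lam))
```

At \(\lambda = \pi\) the direct computation agrees with the closed forms, with \(\E(Y) < \pi\) but \(\E(Z) > \pi\).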

2015 Paper 3 Q12
D: 1700.0 B: 1500.0

A 6-sided fair die has the numbers 1, 2, 3, 4, 5, 6 on its faces. The die is thrown \(n\) times, the outcome (the number on the top face) of each throw being independent of the outcome of any other throw. The random variable \(S_n\) is the sum of the outcomes.

  1. The random variable~\(R_n\) is the remainder when \(S_n\) is divided by 6. Write down the probability generating function, \(\G(x)\), of \(R_1\) and show that the probability generating function of \(R_2\) is also \(\G(x)\). Use a generating function to find the probability that \(S_n\) is divisible by 6.
  2. The random variable \(T_n\) is the remainder when \(S_n\) is divided by 5. Write down the probability generating function, \(\G_1(x)\), of \(T_1\) and show that \(\G_2(x)\), the probability generating function of \(T_2\), is given by \[ {\rm G}_2(x) = \tfrac 1 {36} (x^2 +7y) \] where \(y= 1+x+x^2+x^3+x^4\,\). Obtain the probability generating function of \(T_n\) and hence show that the probability that \(S_n\) is divisible by \(5\) is \[ \frac15\left(1- \frac1 {6^n}\right) \] if \(n\) is not divisible by 5. What is the corresponding probability if \(n\) is divisible by 5?


Solution:

  1. \(G(x) = \frac{1}{6} (1 + x + x^2 + x^3 + x^4 + x^5)\), since one throw leaves each remainder \(0, 1, \ldots, 5\) equally likely. Collecting \(S_2 = 2, 3, \ldots, 12\) by remainder, the pgf for \(R_2\) is: \begin{align*} \frac1{36}x^2 + \frac{2}{36}x^3 + \frac{3}{36}x^4 + \frac{4}{36}x^5 + \frac{5}{36} +\\ \quad \quad + \frac{6}{36}x^1 + \frac{5}{36}x^2 + \frac4{36}x^3 + \frac3{36}x^4 + \frac{2}{36}x^5 + \frac{1}{36} \\ = \frac{1}{6}(1 + x + x^2 + x^3 + x^4 + x^5) = G(x) \end{align*} Since the remainder after two throws has the same distribution as the remainder after one, by induction the remainder after \(n\) throws also has this distribution: the pgf of \(R_n\) is \(G(x)\), and the probability that \(S_n\) is divisible by \(6\) is the coefficient of \(x^0\), namely \(\frac16\).
  2. \(G_1(x) = \frac{1}{6} + \frac{1}{3}x + \frac{1}{6}x^2 + \frac16x^3+ \frac16x^4 = \frac16(1 + 2x+x^2+x^3+x^4)\). If \(G_n\) is the probability generating function for \(T_n\), then we can obtain \(G_n\) by multiplying \(G_{n-1}\) by \(G(x)\) and replacing every power \(x^m\) with \(m \ge 5\) by \(x^{m \bmod 5}\) (or equivalently, working in \(\mathbb{R}[x]/(x^5-1)\)). If \(y = 1 + x + x^2 + x^3 + x^4\) then: \begin{align*} xy &= x + x^2 + x^3 + x^4 +x^5 \\ &= x + x^2 + x^3 + x^4 + 1 \\ &= y \\ \\ y^2 &= (1 + x+x^2 + x^3+x^4)^2 \\ &= 1 + 2x + 3x^2 + 4x^3+5x^4+4x^5+3x^6 + 2x^7 + x^8 \\ &= (1+4) + (2+3)x+(3+2)x^2 + (4+1)x^3 + 5x^4 \\ &= 5y \end{align*} Hence \begin{align*} \frac{1}{36}(y+x)(y+x) &= \frac1{36}(y^2 + 2xy + x^2) \\ &= \frac1{36}(5y + 2y + x^2 ) \\ &= \frac1{36}(7y + x^2) \end{align*} Similarly, using \(y^i = 5^{i-1}y\) and \(x^m y = y\), \begin{align*} G_n(x) &= \left(\frac{1}{6}(x+y) \right)^n \\ &= \frac1{6^n} \left( \sum_{i=0}^n \binom{n}{i} y^ix^{n-i} \right) \\ &= \frac1{6^n} \left( \sum_{i=1}^n \binom{n}{i} y^i + x^n \right) \\ &= \frac1{6^n} \left( \sum_{i=1}^n \binom{n}{i} 5^{i-1}y + x^n \right) \\ &= \frac1{6^n} \left( \frac{1}{5}y\big((5+1)^n-1\big) + x^n \right) \\ &= \frac1{6^n} \left( \frac{1}{5}y(6^n-1) + x^n \right) \end{align*} If \(n \not\equiv 0 \pmod{5}\) then \(x^n \equiv x^{n \bmod 5}\) contributes nothing to the constant term, so the probability that \(T_n = 0\) is the constant coefficient, obtained by setting \(x = 0\): \[\frac1{6^n} \cdot \frac{1}{5}(6^n-1) = \frac{1}{5} \left( 1- \frac{1}{6^n} \right) \] When \(n \equiv 0 \pmod{5}\) we have \(x^n \equiv 1\), and the constant coefficient is: \[\frac1{6^n} \left( \frac{1}{5}(6^n-1) + 1 \right) = \frac{1}{5} \left( 1+ \frac{4}{6^n} \right) \]
Note: this whole question can be considered a "roots-of-unity" filter in disguise. Our computations in \(\mathbb{R}[x]/(x^5 - 1)\) mirror computations with a primitive \(5\)th root of unity \(\omega\): evaluation at \(\omega\) gives a ring homomorphism \(\mathbb{R}[x]/(x^5 - 1) \to \mathbb{C}\) sending \(x \mapsto \omega\).
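The stated probabilities can be checked exactly by convolving the remainder distribution of the running sum; a small Python sketch using exact rationals (the helper name `remainder_dist` is mine):

```python
from fractions import Fraction

def remainder_dist(n, m):
    """Exact distribution of S_n mod m for n throws of a fair six-sided die."""
    dist = [Fraction(0)] * m
    dist[0] = Fraction(1)
    for _ in range(n):
        new = [Fraction(0)] * m
        for r, pr in enumerate(dist):
            for face in range(1, 7):           # faces 1..6, probability 1/6 each
                new[(r + face) % m] += pr * Fraction(1, 6)
        dist = new
    return dist

# probability that S_n is divisible by 5, for a few n
p0 = {n: remainder_dist(n, 5)[0] for n in range(1, 11)}
```

The values agree with \(\frac15\left(1-\frac1{6^n}\right)\) for \(n\) not divisible by \(5\) and \(\frac15\left(1+\frac4{6^n}\right)\) otherwise.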

2014 Paper 3 Q13
D: 1700.0 B: 1500.0

I play a game which has repeated rounds. Before the first round, my score is \(0\). Each round can have three outcomes:

  1. my score is unchanged and the game ends;
  2. my score is unchanged and I continue to the next round;
  3. my score is increased by one and I continue to the next round.
The probabilities of these outcomes are \(a\), \(b\) and \(c\), respectively (the same in each round), where \(a+b+c=1\) and \(abc\ne0\). The random variable \(N\) represents my score at the end of a randomly chosen game. Let \(G(t)\) be the probability generating function of \(N\).
  1. Suppose in the first round, the game ends. Show that the probability generating function conditional on this happening is 1.
  2. Suppose in the first round, the game continues to the next round with no change in score. Show that the probability generating function conditional on this happening is \(G(t)\).
  3. By comparing the coefficients of \(t^n\), show that \[ G(t) = a + bG(t) + ctG(t)\,. \] Deduce that, for \(n\ge0\), \[ P(N=n) = \frac{ac^n}{(1-b)^{n+1}}\,. \]
  4. Show further that, for \(n\ge0\), \[ P(N=n) = \frac{\mu^n}{(1+\mu)^{n+1}}\,, \] where \(\mu=\E(N)\).


Solution:

  1. If the game ends in the first round then the score is exactly \(0\), so the conditional pgf is \(1\cdot t^0 = 1\).
  2. If the game continues to the next round with no change in score, then the game effectively restarts from the same state, so the conditional pgf is the original pgf \(G(t)\).
  3. If the game continues to the next round with the score increased by one, the conditional pgf is \(tG(t)\), since every score is increased by \(1\). Therefore \begin{align*} && G(t) &= \E[t^N] \\ &&&= \E\big[\E[t^N \mid \text{first round}]\big] \\ &&&= a + bG(t) + ctG(t) \\ \Rightarrow && G(t)(1-b-ct) &= a \\ \Rightarrow && G(t) &= \frac{a}{(1-b)-ct} \\ &&&= \frac{a}{(1-b)} \cdot \frac{1}{1- \left(\frac{c}{1-b}\right)t} \\ &&&= \sum_{n=0}^\infty \frac{a}{1-b} \cdot \frac{c^n}{(1-b)^n} t^n\\ &&&= \sum_{n=0}^{\infty} \frac{ac^n}{(1-b)^{n+1}}t^n \end{align*} Therefore by comparing coefficients, \(\mathbb{P}(N=n) = \frac{ac^n}{(1-b)^{n+1}}\).
  4. \(\,\) \begin{align*} && \E[N] &= G'(1) \\ &&&= \frac{ac}{((1-b)-c)^2} \\ &&&= \frac{ac}{a^2} = \frac{c}{a} \\ \\ && \frac{\mu^n}{(1+\mu)^{n+1}} &= \frac{c^na^{-n}}{(a+c)^{n+1}a^{-n-1}} \\ &&&= \frac{ac^n}{(a+c)^{n+1}}\\ &&&= \frac{ac^n}{(1-b)^{n+1}}\\ &&&= \mathbb{P}(N=n) \end{align*} as required
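The closed form for \(\P(N=n)\) can also be checked against a direct sum over the total number of rounds: a game scoring \(n\) takes some \(m \ge n\) rounds with \(n\) scoring rounds and \(m-n\) non-scoring continues, followed by the ending round. A Python sketch with illustrative parameter values (the truncation bound `M` is arbitrary):

```python
from math import comb

a, b, c = 0.2, 0.5, 0.3   # illustrative values with a + b + c = 1, abc != 0
M = 400                    # truncation for the sum over total rounds

def p_direct(n):
    # n scoring rounds among m continuing rounds, then the game ends
    return a * c**n * sum(comb(m, n) * b**(m - n) for m in range(n, M))

mu = c / a  # E(N) for these parameters
p_closed = [a * c**n / (1 - b)**(n + 1) for n in range(10)]
```

Both routes agree, as does the \(\mu^n/(1+\mu)^{n+1}\) form from part 4.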

2011 Paper 3 Q12
D: 1700.0 B: 1516.0

The random variable \(N\) takes positive integer values and has pgf (probability generating function) \(\G(t)\). The random variables \(X_i\), where \(i=1\), \(2\), \(3\), \(\ldots,\) are independently and identically distributed, each with pgf \({H}(t)\). The random variables \(X_i\) are also independent of \(N\). The random variable \(Y\) is defined by \[ Y= \sum_{i=1}^N X_i \;. \] Given that the pgf of \(Y\) is \(\G(H(t))\), show that \[ \E(Y) = \E(N)\E(X_i) \text{ and } \var(Y) = \var(N)\big(\E(X_i)\big)^2 + \E(N) \var(X_i) \,.\] A fair coin is tossed until a head occurs. The total number of tosses is \(N\). The coin is then tossed a further \(N\) times and the total number of heads in these \(N\) tosses is \(Y\). Find in this particular case the pgf of \(Y\), \(\E(Y)\), \(\var(Y)\) and \(\P(Y=r)\).


Solution: Recall that for a random variable \(Z\) with pgf \(F(t)\) we have \(F(1) = 1\), \(\E[Z] = F'(1)\) and \(\E[Z^2] = F''(1) +F'(1)\), so \begin{align*} && \E[Y] &= G'(H(1))H'(1) \\ &&&= G'(1)H'(1) \\ &&&= \E[N]\E[X_i] \\ \\ && \E[Y^2] &= G''(H(1))(H'(1))^2+G'(H(1))H''(1) + G'(H(1))H'(1) \\ &&&= G''(1)(H'(1))^2+G'(1)H''(1) + G'(1)H'(1) \\ &&&= (\E[N^2]-\E[N])(\E[X_i])^2 + \E[N](\E[X_i^2]-\E[X_i]) + \E[N]\E[X_i] \\ &&&= (\E[N^2]-\E[N])(\E[X_i])^2 + \E[N]\E[X_i^2] \\ && \var[Y] &= (\E[N^2]-\E[N])(\E[X_i])^2 + \E[N]\E[X_i^2] - (\E[N])^2(\E[X_i])^2\\ &&&= (\var[N]+(\E[N])^2-\E[N])(\E[X_i])^2 + \E[N](\var[X_i]+(\E[X_i])^2) - (\E[N])^2(\E[X_i])^2\\ &&&= \var[N](\E[X_i])^2 + \E[N]\var[X_i] \end{align*} Notice that \(N \sim \mathrm{Geo}(\tfrac12)\), taking values \(1, 2, 3, \ldots\), and \(Y = \sum_{i=1}^N X_i\) where the \(X_i\) are Bernoulli\((\tfrac12)\). We have \(G(t) = \frac{\frac12 t}{1-\frac12 t}\) and \(H(t) = \frac12+\frac12 t\), so the pgf of \(Y\) is \[ G(H(t)) = \frac{\frac12\left(\frac12+\frac12 t\right)}{1-\frac12\left(\frac12+\frac12 t\right)} = \frac{\frac14(1+t)}{\frac14(3-t)} = \frac{1+t}{3-t}\,. \] \begin{align*} && \E[X_i] &= \frac12\\ && \var[X_i] &= \frac14 \\ && \E[N] &= 2 \\ && \var[N] &= 2 \\ \\ && \E[Y] &= 2 \cdot \frac12 = 1 \\ && \var[Y] &= 2 \cdot \frac14 + 2 \cdot \frac14 = 1 \end{align*} Finally, \(\frac{1+t}{3-t} = \frac13(1+t)\sum_{r=0}^\infty \left(\frac{t}{3}\right)^r\), so \(\mathbb{P}(Y=0) = \tfrac13\) and \(\mathbb{P}(Y=r) = \dfrac{4}{3^{r+1}}\) for \(r \ge 1\).
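The distribution of \(Y\) can be checked directly by conditioning on \(N\): since \(N\) takes values \(1, 2, 3, \ldots\) with \(\P(N=n) = (\tfrac12)^n\), and given \(N=n\) the count \(Y\) is Binomial\((n, \tfrac12)\), a direct sum gives each \(\P(Y=r)\). A Python sketch (the truncation `nmax` is arbitrary):

```python
from math import comb

def p_y(r, nmax=200):
    # P(Y = r) = sum over n >= 1 of P(N = n) * P(Binomial(n, 1/2) = r)
    return sum(0.5**n * comb(n, r) * 0.5**n for n in range(max(r, 1), nmax))
```

The values match \(\P(Y=0)=\frac13\) and \(\P(Y=r)=4/3^{r+1}\) for \(r \ge 1\), the coefficients of the pgf \((1+t)/(3-t)\).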

2009 Paper 3 Q12
D: 1700.0 B: 1516.0

  1. Albert tosses a fair coin \(k\) times, where \(k\) is a given positive integer. The number of heads he gets is \(X_1\). He then tosses the coin \(X_1\) times, getting \(X_2\) heads. He then tosses the coin \(X_2\) times, getting \(X_3\) heads. The random variables \(X_4\), \(X_5\), \(\ldots\) are defined similarly. Write down \(\E(X_1)\). By considering \(\E(X_2 \; \big\vert \; X_1 = x_1)\), or otherwise, show that \(\E(X_2) = \frac14 k\). Find \(\displaystyle \sum_{i=1}^\infty \E(X_i)\).
  2. Bertha has \(k\) fair coins. She tosses the first coin until she gets a tail. The number of heads she gets before the first tail is \(Y_1\). She then tosses the second coin until she gets a tail and the number of heads she gets with this coin before the first tail is \(Y_2\). The random variables \(Y_3, Y_4, \ldots\;\), \(Y_k\) are defined similarly, and \(Y= \sum\limits_{i=1}^k Y_i\,\). Obtain the probability generating function of \(Y\), and use it to find \(\E(Y)\), \(\var(Y)\) and \(\P(Y=r)\).


Solution:

  1. \(X_1 \sim B(k, \tfrac12)\), so \(\E[X_1] = \frac{k}{2}\). Note that \(X_2 \mid X_1 = x_1 \sim B(x_1, \tfrac12)\), so \(\E[X_2 \mid X_1 = x_1] = \frac{x_1}{2}\), i.e. \(\E[X_2 \mid X_1] = \frac12 X_1\). Therefore by the tower law, \(\E[X_2] = \E[\E[X_2\mid X_1]] = \E[\frac12 X_1] = \frac14k\). By the same argument and induction, \(\E[X_n] = \frac{k}{2^n}\), and so \begin{align*} && \sum_{i=1}^\infty \E[X_i] &= \sum_{i=1}^{\infty} \frac{k}{2^i} \\ &&&= \frac{\frac12 k}{1-\frac12} = k \end{align*}
  2. Note that \(Y_1 \sim \mathrm{Geo}(\tfrac12)-1\), which has generating function \(\E[t^{Y_1}] = \E[t^{G-1}] = \frac{\frac12 t}{1-\frac12 t}\cdot\frac1{t} = \frac{\frac12}{1-\frac12t} = \frac{1}{2-t}\), where \(G \sim \mathrm{Geo}(\tfrac12)\). By independence, \begin{align*} && \E \left [ t^Y \right] &= \E \left [ t^{\sum_{i=1}^kY_i} \right] \\ &&&= \prod_{i=1}^k \E[t^{Y_i}] \\ &&&= \frac{1}{(2-t)^k} \end{align*} Writing \(G_Y(t) = (2-t)^{-k}\), we get \(\E[Y] = G_Y'(1) = k(2-1)^{-(k+1)} = k\) and \(\E[Y^2] = (tG_Y'(t))'\big|_{t=1} = k(k+1)(2-1)^{-(k+2)}+k(2-1)^{-(k+1)} = k^2+2k\), so \(\var[Y] = k^2+2k - k^2 = 2k\). Finally, expanding \((2-t)^{-k} = 2^{-k}\left(1-\tfrac{t}{2}\right)^{-k}\) gives \(\mathbb{P}(Y=r) = \binom{k+r-1}{r} \frac{1}{2^{r+k}}\).
[Note: this second distribution is a negative binomial distribution]
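A quick numerical check of the negative binomial pmf \(\P(Y=r) = \binom{k+r-1}{r}2^{-(k+r)}\) and the moments \(\E(Y)=k\), \(\var(Y)=2k\); a Python sketch with an arbitrary truncation of the tail:

```python
from math import comb

def negbin_pmf(k, rmax=300):
    # P(Y = r) = C(k + r - 1, r) / 2^(k + r), r = 0, 1, 2, ...
    return [comb(k + r - 1, r) / 2.0**(k + r) for r in range(rmax)]

stats = {}
for k in (1, 2, 5):
    pmf = negbin_pmf(k)
    mean = sum(r * p for r, p in enumerate(pmf))
    var = sum(r * r * p for r, p in enumerate(pmf)) - mean**2
    stats[k] = (sum(pmf), mean, var)
```

For each \(k\) the pmf sums to \(1\) and the moments match the pgf calculation.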

2003 Paper 3 Q14
D: 1700.0 B: 1599.8

Write down the probability generating function for the score on a standard, fair six-faced die whose faces are labelled \(1, 2, 3, 4, 5, 6\). Hence show that the probability generating function for the sum of the scores on two standard, fair six-faced dice, rolled independently, can be written as \[ \frac1{36} t^2 \left( 1 + t \right)^2 \left( 1 - t + t^2 \right)^2 \left( 1 + t + t^2 \right)^2 \;. \] Write down, in factorised form, the probability generating functions for the scores on two fair six-faced dice whose faces are labelled with the numbers \(1, 2, 2, 3, 3, 4\) and \(1, 3, 4, 5, 6, 8,\) and hence show that when these dice are rolled independently, the probability of any given sum of the scores is the same as for the two standard fair six-faced dice. Standard, fair four-faced dice are tetrahedra whose faces are labelled \(1, 2, 3, 4,\) the score being taken from the face which is not visible after throwing, and each score being equally likely. Find all the ways in which two fair four-faced dice can have their faces labelled with positive integers if the probability of any given sum of the scores is to be the same as for the two standard fair four-faced dice.
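No solution is recorded here, but the six-faced claim can be verified by brute force, and the same check works for the four-faced part. The non-standard four-faced pair \(\{1,2,2,3\}\), \(\{1,3,3,5\}\) below is my own answer obtained by regrouping the factorisation \(x(1+x)(1+x^2)\) of the standard four-faced pgf, not something stated in the question. A Python sketch:

```python
from collections import Counter
from itertools import product

def sum_dist(d1, d2):
    """Distribution (with multiplicity) of the sum of one roll of each die."""
    return Counter(x + y for x, y in product(d1, d2))

standard6 = [1, 2, 3, 4, 5, 6]
sicherman = sum_dist([1, 2, 2, 3, 3, 4], [1, 3, 4, 5, 6, 8])

standard4 = [1, 2, 3, 4]
candidate4 = sum_dist([1, 2, 2, 3], [1, 3, 3, 5])  # assumed answer, checked below
```

Both relabelled pairs reproduce the standard sum distributions exactly.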

2000 Paper 3 Q13
D: 1700.0 B: 1516.0

A set of \(n\) dice is rolled repeatedly. For each die the probability of showing a six is \(p\). Show that the probability that the first of the dice to show a six does so on the \(r\)th roll is \[ q^{nr} \left( q^{-n} - 1 \right) \] where \(q = 1 - p\). Determine, and simplify, an expression for the probability generating function for this distribution, in terms of \(q\) and \(n\). The first of the dice to show a six does so on the \(R\)th roll. Find the expected value of \(R\) and show that, in the case \(n = 2\), \(p=1/6\), this value is \(36/11\). Show that the probability that the last of the dice to show a six does so on the \(r\)th roll is \[ \big(1-q^r\big)^n-\big(1-q^{r-1}\big)^n. \] Find, for the case \(n = 2\), the probability generating function. The last of the dice to show a six does so on the \(S\)th roll. Find the expected value of \(S\) and evaluate this when \(p=1/6\).
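No solution is recorded here, but both pmfs can be checked numerically. \(R\) is geometric with success probability \(1-q^n\), so \(\E(R) = 1/(1-q^n) = 36/11\) for \(n=2\), \(p=\frac16\); the value \(\E(S) = 96/11\) below is my own computation via \(\E(S) = \sum_{r \ge 0} \P(S > r) = \sum_{r\ge0}\big(1-(1-q^r)^n\big)\), not stated in the question. A Python sketch:

```python
n, p = 2, 1 / 6
q = 1 - p
R_MAX = 2000  # truncation; both tails decay geometrically

# first die to show a six: P(R = r) = q^(n r) (q^(-n) - 1)
pmf_R = [q**(n * r) * (q**(-n) - 1) for r in range(1, R_MAX)]
ER = sum(r * pr for r, pr in enumerate(pmf_R, start=1))

# last die to show a six: P(S = r) = (1 - q^r)^n - (1 - q^(r-1))^n
pmf_S = [(1 - q**r)**n - (1 - q**(r - 1))**n for r in range(1, R_MAX)]
ES = sum(r * pr for r, pr in enumerate(pmf_S, start=1))
```

Both pmfs sum to \(1\), with \(\E(R) = 36/11\) and (under the computation above) \(\E(S) = 96/11\).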

1997 Paper 3 Q12
D: 1700.0 B: 1500.0

  1. I toss a biased coin which has a probability \(p\) of landing heads and a probability \(q=1-p\) of landing tails. Let \(K\) be the number of tosses required to obtain the first head and let \[ \mathrm{G}(s)=\sum_{k=1}^{\infty}\mathrm{P}(K=k)s^{k}. \] Show that \[ \mathrm{G}(s)=\frac{ps}{1-qs} \] and hence find the expectation and variance of \(K\).
  2. I sample cards at random with replacement from a normal pack of \(52\). Let \(N\) be the total number of draws I make in order to sample every card at least once. By expressing \(N\) as a sum \(N=N_{1}+N_{2}+\cdots+N_{52}\) of random variables, or otherwise, find the expectation of \(N\). Estimate the numerical value of this expectation, using the approximations \(\mathrm{e}\approx2.7\) and \(1+\frac{1}{2}+\frac{1}{3}+\cdots+\frac{1}{n}\approx0.5+\ln n\) if \(n\) is large.


Solution:

  2. Let \(N_i\) be the number of draws between the \((i-1)\)th new card and the \(i\)th new card, so that \(N_1 = 1\) and \(N_i \sim K\) with \(p = \frac{53-i}{52}\). Therefore \begin{align*} \E[N] &= \E[N_1 + \cdots + N_{52}] \\ &= \E[N_1] + \cdots + \E[N_i] + \cdots + \E[N_{52}] \\ &= 1 + \frac{52}{51} + \cdots + \frac{52}{53-i} + \cdots + \frac{52}{1} \\ &= 52 \left (1 + \frac{1}{2} + \cdots + \frac{1}{52} \right) \\ &\approx 52 \left ( 0.5 + \ln 52 \right) \end{align*} Notice that \(2.7^2 = 7.29\) and \(7.3^2 \approx 53.3\), so \(e^4 \approx 53\) and \(\ln 52 \approx 4\); hence \(\E[N] \approx 52 \times 4.5 = 234\). [The exact value \(52\sum_{i=1}^{52}\frac{1}{i}\) is \(235.9782\) to 4 decimal places.]
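The coupon-collector expectation can be computed exactly, which confirms both the quoted value and that the crude estimate with \(\ln 52 \approx 4\) lands within a couple of draws of it; a quick Python check:

```python
H52 = sum(1 / i for i in range(1, 53))  # harmonic number H_52
expected_draws = 52 * H52               # exact coupon-collector expectation
rough = 52 * (0.5 + 4)                  # in-solution estimate using ln 52 ~ 4
```

The exact value is about \(235.98\), against the rough estimate of \(234\).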