The domain of the function \(f\) is the set of all \(2 \times 2\) matrices and its range is the set of real numbers. Thus, if \(M\) is a \(2 \times 2\) matrix, then \(f(M) \in \mathbb{R}\). The function \(f\) has the property that \(f(MN) = f(M)f(N)\) for any \(2 \times 2\) matrices \(M\) and \(N\).
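As an aside (not part of the problem as set), the determinant is the standard example of such a multiplicative function on \(2 \times 2\) matrices. A quick numerical sketch, using numpy with two arbitrarily chosen matrices, illustrates the property \(f(MN) = f(M)f(N)\):

```python
import numpy as np

# det is a map from 2x2 matrices to the reals satisfying
# det(MN) = det(M) det(N); spot-check with two sample matrices.
M = np.array([[1.0, 2.0], [3.0, 4.0]])
N = np.array([[0.0, 1.0], [5.0, -2.0]])

lhs = np.linalg.det(M @ N)
rhs = np.linalg.det(M) * np.linalg.det(N)
print(abs(lhs - rhs) < 1e-9)  # True: the two values agree
```

Here \(\det M = -2\) and \(\det N = -5\), so both sides equal \(10\).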
Solution:
The matrix A is given by $$\mathbf{A} = \begin{pmatrix} a & b \\ c & d \end{pmatrix}.$$
Solution:
The exponential of a square matrix \({\bf A}\) is defined to be $$ \exp ({\bf A}) = \sum_{r=0}^\infty {1\over r!} {\bf A}^r \,, $$ where \({\bf A}^0={\bf I}\) and \(\bf I\) is the identity matrix. Let $$ {\bf M}=\left(\begin{array}{cc} 0 & -1 \\ 1 & \phantom{-} 0 \end{array} \right) \,. $$ Show that \({\bf M}^2=-{\bf I}\) and hence express \(\exp({\theta {\bf M}})\) as a single \(2\times 2\) matrix, where \(\theta\) is a real number. Explain the geometrical significance of \(\exp({\theta {\bf M}})\). Let $$ {\bf N}=\left(\begin{array}{rr} 0 & 1 \\ 0 & 0 \end{array}\right) \,. $$ Express similarly \(\exp({s{\bf N}})\), where \(s\) is a real number, and explain the geometrical significance of \(\exp({s{\bf N}})\). For which values of \(\theta\) does $$ \exp({s{\bf N}})\; \exp({\theta {\bf M}})\, = \, \exp({\theta {\bf M}})\;\exp({s{\bf N}}) $$ for all \(s\)? Interpret this fact geometrically.
Solution: \begin{align*} \mathbf{M}^2 &= \begin{pmatrix} 0 & - 1 \\ 1 & 0 \end{pmatrix}^2 \\ &= \begin{pmatrix} 0 \cdot 0 + (-1) \cdot 1 & 0 \cdot (-1) + (-1) \cdot 0 \\ 1 \cdot 0 + 0 \cdot 1 & 1 \cdot (-1) + 0 \cdot 0 \end{pmatrix} \\ &= \begin{pmatrix} -1 & 0 \\ 0 & -1\end{pmatrix} \\ &= - \mathbf{I} \end{align*} Since \(\mathbf{M}^2 = -\mathbf{I}\), we have \(\mathbf{M}^{2k} = (-1)^k \mathbf{I}\) and \(\mathbf{M}^{2k+1} = (-1)^k \mathbf{M}\), so splitting the exponential series into even and odd powers gives \begin{align*} \exp(\theta \mathbf{M}) &= \sum_{r=0}^\infty \frac1{r!} (\theta \mathbf{M})^r \\ &= \left(\sum_{k=0}^\infty \frac{(-1)^k \theta^{2k}}{(2k)!}\right) \mathbf{I} + \left(\sum_{k=0}^\infty \frac{(-1)^k \theta^{2k+1}}{(2k+1)!}\right) \mathbf{M} \\ &= \cos \theta \, \mathbf{I} + \sin \theta \, \mathbf{M} \\ &= \begin{pmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{pmatrix} \end{align*} This is an anticlockwise rotation through an angle \(\theta\) (in radians) about the origin. \begin{align*} && \mathbf{N}^2 &= \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}^2 \\ && &= \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} \\ \Rightarrow && \exp(s\mathbf{N}) &= \sum_{r=0}^\infty \frac{1}{r!} (s\mathbf{N})^r \\ &&&= \mathbf{I} + s \mathbf{N} \\ &&&= \begin{pmatrix} 1 &s \\ 0 & 1 \end{pmatrix} \end{align*} This is a shear parallel to the \(x\)-axis: it leaves the \(x\)-axis invariant and sends \((1,1)\) to \((1+s, 1)\). Suppose these matrices commute for all \(s\), i.e. \begin{align*} && \begin{pmatrix} 1 &s \\ 0 & 1 \end{pmatrix}\begin{pmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{pmatrix} &= \begin{pmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{pmatrix}\begin{pmatrix} 1 &s \\ 0 & 1 \end{pmatrix} \\ \Rightarrow && \begin{pmatrix} \cos \theta - s \sin \theta & -\sin \theta + s \cos \theta \\ \sin \theta & \cos \theta \end{pmatrix} &= \begin{pmatrix} \cos \theta & s \cos \theta - \sin \theta \\ \sin \theta & s \sin \theta + \cos \theta \end{pmatrix} \\ \Rightarrow && s\sin \theta &= 0 \text{ for all } s \\ \Rightarrow && \theta &=n \pi, n \in \mathbb{Z} \end{align*} Geometrically: when \(\theta = 2n\pi\) the rotation is the identity, so the order trivially does not matter; when \(\theta = (2n+1)\pi\) the rotation is \(-\mathbf{I}\), a scalar multiple of the identity, which commutes with every matrix. Rotation by \(\pi\) sends each point \(P\) to \(-P\), and it makes no difference whether this is done before or after the shear.
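The identities above can be spot-checked numerically. The sketch below (using numpy, with the infinite exponential series truncated after 30 terms as an approximation) compares the series with the closed forms for \(\exp(\theta\mathbf{M})\) and \(\exp(s\mathbf{N})\), and checks the commutation at \(\theta = \pi\):

```python
import numpy as np

def mat_exp(A, terms=30):
    """Matrix exponential via the truncated power series sum_r A^r / r!."""
    result = np.eye(2)
    power = np.eye(2)
    for r in range(1, terms):
        power = power @ A / r   # power now holds A^r / r!
        result = result + power
    return result

theta, s = 0.7, 2.5
M = np.array([[0.0, -1.0], [1.0, 0.0]])
N = np.array([[0.0, 1.0], [0.0, 0.0]])

rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
shear = np.array([[1.0, s], [0.0, 1.0]])

print(np.allclose(mat_exp(theta * M), rotation))  # True
print(np.allclose(mat_exp(s * N), shear))         # True

# At theta = pi, exp(theta M) = -I, which commutes with the shear:
R_pi = mat_exp(np.pi * M)
print(np.allclose(R_pi @ shear, shear @ R_pi))    # True
```

Thirty terms are more than enough here, since the series for \(\exp(\pi\mathbf{M})\) has remainder of order \(\pi^{30}/30!\).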
In this question, \(\mathbf{A}\), \(\mathbf{B}\) and \(\mathbf{X}\) are non-zero \(2\times2\) real matrices. Are the following assertions true or false? You must provide a proof or a counterexample in each case.
Solution:
The matrices \(\mathbf{I}\) and \(\mathbf{J}\) are \[ \mathbf{I}=\begin{pmatrix}1 & 0\\ 0 & 1 \end{pmatrix}\quad\mbox{ and }\quad\mathbf{J}=\begin{pmatrix}1 & 1\\ 1 & 1 \end{pmatrix} \] respectively and \(\mathbf{A}=\mathbf{I}+a\mathbf{J},\) where \(a\) is a non-zero real constant. Prove that \[ \mathbf{A}^{2}=\mathbf{I}+\tfrac{1}{2}[(1+2a)^{2}-1]\mathbf{J}\quad\mbox{ and }\quad\mathbf{A}^{3}=\mathbf{I}+\tfrac{1}{2}[(1+2a)^{3}-1]\mathbf{J} \] and obtain a similar form for \(\mathbf{A}^{4}.\) If \(\mathbf{A}^{k}=\mathbf{I}+p_{k}\mathbf{J},\) suggest a suitable form for \(p_{k}\) and prove that it is correct by induction, or otherwise.
Solution: If \(\mathbf{J}=\begin{pmatrix}1 & 1\\ 1 & 1 \end{pmatrix}\), then \(\mathbf{J}^2=\begin{pmatrix}2 & 2\\ 2 & 2 \end{pmatrix} = 2\mathbf{J}\). Therefore, by induction, \(\mathbf{J}^n = 2\mathbf{J}^{n-1} = 2^{n-1}\mathbf{J}\). Let \(\mathbf{A}=\mathbf{I}+a\mathbf{J}\); then \begin{align*} \mathbf{A}^2 &=\left( \mathbf{I}+a\mathbf{J}\right)^2 \\ &= \mathbf{I}+2a\mathbf{J} + a^2\mathbf{J}^2 \\ &= \mathbf{I}+2a\mathbf{J} + 2a^2\mathbf{J} \\ &= \mathbf{I}+(2a+ 2a^2)\mathbf{J} \\ &= \mathbf{I}+\frac12(1+4a+ 4a^2-1)\mathbf{J} \\ &= \mathbf{I}+\frac12((1+2a)^2-1)\mathbf{J} \\ \end{align*} \begin{align*} \mathbf{A}^3 &=\left( \mathbf{I}+a\mathbf{J}\right)^3 \\ &= \mathbf{I}+3a\mathbf{J} + 3a^2\mathbf{J}^2 + a^3\mathbf{J}^3 \\ &= \mathbf{I}+3a\mathbf{J} + 6a^2\mathbf{J} + 4a^3\mathbf{J} \\ &= \mathbf{I}+(3a+ 6a^2+4a^3)\mathbf{J} \\ &= \mathbf{I}+\frac12(1+3\cdot2a+3\cdot4a^2+ 8a^3-1)\mathbf{J} \\ &= \mathbf{I}+\frac12((1+2a)^3-1)\mathbf{J} \\ \end{align*} \begin{align*} \mathbf{A}^4 &=\left( \mathbf{I}+a\mathbf{J}\right)^4 \\ &= \mathbf{I}+4a\mathbf{J} + 6a^2\mathbf{J}^2 + 4a^3\mathbf{J}^3+a^4\mathbf{J}^4 \\ &= \mathbf{I}+4a\mathbf{J} + 12a^2\mathbf{J} + 16a^3\mathbf{J}+8a^4\mathbf{J}\\ &= \mathbf{I}+(4a+ 12a^2+16a^3+8a^4)\mathbf{J} \\ &= \mathbf{I}+\frac12(1+4\cdot2a+6\cdot4a^2+ 4\cdot8a^3+16a^4-1)\mathbf{J} \\ &= \mathbf{I}+\frac12((1+2a)^4-1)\mathbf{J} \\ \end{align*} Claim: \(\mathbf{A}^k = \mathbf{I} + \frac12 ((1+2a)^{k}-1)\mathbf{J}\), i.e. \(p_k = \frac12((1+2a)^k-1)\). Proof: Firstly, note that \(\mathbf{I}\) commutes with every matrix, so we can apply the binomial theorem just as if we were working with real numbers: \begin{align*} \mathbf{A}^k &=\left( \mathbf{I}+a\mathbf{J}\right)^k \\ &= \sum_{i=0}^k \binom{k}{i}a^i\mathbf{J}^i \\ &= \mathbf{I} + \sum_{i=1}^k \binom{k}{i}a^i2^{i-1}\mathbf{J} \\ &= \mathbf{I} + \frac12\left(\sum_{i=1}^k \binom{k}{i}(2a)^{i}\right)\mathbf{J} \\ &= \mathbf{I} + \frac12\left(\sum_{i=0}^k \binom{k}{i}(2a)^{i} - 1\right)\mathbf{J} \\ &= \mathbf{I} + \frac12\left((1+2a)^k - 1\right)\mathbf{J}, \end{align*} as required.
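The closed form \(\mathbf{A}^k = \mathbf{I} + \frac12((1+2a)^k-1)\mathbf{J}\) can also be sanity-checked numerically for a few values of \(a\) and \(k\) (a spot check only, not a substitute for the proof):

```python
import numpy as np

# Check A^k = I + ((1+2a)^k - 1)/2 * J against direct matrix powers
# for several values of a and k.
I = np.eye(2)
J = np.ones((2, 2))

for a in (0.5, -1.3, 2.0):
    A = I + a * J
    for k in range(1, 8):
        closed_form = I + 0.5 * ((1 + 2 * a) ** k - 1) * J
        assert np.allclose(np.linalg.matrix_power(A, k), closed_form)
print("closed form agrees with direct computation")
```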
The linear transformation \(\mathrm{T}\) is a shear which transforms a point \(P\) to the point \(P'\) defined by
Solution: