The transformation \(R\) in the complex plane is a rotation (anticlockwise) by an angle \(\theta\) about the point represented by the complex number \(a\). The transformation \(S\) in the complex plane is a rotation (anticlockwise) by an angle \(\phi\) about the point represented by the complex number \(b\).
Solution:
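The question as excerpted stops after defining \(R\) and \(S\). As a supplement (not part of the original solution), the standard closed forms of these transformations are $$ R: z \mapsto a + e^{i\theta}(z-a) = e^{i\theta}z + a\left(1 - e^{i\theta}\right), \qquad S: z \mapsto b + e^{i\phi}(z-b), $$ so the composition \(S \circ R\) has the form \(z \mapsto e^{i(\theta+\phi)}z + \text{const}\): a rotation through \(\theta+\phi\) about some fixed point whenever \(\theta+\phi\) is not a multiple of \(2\pi\), and a translation otherwise.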
The exponential of a square matrix \({\bf A}\) is defined to be $$ \exp ({\bf A}) = \sum_{r=0}^\infty {1\over r!} {\bf A}^r \,, $$ where \({\bf A}^0={\bf I}\) and \(\bf I\) is the identity matrix. Let $$ {\bf M}=\left(\begin{array}{cc} 0 & -1 \\ 1 & \phantom{-} 0 \end{array} \right) \,. $$ Show that \({\bf M}^2=-{\bf I}\) and hence express \(\exp({\theta {\bf M}})\) as a single \(2\times 2\) matrix, where \(\theta\) is a real number. Explain the geometrical significance of \(\exp({\theta {\bf M}})\). Let $$ {\bf N}=\left(\begin{array}{rr} 0 & 1 \\ 0 & 0 \end{array}\right) \,. $$ Express similarly \(\exp({s{\bf N}})\), where \(s\) is a real number, and explain the geometrical significance of \(\exp({s{\bf N}})\). For which values of \(\theta\) does $$ \exp({s{\bf N}})\; \exp({\theta {\bf M}})\, = \, \exp({\theta {\bf M}})\;\exp({s{\bf N}}) $$ for all \(s\)? Interpret this fact geometrically.
Solution: \begin{align*} \mathbf{M}^2 &= \begin{pmatrix} 0 & - 1 \\ 1 & 0 \end{pmatrix}^2 \\ &= \begin{pmatrix} 0 \cdot 0 + (-1) \cdot 1 & 0 \cdot (-1) + (-1) \cdot 0 \\ 1 \cdot 0 + 0 \cdot 1 & 1 \cdot (-1) + 0 \cdot 0 \end{pmatrix} \\ &= \begin{pmatrix} -1 & 0 \\ 0 & -1\end{pmatrix} \\ &= - \mathbf{I} \end{align*} Since \(\mathbf{M}^2 = -\mathbf{I}\), we have \(\mathbf{M}^{2k} = (-1)^k \mathbf{I}\) and \(\mathbf{M}^{2k+1} = (-1)^k \mathbf{M}\), so the even and odd terms of the exponential series give the cosine and sine series respectively: \begin{align*} \exp(\theta \mathbf{M}) &= \sum_{r=0}^\infty \frac1{r!} (\theta \mathbf{M})^r \\ &= \sum_{r=0}^\infty \frac{1}{r!} \theta^r \mathbf{M}^r \\ &= \sum_{k=0}^\infty \frac{(-1)^k \theta^{2k}}{(2k)!}\, \mathbf{I} + \sum_{k=0}^\infty \frac{(-1)^k \theta^{2k+1}}{(2k+1)!}\, \mathbf{M} \\ &= \cos \theta\, \mathbf{I} + \sin \theta\, \mathbf{M} \\ &= \begin{pmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{pmatrix} \end{align*} This is an anticlockwise rotation through the angle \(\theta\) about the origin. \begin{align*} && \mathbf{N}^2 &= \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}^2 \\ && &= \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} \\ \Rightarrow && \exp(s\mathbf{N}) &= \sum_{r=0}^\infty \frac{1}{r!} (s\mathbf{N})^r \\ &&&= \mathbf{I} + s \mathbf{N} \\ &&&= \begin{pmatrix} 1 &s \\ 0 & 1 \end{pmatrix} \end{align*} This is a shear parallel to the \(x\)-axis, leaving the \(x\)-axis pointwise invariant and sending, for example, \((1,1)\) to \((1+s, 1)\). Suppose these matrices commute for all \(s\), ie \begin{align*} && \begin{pmatrix} 1 &s \\ 0 & 1 \end{pmatrix}\begin{pmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{pmatrix} &= \begin{pmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{pmatrix}\begin{pmatrix} 1 &s \\ 0 & 1 \end{pmatrix} \\ \Rightarrow && \begin{pmatrix} \cos \theta + s \sin \theta & -\sin \theta + s \cos \theta \\ \sin \theta & \cos \theta \end{pmatrix} &= \begin{pmatrix} \cos \theta & s \cos \theta - \sin \theta \\ \sin \theta & s \sin \theta + \cos \theta \end{pmatrix} \\ \Rightarrow && s \sin \theta &= 0 \text{ for all } s \\ \Rightarrow && \sin \theta &= 0 \\ \Rightarrow && \theta &= n \pi, \; n \in \mathbb{Z} \end{align*} Geometrically: when \(\theta\) is an even multiple of \(\pi\) the rotation is the identity, which commutes with every map. When \(\theta\) is an odd multiple of \(\pi\) the rotation is through \(\pi\), i.e. \(-\mathbf{I}\), which sends every point to its reflection through the origin; this commutes with every linear map, and in particular it maps the shear's invariant axis to itself, so shearing then rotating gives the same result as rotating then shearing.
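Not part of the original solution: a quick numerical sanity check of the closed forms and the commutation condition, summing the defining series directly (assumes NumPy is available; the helper name `expm_series` is my own).

```python
import numpy as np

def expm_series(A, terms=30):
    """Matrix exponential via the defining power series sum A^r / r!."""
    result = np.eye(2)
    term = np.eye(2)
    for r in range(1, terms):
        term = term @ A / r       # term is now A^r / r!
        result = result + term
    return result

M = np.array([[0.0, -1.0], [1.0, 0.0]])
N = np.array([[0.0, 1.0], [0.0, 0.0]])

theta, s = 0.7, 2.3
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
shear = np.array([[1.0, s], [0.0, 1.0]])

# exp(theta*M) is the rotation matrix, exp(s*N) is the shear.
assert np.allclose(expm_series(theta * M), rot)
assert np.allclose(expm_series(s * N), shear)

# For generic theta the two maps do not commute...
assert not np.allclose(shear @ rot, rot @ shear)
# ...but for theta = pi (rotation = -I) they do.
rot_pi = expm_series(np.pi * M)
assert np.allclose(shear @ rot_pi, rot_pi @ shear)
```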
In this question, \(\mathbf{A}\), \(\mathbf{B}\) and \(\mathbf{X}\) are non-zero \(2\times2\) real matrices. Are the following assertions true or false? You must provide a proof or a counterexample in each case.
Solution:
The set \(S\) consists of ordered pairs of complex numbers \((z_1,z_2)\) and a binary operation \(\circ\) on \(S\) is defined by $$ (z_1,z_2)\circ(w_1,w_2)= (z_1w_1-z_2w^*_2, \; z_1w_2+z_2w^*_1). $$ Show that the operation \(\circ\) is associative and determine whether it is commutative. Evaluate \((z,0)\circ(w,0)\), \((z,0)\circ(0,w)\), \((0,z)\circ(w,0)\) and \((0,z)\circ(0,w)\). The set \(S_1\) is the subset of \(S\) consisting of \(A\), \(B\), \(\ldots\,\), \(H\), where \(A=(1,0)\), \(B=(0,1)\), \(C=(i,0)\), \(D=(0,i)\), \(E=(-1,0)\), \(F=(0,-1)\), \(G=(-i,0)\) and \(H=(0,-i)\). Show that \(S_1\) is closed under \(\circ\) and that it has an identity element. Determine the inverse and order of each element of \(S_1\). Show that \(S_1\) is a group under \(\circ\). \hfil\break [You are not required to compute the multiplication table in full.] Show that \(\{A,B,E,F\}\) is a subgroup of \(S_1\) and determine whether it is isomorphic to the group generated by the \(2\times2\) matrix $\begin{pmatrix}0 & 1\\ -1 & 0 \end{pmatrix}$ under matrix multiplication.
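A short numerical sketch of \(\circ\) using Python's built-in complex numbers (my own supplement, not part of the question): it spot-checks associativity on random triples, evaluates the four products, and verifies that \(S_1\) is closed.

```python
import random

def op(p, q):
    """The binary operation (z1,z2) o (w1,w2) from the question."""
    (z1, z2), (w1, w2) = p, q
    return (z1 * w1 - z2 * w2.conjugate(),
            z1 * w2 + z2 * w1.conjugate())

def rand_pair():
    return (complex(random.uniform(-1, 1), random.uniform(-1, 1)),
            complex(random.uniform(-1, 1), random.uniform(-1, 1)))

# Spot-check associativity on random triples.
random.seed(0)
for _ in range(100):
    p, q, r = rand_pair(), rand_pair(), rand_pair()
    lhs = op(op(p, q), r)
    rhs = op(p, op(q, r))
    assert all(abs(x - y) < 1e-9 for x, y in zip(lhs, rhs))

# The four products:
z, w = 2 + 1j, 1 - 3j
assert op((z, 0), (w, 0)) == (z * w, 0)
assert op((z, 0), (0, w)) == (0, z * w)
assert op((0, z), (w, 0)) == (0, z * w.conjugate())
assert op((0, z), (0, w)) == (-z * w.conjugate(), 0)

# Closure of S1 = {A, B, ..., H}.
S1 = [(1, 0), (0, 1), (1j, 0), (0, 1j),
      (-1, 0), (0, -1), (-1j, 0), (0, -1j)]
assert all(op(p, q) in S1 for p in S1 for q in S1)
```

Note that \(S_1\) under \(\circ\) has the multiplication pattern of the quaternion group, which is consistent with the subgroup question at the end.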
Let \(G\) be the set of all matrices of the form \[ \begin{pmatrix}a & b\\ 0 & c \end{pmatrix}, \] where \(a,b\) and \(c\) are integers modulo 5, and \(a\neq0\neq c\). Show that \(G\) forms a group under matrix multiplication (which may be assumed to be associative). What is the order of \(G\)? Determine whether or not \(G\) is commutative. Determine whether or not the set consisting of all elements in \(G\) of order \(1\) or \(2\) is a subgroup of \(G\).
Solution: Claim: \(G\) is a group under matrix multiplication.
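The written solution above is a fragment. As a supplement (a brute-force check, not the intended written proof), enumerating the elements of \(G\) as triples \((a,b,c)\) confirms the order \(|G| = 4 \cdot 5 \cdot 4 = 80\), the non-commutativity, and that the elements of order \(1\) or \(2\) do not form a subgroup:

```python
from itertools import product

P = 5  # arithmetic modulo 5

def mul(m, n):
    """Product of upper-triangular matrices ((a,b),(0,c)) mod 5,
    stored as triples (a, b, c)."""
    a1, b1, c1 = m
    a2, b2, c2 = n
    return (a1 * a2 % P, (a1 * b2 + b1 * c2) % P, c1 * c2 % P)

I = (1, 0, 1)
G = [(a, b, c) for a, b, c in product(range(P), repeat=3)
     if a != 0 and c != 0]
assert len(G) == 80  # |G| = 4 * 5 * 4

# Closure and existence of inverses (associativity is assumed).
Gset = set(G)
assert all(mul(m, n) in Gset for m in G for n in G)
assert all(any(mul(m, n) == I for n in G) for m in G)

# G is not commutative.
assert any(mul(m, n) != mul(n, m) for m in G for n in G)

# Elements of order 1 or 2, i.e. m with m*m = I: there are 12 of them,
# and 12 does not divide 80, so by Lagrange they cannot form a subgroup.
H = [m for m in G if mul(m, m) == I]
assert len(H) == 12

# Closure indeed fails directly.
Hset = set(H)
assert any(mul(m, n) not in Hset for m in H for n in H)
```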
The elements \(a,b,c,d\) belong to the group \(G\) with binary operation \(*\). Show that
Solution: \begin{questionparts} \item \((ab)^2 = abab = e\), since \(ab\) has order \(2\). Also \(a^2 = e\) and \(b^2 = e\), so \(a^{-1} = a\) and \(b^{-1} = b\). Multiplying \(abab = e\) on the left by \(a\) gives \(bab = a\), and then multiplying on the right by \(b\) gives \(ba = ab\). \item Suppose \((cd)^n = e\). Then \(d(cd)^n c = dc\); but \(d(cd)^n c = (dc)^{n+1}\), so \((dc)^{n+1} = dc\) and hence \((dc)^n = e\). By symmetry, \((dc)^n = e\) implies \((cd)^n = e\). So \(cd\) and \(dc\) satisfy \(x^n = e\) for exactly the same positive integers \(n\); in particular the least such \(n\) is the same for both, ie \(cd\) and \(dc\) have the same order. \item Given \(c^{-1}bc=b^r\), then \(b^{rs} = (b^r)^s = (c^{-1}bc)^s =\underbrace{(c^{-1}bc)(c^{-1}bc) \cdots (c^{-1}bc)}_{s \text{ times}} = c^{-1}\underbrace{bb\cdots b}_{s \text{ times}}c = c^{-1}b^sc.\) We prove \(c^{-n}b^sc^n = b^{sr^n}\) by induction on \(n\). When \(n = 0\), we have \(b^s = b^{sr^0}\), so the base case is true. Suppose it is true for some \(n = k\), ie \(c^{-k}b^sc^k = b^{sr^k}\). Now consider \(c^{-(k+1)}b^sc^{k+1} = c^{-1}(c^{-k}b^sc^k)c = c^{-1}b^{sr^k}c = b^{sr^k \cdot r} = b^{sr^{k+1}}\), where the second-to-last equality uses the identity above with \(sr^k\) in place of \(s\). Therefore if our statement is true for \(n=k\) it is true for \(n = k+1\). Since it is also true for \(n=0\), by the principle of mathematical induction it is true for all non-negative integers \(n\). \end{questionparts}
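A concrete sanity check of part (ii), my own supplement: in the symmetric group \(S_4\), with permutations stored as tuples, \(cd\) and \(dc\) always have the same order.

```python
from itertools import permutations

def compose(p, q):
    """Permutation composition: (p o q)(i) = p[q[i]]."""
    return tuple(p[q[i]] for i in range(len(p)))

def order(p):
    """Order of a permutation: least n >= 1 with p^n = identity."""
    e = tuple(range(len(p)))
    q, n = p, 1
    while q != e:
        q = compose(q, p)
        n += 1
    return n

# Part (ii): order(cd) == order(dc) for every pair in S4.
S4 = list(permutations(range(4)))
for c in S4:
    for d in S4:
        assert order(compose(c, d)) == order(compose(d, c))
```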
The linear transformation \(\mathrm{T}\) is a shear which transforms a point \(P\) to the point \(P'\) defined by
Solution: