## Structure Theorem for finitely generated (graded) modules over a PID

If $R$ is a PID, then every finitely generated module $M$ over $R$ is isomorphic to a direct sum of cyclic $R$-modules. That is, there is a unique decreasing sequence of proper ideals $(d_1)\supseteq(d_2)\supseteq\dots\supseteq(d_m)$ such that $\displaystyle M\cong R^\beta\oplus\left(\bigoplus_{i=1}^m R/(d_i)\right)$ where the $d_i\in R$ are nonzero nonunits and $\beta\geq 0$ is an integer (the free rank of $M$).

Similarly, every graded module $M$ over a graded PID $R$ decomposes uniquely into the form $\displaystyle M\cong\left(\bigoplus_{i=1}^n\Sigma^{\alpha_i}R\right)\oplus\left(\bigoplus_{j=1}^m\Sigma^{\gamma_j}R/(d_j)\right)$ where $d_j\in R$ are homogeneous elements such that $(d_1)\supseteq(d_2)\supseteq\dots\supseteq(d_m)$, $\alpha_i, \gamma_j\in\mathbb{Z}$, and $\Sigma^\alpha$ denotes an $\alpha$-shift upward in grading.

Let $A=\bigoplus_{i=0}^\infty A_i$ be a graded ring. An ideal $I\subset A$ is homogeneous (also called graded) if for every element $x\in I$, its homogeneous components also belong to $I$.

An ideal in a graded ring is homogeneous if and only if it is a graded submodule. The intersections of a homogeneous ideal $I$ with the $A_i$ are called the homogeneous parts of $I$. A homogeneous ideal $I$ is the direct sum of its homogeneous parts, that is, $\displaystyle I=\bigoplus_{i=0}^\infty (I\cap A_i).$
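As a quick illustration (a standard example, not from the text above): grade $A=k[x]$ by degree. The ideal $(x^2)$ is homogeneous, but $J=(x^2+x)$ is not: $x^2+x\in J$, yet its degree-2 homogeneous component $x^2$ does not lie in $J$, since $x^2=f\cdot(x^2+x)$ would force $f=x/(x+1)\notin k[x]$.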

## Universal Property of Quotient Groups (Hungerford)

If $f:G\to H$ is a homomorphism and $N$ is a normal subgroup of $G$ contained in the kernel of $f$, then $f$ “factors through” the quotient $G/N$ uniquely.

This can be used to prove the following proposition:
A chain map $f_\bullet$ between chain complexes $(A_\bullet, \partial_{A, \bullet})$ and $(B_\bullet, \partial_{B,\bullet})$ induces homomorphisms between the homology groups of the two complexes.

Proof:
The relation $\partial f=f\partial$ implies that $f$ takes cycles to cycles since $\partial\alpha=0$ implies $\partial(f\alpha)=f(\partial\alpha)=0$. Also $f$ takes boundaries to boundaries since $f(\partial\beta)=\partial(f\beta)$. Hence $f_\bullet$ induces a homomorphism $(f_\bullet)_*: H_\bullet (A_\bullet)\to H_\bullet (B_\bullet)$, by the universal property of quotient groups.

In more detail: for $\beta\in\text{Im}\,\partial_{A,n+1}$, we have $f_n(\beta)\in\text{Im}\,\partial_{B,n+1}$, so $\pi_{B,n}f_n(\beta)=0$ in $H_n(B_\bullet)$, where $\pi_{B,n}$ is the quotient map. Therefore $\text{Im}\,\partial_{A,n+1}\subseteq\ker(\pi_{B,n}\circ f_n)$, and the universal property applies to $\pi_{B,n}\circ f_n$.

## Existence and properties of normal closure

If $E$ is an algebraic extension field of $K$, then there exists an extension field $F$ of $E$ (called the normal closure of $E$ over $K$) such that
(i) $F$ is normal over $K$;
(ii) no proper subfield of $F$ containing $E$ is normal over $K$;
(iii) if $E$ is separable over $K$, then $F$ is Galois over $K$;
(iv) $[F:K]$ is finite if and only if $[E:K]$ is finite.

The field $F$ is uniquely determined up to an $E$-isomorphism.

Proof:
(i) Let $X=\{u_i\mid i\in I\}$ be a basis of $E$ over $K$ and let $f_i\in K[x]$ be the minimal polynomial of $u_i$. If $F$ is a splitting field of $S=\{f_i\mid i\in I\}$ over $E$, then $F=E(Y)$, where $Y\supseteq X$ is the set of roots of the $f_i$. Then $F=K(X)(Y)=K(Y)$ so $F$ is also a splitting field of $S$ over $K$, hence $F$ is normal over $K$ as it is the splitting field of a family of polynomials in $K[x]$.

(iii) If $E$ is separable over $K$, then each $f_i$ is separable. Therefore $F$ is Galois over $K$ as it is a splitting field over $K$ of a set of separable polynomials in $K[x]$.

(iv) If $[E:K]$ is finite, then so is $X$ and hence $S$. Say $S=\{f_1,\dots,f_n\}$. Then $F=E(Y)$, where $Y$ is the set of roots of the $f_i$. Then $F$ is finitely generated and algebraic, thus a finite extension. So $[F:K]$ is finite.

(ii) A subfield $F_0$ of $F$ that contains $E$ necessarily contains the root $u_i$ of $f_i\in S$ for every $i$. If $F_0$ is normal over $K$ (so that each $f_i$ splits in $F_0$ by definition), then $F\subset F_0$ (since $F$ is the splitting field) and hence $F=F_0$.

Finally let $F_1$ be another extension field of $E$ with properties (i) and (ii). Since $F_1$ is normal over $K$ and contains each $u_i$, $F_1$ must contain a splitting field $F_2$ of $S$ over $K$ with $E\subset F_2$. $F_2$ is normal over $K$ (splitting field over $K$ of family of polynomials in $K[x]$), hence $F_2=F_1$ by (ii).

Therefore both $F$ and $F_1$ are splitting fields of $S$ over $K$ and hence of $S$ over $E$: if $F=K(Y)$ (where $Y$ is the set of roots of the $f_i$), then $F\subseteq E(Y)$ since $E(Y)$ contains $K$ and $Y$; and since $Y\supseteq X$, $K(Y)$ contains $E=K(X)$ and $Y$, hence $F=E(Y)$. Hence the identity map on $E$ extends to an $E$-isomorphism $F\cong F_1$.
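For a concrete example (standard, and consistent with the construction above): the normal closure of $E=\mathbb{Q}(\sqrt[3]{2})$ over $K=\mathbb{Q}$ is the splitting field $F=\mathbb{Q}(\sqrt[3]{2},\omega)$ of the minimal polynomial $x^3-2$, where $\omega$ is a primitive cube root of unity. Here $[E:K]=3$ and $[F:K]=6$, both finite, as (iv) predicts.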

## A finitely generated torsion-free module A over a PID R is free

A finitely generated torsion-free module $A$ over a PID $R$ is free.
Proof
(Hungerford 221)

If $A=0$, then $A$ is free of rank 0. Now assume $A\neq 0$. Let $X$ be a finite set of nonzero generators of $A$. If $x\in X$, then $rx=0$ ($r\in R$) if and only if $r=0$ since $A$ is torsion-free.

Consequently, there is a nonempty subset $S=\{x_1,\dots,x_k\}$ of $X$ that is maximal with respect to the property: $\displaystyle r_1x_1+\dots+r_kx_k=0\ (r_i\in R) \implies r_i=0\ \text{for all}\ i.$

The submodule $F$ generated by $S$ is clearly a free $R$-module with basis $S$. If $y\in X-S$, then by maximality there exist $r_y,r_1,\dots,r_k\in R$, not all zero, such that $r_yy+r_1x_1+\dots+r_kx_k=0$. Then $r_yy=-\sum_{i=1}^kr_ix_i\in F$. Furthermore $r_y\neq 0$ since otherwise $r_i=0$ for every $i$.

Since $X$ is finite, there exists a nonzero $r\in R$ (namely $r=\prod_{y\in X-S}r_y$) such that $rX=\{rx\mid x\in X\}$ is contained in $F$:

If $y_i\in X-S$, then $ry_i=\left(\prod_{y\neq y_i}r_y\right)(r_{y_i}y_i)\in F$ since $r_{y_i}y_i\in F$ and $F$ is a submodule. If $x\in S$, then clearly $rx\in F$ since $F$ is generated by $S$.

Therefore, $rA=\{ra\mid a\in A\}\subset F$. The map $f:A\to A$ given by $a\mapsto ra$ is an $R$-module homomorphism with image $rA$. Since $A$ is torsion-free $\ker f=0$, hence $A\cong rA\subset F$. Since a submodule of a free module over a PID is free, this proves $A$ is free.

## Tensor is a Right Exact Functor: Elementary Proof

This is a relatively elementary proof (compared to others out there) of the fact that tensor is a right exact functor. The proof is taken from Hungerford and reworded slightly. The key prerequisites are the universal properties of the quotient group and of the tensor product.

## Statement

If $A\xrightarrow{f}B\xrightarrow{g}C\to 0$ is an exact sequence of left modules over a ring $R$ and $D$ is a right $R$-module, then $\displaystyle D\otimes_R A\xrightarrow{1_D\otimes f}D\otimes_R B\xrightarrow{1_D\otimes g}D\otimes_R C\to 0$ is an exact sequence of abelian groups. An analogous statement holds for an exact sequence in the first variable.

## Proof

(Hungerford 210)

We split our proof into 3 parts: (i) $\text{Im}(1_D\otimes g)=D\otimes_R C$; (ii) $\text{Im}(1_D\otimes f)\subseteq\text{Ker}(1_D\otimes g)$; and (iii) $\text{Ker}(1_D\otimes g)\subseteq\text{Im}(1_D\otimes f)$.

(i) Since $g$ is an epimorphism by hypothesis, every generator $d\otimes c$ of $D\otimes_R C$ is of the form $d\otimes g(b)=(1_D\otimes g)(d\otimes b)$ for some $b\in B$. Thus $\text{Im}(1_D\otimes g)$ contains all generators of $D\otimes_R C$, hence $\text{Im}(1_D\otimes g)=D\otimes_R C$.

(ii) Since $\text{Ker} g=\text{Im} f$ we have $gf=0$ and $\displaystyle (1_D\otimes g)(1_D\otimes f)=1_D\otimes gf=1_D\otimes 0=0,$ hence $\text{Im}(1_D\otimes f)\subseteq\text{Ker}(1_D\otimes g)$.

(iii) Let $\pi:D\otimes_R B\to(D\otimes_R B)/\text{Im}(1_D\otimes f)$ be the canonical epimorphism. From (ii), $\text{Im}(1_D\otimes f)\subseteq\text{Ker}(1_D\otimes g)$ so (by the universal property of the quotient, Theorem 1.7) there is a homomorphism $\alpha:(D\otimes_R B)/\text{Im}(1_D\otimes f)\to D\otimes_R C$ such that $\displaystyle \alpha(\pi(d\otimes b))=(1_D\otimes g)(d\otimes b)=d\otimes g(b).$ We shall show that $\alpha$ is an isomorphism. Then $\text{Ker}(1_D\otimes g)=\text{Im}(1_D\otimes f)$.

We show first that the map $\beta:D\times C\to(D\otimes_R B)/\text{Im}(1_D\otimes f)$ given by $(d,c)\mapsto\pi(d\otimes b)$, where $g(b)=c$, is independent of the choice of $b$. Note that there is at least one such $b$ since $g$ is an epimorphism. If $g(b')=c$, then $g(b-b')=0$ and $b-b'\in\text{Ker} g=\text{Im} f$, hence $b-b'=f(a)$ for some $a\in A$. Since $d\otimes f(a)\in\text{Im}(1_D\otimes f)$ and $\pi(d\otimes f(a))=0$, we have
\begin{aligned} \pi(d\otimes b)&=\pi(d\otimes(b'+f(a)))\\ &=\pi(d\otimes b'+d\otimes f(a))\\ &=\pi(d\otimes b')+\pi(d\otimes f(a))\\ &=\pi(d\otimes b'). \end{aligned}

Therefore $\beta$ is well-defined.

Verify that $\beta$ is middle linear:
\begin{aligned} \beta(d_1+d_2,c)&=\pi((d_1+d_2)\otimes b)\qquad\text{where }g(b)=c\\ &=\pi(d_1\otimes b+d_2\otimes b)\\ &=\pi(d_1\otimes b)+\pi(d_2\otimes b)\\ &=\beta(d_1,c)+\beta(d_2,c). \end{aligned}

\begin{aligned} \beta(d,c_1+c_2)&=\pi(d\otimes(b_1+b_2))\qquad\text{where }g(b_i)=c_i\\ &=\pi(d\otimes b_1+d\otimes b_2)\\ &=\pi(d\otimes b_1)+\pi(d\otimes b_2)\\ &=\beta(d,c_1)+\beta(d,c_2). \end{aligned}

Let $r\in R$.
\begin{aligned} \beta(dr,c)&=\pi(dr\otimes b)\qquad\text{where }g(b)=c\\ &=\pi(d\otimes rb)\\ &=\beta(d,rc)\qquad\text{where }g(rb)=rg(b)=rc. \end{aligned}

By universal property of tensor product there exists a unique homomorphism $\bar{\beta}:D\otimes_R C\to(D\otimes_R B)/\text{Im}(1_D\otimes f)$ such that $\bar{\beta}(d\otimes c)=\beta(d,c)=\pi(d\otimes b)$, where $g(b)=c$.

Therefore, for any generator $d\otimes c$ of $D\otimes_R C$, $\displaystyle \alpha\bar{\beta}(d\otimes c)=\alpha\pi(d\otimes b)=d\otimes g(b)=d\otimes c,$ hence $\alpha\bar{\beta}$ is the identity map.

Similarly
\begin{aligned} \bar{\beta}\alpha(d\otimes b+\text{Im}(1_D\otimes f))&=\bar{\beta}\alpha\pi(d\otimes b)\\ &=\bar{\beta}(d\otimes g(b))\\ &=\pi(d\otimes b)\\ &=d\otimes b+\text{Im}(1_D\otimes f) \end{aligned}
so $\bar{\beta}\alpha$ is the identity so that $\alpha$ is an isomorphism.
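To see the theorem in action on a small example (my own, not Hungerford's): tensor the exact sequence $\mathbb{Z}\xrightarrow{\cdot 2}\mathbb{Z}\to\mathbb{Z}/2\to 0$ of $\mathbb{Z}$-modules with $D=\mathbb{Z}/2$. The resulting sequence is $\mathbb{Z}/2\xrightarrow{0}\mathbb{Z}/2\xrightarrow{\cong}\mathbb{Z}/2\to 0$: the first map is zero since $d\otimes 2n=2d\otimes n=0$, and the sequence is indeed exact at the last two spots, exactly as the theorem guarantees.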

## Note on Finitely Generated Abelian Groups

We state and prove a sufficient condition for a finitely generated abelian group to be the direct product of the cyclic subgroups generated by its generators, and give a counterexample to the conclusion when the condition is not satisfied.

## Theorem

Let $G$ be an abelian group and $G=\langle g_1,\dots, g_n\rangle$.

Suppose the generators $g_1,\dots,g_n$ are linearly independent over $\mathbb{Z}$, that is, whenever $c_1g_1+\dots+c_ng_n=0$ for some integers $c_i\in\mathbb{Z}$, we have $c_1=\dots=c_n=0$.

(Here we are using additive notation for $(G,+)$, where the identity of $G$ is written as 0, the inverse of $g$ is written as $-g$).

Then $\displaystyle G\cong\langle g_1\rangle\times\dots\times\langle g_n\rangle.$

## Proof

Define the following map $\psi:\langle g_1\rangle\times\dots\times\langle g_n\rangle\to\langle g_1,\dots,g_n\rangle$ by $\displaystyle \psi((c_1g_1,\dots,c_ng_n))=c_1g_1+\dots+c_ng_n.$

We can check that $\psi$ is a group homomorphism.

We have that $\psi$ is surjective since any element $x\in\langle g_1,\dots,g_n\rangle$ is by definition a combination of finitely many elements of the generating set and their inverses. Since $G$ is abelian, $x=c_1g_1+\dots+c_ng_n$ for some $c_i\in\mathbb{Z}$.

Also, $\psi$ is injective since if $c_1g_1+\dots+c_ng_n=0$, then all the coefficients $c_i$ are zero (by the linear independence condition). Thus $\ker\psi$ is trivial.

Hence $\psi$ is an isomorphism.

## Remark

Note that without the linear independence condition, the conclusion may not be true. Consider $G=\mathbb{Z}_2\times\mathbb{Z}_3\times\mathbb{Z}_5$ which is abelian with order 30. Consider $g_1=(1,1,0)$, $g_2=(0,1,1)$.

We can see that $G=\langle g_1,g_2\rangle$, by observing that $3g_1=(1,0,0)$, $4g_1=(0,1,0)$, $2g_1+g_2=(0,0,1)$. However $\langle g_1\rangle\times\langle g_2\rangle=\mathbb{Z}_6\times\mathbb{Z}_{15}$ has order 90. Thus $\langle g_1,g_2\rangle\not\cong\langle g_1\rangle\times\langle g_2\rangle$.
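The orders above are easy to verify by brute force; here is a minimal sketch (the helper names `add` and `span` are mine, not from the text):

```python
# Verify the counterexample in Z_2 x Z_3 x Z_5: <g1, g2> has order 30,
# while <g1> x <g2> has order 6 * 15 = 90, so the two cannot be isomorphic.
MOD = (2, 3, 5)

def add(a, b):
    """Componentwise addition in Z_2 x Z_3 x Z_5."""
    return tuple((x + y) % m for x, y, m in zip(a, b, MOD))

def span(gens):
    """Subgroup generated by gens (closure under addition; finite, so no inverses needed)."""
    elems = {(0, 0, 0)}
    while True:
        new = elems | {add(a, g) for a in elems for g in gens}
        if new == elems:
            return elems
        elems = new

g1, g2 = (1, 1, 0), (0, 1, 1)
assert len(span([g1, g2])) == 30                  # <g1, g2> = G
assert len(span([g1])) * len(span([g2])) == 90    # |<g1>| * |<g2>| = 6 * 15
```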

## Gauss Lemma Proof

There are two related results that are commonly called “Gauss's Lemma”. The first is that the product of primitive polynomials is still primitive. The second is that a primitive polynomial is irreducible over a UFD (Unique Factorization Domain) $D$ if and only if it is irreducible over its quotient field.

## Gauss Lemma: Product of primitive polynomials is primitive

If $D$ is a unique factorization domain and $f,g\in D[x]$, then $C(fg)=C(f)C(g)$. In particular, the product of primitive polynomials is primitive.

## Proof

(Hungerford pg 163)

Write $f=C(f)f_1$ and $g=C(g)g_1$ with $f_1$, $g_1$ primitive. Consequently $\displaystyle C(fg)=C(C(f)f_1C(g)g_1)\sim C(f)C(g)C(f_1g_1).$

Hence it suffices to prove that $f_1g_1$ is primitive, that is, $C(f_1g_1)$ is a unit. If $f_1=\sum_{i=0}^n a_ix^i$ and $g_1=\sum_{j=0}^m b_jx^j$, then $f_1g_1=\sum_{k=0}^{m+n}c_kx^k$ with $c_k=\sum_{i+j=k}a_ib_j$.

If $f_1g_1$ is not primitive, then there exists an irreducible element $p$ in $D$ such that $p\mid c_k$ for all $k$. Since $C(f_1)$ is a unit, $p\nmid C(f_1)$, hence there is a least integer $s$ such that $\displaystyle p\mid a_i\ \text{for}\ i<s\quad\text{and}\quad p\nmid a_s.$

Similarly there is a least integer $t$ such that $\displaystyle p\mid b_j\ \text{for}\ j<t\quad\text{and}\quad p\nmid b_t.$

Since $p$ divides $\displaystyle c_{s+t}=a_0b_{s+t}+\dots+a_{s-1}b_{t+1}+a_sb_t+a_{s+1}b_{t-1}+\dots+a_{s+t}b_0,$ and $p$ divides every summand other than $a_sb_t$ (each such summand contains a factor $a_i$ with $i<s$ or a factor $b_j$ with $j<t$), $p$ must divide $a_sb_t$. Since every irreducible element in $D$ (UFD) is prime, $p\mid a_s$ or $p\mid b_t$. This is a contradiction. Therefore $f_1g_1$ is primitive.
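Gauss's Lemma is easy to check numerically over $D=\mathbb{Z}$; below is a small sketch (the helper names `content` and `mul` are mine):

```python
from math import gcd
from functools import reduce

def content(f):
    """Content of an integer polynomial, given as a coefficient list."""
    return reduce(gcd, f)

def mul(f, g):
    """Multiply polynomials given as coefficient lists (constant term first)."""
    h = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            h[i + j] += a * b
    return h

f = [6, 4, 2]   # 2x^2 + 4x + 6, content 2
g = [9, 3]      # 3x + 9,        content 3
assert content(mul(f, g)) == content(f) * content(g) == 6  # C(fg) = C(f)C(g)

# the product of primitive polynomials is primitive
f1, g1 = [3, 2, 1], [1, 1]
assert content(f1) == content(g1) == content(mul(f1, g1)) == 1
```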

## Primitive polynomials are associates in $D[x]$ iff they are associates in $F[x]$

Let $D$ be a unique factorization domain with quotient field $F$ and let $f$ and $g$ be primitive polynomials in $D[x]$. Then $f$ and $g$ are associates in $D[x]$ if and only if they are associates in $F[x]$.

## Proof

($\impliedby$) If $f$ and $g$ are associates in the integral domain $F[x]$, then $f=gu$ for some unit $u\in F[x]$. Since the units in $F[x]$ are the nonzero constants, $u\in F$, hence $u=b/c$ with $b,c\in D$ and $c\neq 0$. Thus $cf=bg$.

Since $C(f)$ and $C(g)$ are units in $D$, $\displaystyle c\sim cC(f)\sim C(cf)=C(bg)\sim bC(g)\sim b.$

Therefore, $b=cv$ for some unit $v\in D$ and $cf=bg=vcg$. Consequently, $f=vg$ (since $c\neq 0$), hence $f$ and $g$ are associates in $D[x]$.

($\implies$) Clear, since if $f=gu$ for some unit $u\in D[x]\subseteq F[x]$, then $f$ and $g$ are associates in $F[x]$.

## Primitive $f$ is irreducible in $D[x]$ iff $f$ is irreducible in $F[x]$

Let $D$ be a UFD with quotient field $F$ and $f$ a primitive polynomial of positive degree in $D[x]$. Then $f$ is irreducible in $D[x]$ if and only if $f$ is irreducible in $F[x]$.

## Proof

($\implies$) Suppose $f$ is irreducible in $D[x]$ and $f=gh$ with $g,h\in F[x]$ and $\deg g\geq 1$, $\deg h\geq 1$. Then $g=\sum_{i=0}^n(a_i/b_i)x^i$ and $h=\sum_{j=0}^m(c_j/d_j)x^j$ with $a_i, b_i, c_j, d_j\in D$ and $b_i\neq 0$, $d_j\neq 0$.

Let $b=b_0b_1\dots b_n$ and for each $i$ let $\displaystyle b_i^*=b_0b_1\dots b_{i-1}b_{i+1}\dots b_n.$ If $g_1=\sum_{i=0}^n a_ib_i^* x^i\in D[x]$ (clear denominators of $g$ by multiplying by product of denominators), then $g_1=ag_2$ with $a=C(g_1)$, $g_2\in D[x]$ and $g_2$ primitive.

Verify that $g=(1_D/b)g_1=(a/b)g_2$ and $\deg g=\deg g_2$. Similarly $h=(c/d)h_2$ with $c,d\in D$, $h_2\in D[x]$, $h_2$ primitive and $\deg h=\deg h_2$. Consequently, $f=gh=(a/b)(c/d)g_2h_2$, hence $bdf=acg_2h_2$. Since $f$ is primitive by hypothesis and $g_2h_2$ is primitive by Gauss Lemma, $\displaystyle bd\sim bdC(f)\sim C(bdf)=C(acg_2h_2)\sim acC(g_2h_2)\sim ac.$

This means $bd$ and $ac$ are associates in $D$, say $ac=ubd$ for some unit $u\in D$. Then $bdf=acg_2h_2=ubdg_2h_2$, so $f=ug_2h_2$ (since $bd\neq 0$). Since $\deg g_2=\deg g\geq 1$ and $\deg h_2=\deg h\geq 1$, this factorization contradicts the irreducibility of $f$ in $D[x]$. Therefore, $f$ is irreducible in $F[x]$.

($\impliedby$) Conversely if $f$ is irreducible in $F[x]$ and $f=gh$ with $g,h\in D[x]$, then one of $g$, $h$ (say $g$) is a unit in $F[x]$ and thus a (nonzero) constant. Thus $C(f)=gC(h)$. Since $f$ is primitive, $g$ must be a unit in $D$ and hence in $D[x]$. Thus $f$ is irreducible in $D[x]$.

## Non-trivial submodules of direct sum of simple modules

Suppose $M_1$ and $M_2$ are two non-isomorphic simple, nonzero $R$-modules.

Determine all non-trivial submodules of $M_1\oplus M_2$.

Let $N$ be a non-trivial submodule of $M_1\oplus M_2$. Note that $\displaystyle \{0\}\subset M_1\subset M_1\oplus M_2$ is a composition series (of length 2). By the Jordan–Hölder theorem, all composition series are equivalent and have the same length. Since the chain $\displaystyle \{0\}\subset N\subset M_1\oplus M_2$ is proper and refines to a composition series of length 2, it must itself be a composition series.

Thus $N\cong M_1$ or $M_2$. In particular $N$ is simple.

Let $\pi_1: M_1\oplus M_2\to M_1$ and $\pi_2:M_1\oplus M_2\to M_2$ be the canonical projections. Note that $\pi_1(N)$ is a submodule of $M_1$, so $\pi_1(N)=0$ or $M_1$. Similarly, $\pi_2(N)=0$ or $M_2$.

By Schur’s Lemma $\pi_1|_N: N\to \pi_1(N)$ and $\pi_2|_N: N\to\pi_2(N)$ are either 0 or isomorphisms.

They cannot both be zero since $N$ is non-zero. They cannot both be isomorphisms either, as that would imply $M_1=\pi_1(N)\cong N\cong\pi_2(N)=M_2$.

Hence, exactly one of $\pi_1|_N$, $\pi_2|_N$ is zero. So $N=M_1\oplus\{0\}$ or $N=\{0\}\oplus M_2$.

## Commutator subgroup $G'$ is the unique smallest normal subgroup $N$ such that $G/N$ is abelian.

If $G$ is a group, then $G'$ is a normal subgroup of $G$ and $G/G'$ is abelian. If $N$ is a normal subgroup of $G$, then $G/N$ is abelian iff $N$ contains $G'$.

## Proof

Let $f:G\to G$ be any automorphism. Then $\displaystyle f(aba^{-1}b^{-1})=f(a)f(b)f(a)^{-1}f(b)^{-1}\in G'.$

It follows that $f(G')\leq G'$. In particular, if $f$ is the automorphism given by conjugation by $a\in G$, then $aG'a^{-1}=f(G')\leq G'$, so $G'\unlhd G$.

Since $(ab)(ba)^{-1}=aba^{-1}b^{-1}\in G'$, $abG'=baG'$ and hence $G/G'$ is abelian.

($\implies$) If $G/N$ is abelian, then $abN=baN$ for all $a,b\in G$. Hence $ab(ba)^{-1}=aba^{-1}b^{-1}\in N$. Therefore, $N$ contains all commutators and $G'\leq N$.

($\impliedby$) If $G'\subseteq N$, then $ab(ba)^{-1}=aba^{-1}b^{-1}\in G'\subseteq N$. Thus $abN=baN$ for all $a,b\in G$. Hence $G/N$ is abelian.
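As a sanity check (my own brute-force sketch, not from Hungerford): the commutator subgroup of $S_3$ is $A_3$, and $S_3/A_3\cong\mathbb{Z}_2$ is abelian.

```python
from itertools import permutations

def compose(p, q):
    """(p o q)(i) = p[q[i]] for permutations stored as tuples."""
    return tuple(p[i] for i in q)

def inverse(p):
    r = [0] * len(p)
    for i, v in enumerate(p):
        r[v] = i
    return tuple(r)

G = list(permutations(range(3)))  # S_3

# subgroup generated by all commutators a b a^{-1} b^{-1}
Gp = {compose(compose(a, b), compose(inverse(a), inverse(b)))
      for a in G for b in G}
while True:
    new = Gp | {compose(x, y) for x in Gp for y in Gp}
    if new == Gp:
        break
    Gp = new

assert len(Gp) == 3            # G' = A_3: the identity and the two 3-cycles
assert len(G) // len(Gp) == 2  # |G/G'| = 2, so G/G' is abelian
```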

## Ascending Central Series of $G$

Let $G$ be a group. The center $C(G)$ of $G$ is a normal subgroup. Let $C_2(G)$ be the inverse image of $C(G/C(G))$ under the canonical projection $G\to G/C(G)$. By Correspondence Theorem, $C_2(G)$ is normal in $G$ and contains $C(G)$.

Continue this process by defining inductively: $C_1(G)=C(G)$ and $C_i(G)$ is the inverse image of $C(G/C_{i-1}(G))$ under the canonical projection $G\to G/C_{i-1}(G)$.

Thus we obtain a sequence of normal subgroups of $G$, called the ascending central series of $G$: $\displaystyle \langle e\rangle\leq C_1(G)\leq C_2(G)\leq C_3(G)\leq\cdots$

## Nilpotent Group

A group $G$ is nilpotent if $C_n(G)=G$ for some $n$.

## Abelian Group is Nilpotent

Every abelian group $G$ is nilpotent since $G=C(G)=C_1(G)$.

## Every finite $p$-group is nilpotent (Proof)

$G$ and all its nontrivial quotients are $p$-groups, and therefore have non-trivial centers.

Hence if $G\neq C_i(G)$, then $G/C_i(G)$ is a $p$-group, and $C(G/C_i(G))$ is non-trivial. Thus $C_{i+1}(G)$, the inverse image of $C(G/C_i(G))$ under $\pi:G\to G/C_i(G)$, strictly contains $C_i(G)$.

Since $G$ is finite, $C_n(G)$ must be $G$ for some $n$.
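The proof can be watched in action on the 2-group $D_4$ of order 8. The sketch below is my own; the encoding of $D_4$ as pairs $(r,s)\mapsto a^rb^s$ with $a^4=b^2=1$, $ba=a^{-1}b$ is an assumption of the example. It computes $C_1(G)\subsetneq C_2(G)=G$:

```python
N = 4  # D_4 = {a^r b^s : 0 <= r < 4, s in {0,1}}, order 8

def mul(x, y):
    """(a^{r1} b^{s1})(a^{r2} b^{s2}) = a^{r1 + (-1)^{s1} r2} b^{s1+s2}."""
    (r1, s1), (r2, s2) = x, y
    return ((r1 + (-1) ** s1 * r2) % N, (s1 + s2) % 2)

def inv(x):
    r, s = x
    return (r, 1) if s else ((-r) % N, 0)  # reflections are involutions

G = [(r, s) for r in range(N) for s in range(2)]

series, C = [], {(0, 0)}   # start from the trivial subgroup
while len(C) < len(G):
    # C_{i+1} = preimage of C(G/C_i): all x with [x, g] in C_i for every g
    C = {x for x in G
         if all(mul(mul(x, g), inv(mul(g, x))) in C for g in G)}
    series.append(len(C))

assert series == [2, 8]  # C_1 = center {1, a^2}, C_2 = G: D_4 is nilpotent
```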

## Existence of Splitting Field with degree less than n!

If $K$ is a field and $f\in K[X]$ has degree $n\geq 1$, then there exists a splitting field $F$ of $f$ with $[F:K]\leq n!$.

Proof:

We use induction on $n=\deg f$.

Base case: If $n=1$, or if $f$ splits over $K$, then $F=K$ is a splitting field with $[F:K]=1\leq 1!$.

Induction Hypothesis: Assume the statement is true for degree $n-1$, where $n>1$.

If $n=\deg f>1$ and $f$ does not split over $K$, let $g\in K[X]$ be an irreducible factor of $f$ with $\deg g>1$. Let $u$ be a root of $g$ (in some extension field of $K$); then $\displaystyle [K(u):K]=\deg g>1.$

Write $f=(x-u)h$ with $h\in K(u)[X]$ of degree $n-1$. By induction hypothesis, there exists a splitting field $F$ of $h$ over $K(u)$ with $[F:K(u)]\leq(n-1)!$.

That is, $h=u_0(x-u_1)\dots(x-u_{n-1})$ with $u_i\in F$ and $F=K(u)(u_1,\dots,u_{n-1})=K(u,u_1,\dots,u_{n-1})$. Thus $f=u_0(x-u)(x-u_1)\dots(x-u_{n-1})$, so $f$ splits over $F$.

This shows $F$ is a splitting field of $f$ over $K$ of degree
\begin{aligned} [F:K]&=[F:K(u)][K(u):K]\\ &\leq (n-1)!(\deg g)\\ &\leq n! \end{aligned}
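For instance (a standard example, not from the proof above), for $f=x^4-2$ over $\mathbb{Q}$ a splitting field is $F=\mathbb{Q}(\sqrt[4]{2},i)$ with $[F:\mathbb{Q}]=[F:\mathbb{Q}(\sqrt[4]{2})][\mathbb{Q}(\sqrt[4]{2}):\mathbb{Q}]=2\cdot 4=8\leq 4!=24$, so the bound $n!$ need not be attained.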

## Counterexamples to Normal Extension

Let $K\subseteq L\subseteq M$ be a tower of fields.

Q1) If $M/K$ is a normal extension, is $L/K$ a normal extension?

False. Let $M$ be the algebraic closure of $K=\mathbb{Q}$. Let $L=\mathbb{Q}(\sqrt[3]{2})$.

Then $M$ is certainly a normal extension of $\mathbb{Q}$ since every irreducible polynomial in $\mathbb{Q}[X]$ that has one root in $M$ has all of its roots in $M$.

However consider $X^3-2\in\mathbb{Q}[X]$. It has one root $\sqrt[3]{2}$ in $L$, but its two non-real roots are not in $L$ (a subfield of $\mathbb{R}$). Thus $L/K$ is not a normal extension.

Q2) If $M/L$ and $L/K$ are both normal extensions, is $M/K$ a normal extension? (i.e. is normality transitive?)

False. Let $L=\mathbb{Q}(\sqrt 2)$, $K=\mathbb{Q}$. Then $L/K$ is normal since $L$ is the splitting field of $X^2-2$ over $\mathbb{Q}$.

Let $M=\mathbb{Q}(\sqrt 2,\sqrt[4]{2})$. Then $M/L$ is normal since $M$ is the splitting field of $X^2-\sqrt 2$ over $L$.

However, $M/K$ is not normal. The polynomial $X^4-2$ has a root in $M$ (namely $\sqrt[4]{2}$), but its two non-real roots $\pm i\sqrt[4]{2}$ are not in $M$ (a subfield of $\mathbb{R}$).

## Conditions for S^-1I=S^-1R (Ring of quotients)

Conditions for $S^{-1}I=S^{-1}R$:
Let $S$ be a multiplicative subset of a commutative ring $R$ with identity and let $I$ be an ideal of $R$. Then $S^{-1}I=S^{-1}R$ if and only if $S\cap I\neq\varnothing$.

Proof
(H pg 146)

$(\implies)$ Assume $S^{-1}I=S^{-1}R$. Then $1_{S^{-1}R}=a/s$ for some $a\in I$, $s\in S$. Since also $1_{S^{-1}R}=s/s$, we have $s_1(s^2-as)=0$ for some $s_1\in S$, i.e.\ $s^2s_1=ass_1$. But $s^2s_1\in S$ and $ass_1\in I$, so $S\cap I\neq\varnothing$.

$(\impliedby)$ If $s\in S\cap I$, then $1_{S^{-1}R}=s/s\in S^{-1}I$. Note that for any $r/s\in S^{-1}R$, $\displaystyle (\frac{r}{s})(\frac{s}{s})=\frac{rs}{s^2}=\frac{r}{s}\in S^{-1}I$ since $rs\in I$ and $s^2\in S$. Thus $S^{-1}I=S^{-1}R$.

## Local Ring Equivalent Conditions

If $R$ is a commutative ring with 1 then the following conditions are equivalent.
(i) $R$ is a local ring, that is, a commutative ring with 1 which has a unique maximal ideal.
(ii) All nonunits of $R$ are contained in some ideal $M\neq R$.
(iii) The nonunits of $R$ form an ideal.

Proof
(H pg 147)

(i)$\implies$(ii): If $a\in R$ is a nonunit, then $(a)\neq R$ since $1\notin (a)$. Therefore $(a)$ (and hence $a$) is contained in the unique maximal ideal $M$ of $R$, since $M$ must contain every ideal of $R$ (except $R$ itself).

(ii)$\implies$(iii): Let $S$ be the set of all nonunits of $R$. We have $S\subseteq M\neq R$. Let $x\in M$. Since $M\neq R$, $x$ cannot be a unit. So $x\in S$. Thus $M\subseteq S$. Hence $S=M$, which is an ideal.

(iii)$\implies$(i): Assume $S$, the set of nonunits, forms an ideal. Note $S\neq R$ since $1_R\notin S$. Let $I\neq R$ be any maximal ideal. If $a\in I$, then $a$ cannot be a unit, so $a\in S$. Thus $I\subseteq S\neq R$, and by maximality $S=I$. This shows $S$ is the unique maximal ideal.
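Condition (iii) is easy to test for the rings $\mathbb{Z}/n$; here is a small sketch (function names are mine):

```python
def units(n):
    """Units of Z/n."""
    return {x for x in range(n) if any(x * y % n == 1 for y in range(n))}

def nonunits_form_ideal(n):
    """Condition (iii) for Z/n: nonunits closed under + and under scaling."""
    non = set(range(n)) - units(n)
    return (all((a + b) % n in non for a in non for b in non)
            and all(r * a % n in non for r in range(n) for a in non))

assert nonunits_form_ideal(8)       # Z/8 is local: nonunits {0,2,4,6} = (2)
assert not nonunits_form_ideal(6)   # Z/6 is not local: 2 + 3 = 5 is a unit
```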

## Normalizer of Normalizer of Sylow p-subgroup

The normalizer of a Sylow p-subgroup is “self-normalizing”, i.e. its normalizer is itself. Something that is quite cool.

If $P$ is a Sylow $p$-subgroup of a finite group $G$, then $N_G(N_G(P))=N_G(P)$.

Proof

Let $N=N_G(P)$ and let $x\in N_G(N)$, so that $xNx^{-1}=N$. Then $xPx^{-1}\subseteq xNx^{-1}=N$, so $xPx^{-1}$ is a Sylow $p$-subgroup of $N$. Since $P$ is normal in $N$, $P$ is the only Sylow $p$-subgroup of $N$. Therefore $xPx^{-1}=P$, which implies $x\in N$. We have proved $N_G(N_G(P))\subseteq N_G(P)$.

Let $y\in N_G(P)$. Then certainly $yN_G(P)y^{-1}=N_G(P)$, so that $y\in N_G(N_G(P))$. Thus $N_G(P)\subseteq N_G(N_G(P))$.
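The theorem can be verified by brute force in $S_4$; below is a sketch of mine with $P=\langle(0\ 1\ 2)\rangle$ a Sylow 3-subgroup:

```python
from itertools import permutations

def compose(p, q):
    """(p o q)(i) = p[q[i]] for permutations stored as tuples."""
    return tuple(p[i] for i in q)

def inverse(p):
    r = [0] * len(p)
    for i, v in enumerate(p):
        r[v] = i
    return tuple(r)

G = list(permutations(range(4)))  # S_4

def normalizer(S):
    """All g in G with g S g^{-1} = S."""
    return {g for g in G
            if {compose(compose(g, s), inverse(g)) for s in S} == S}

c = (1, 2, 0, 3)                       # the 3-cycle (0 1 2)
P = {(0, 1, 2, 3), c, compose(c, c)}   # Sylow 3-subgroup, order 3

Np = normalizer(P)
assert len(Np) == 6          # [G : N_G(P)] = 4 = number of Sylow 3-subgroups
assert normalizer(Np) == Np  # N_G(N_G(P)) = N_G(P): self-normalizing
```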

## Index of smallest prime dividing $|G|$ implies Normal Subgroup

I have previously proved this at: Advanced Method for Proving Normal Subgroup. This is a neater, slightly shorter proof of the same theorem.

Index of smallest prime dividing $|G|$ implies Normal Subgroup
If $H$ is a subgroup of a finite group $G$ of index $p$, where $p$ is the smallest prime dividing the order of $G$, then $H$ is normal in $G$.

Proof:
(Hungerford pg 91)

Let $G$ act on the set $G/H$ (left cosets of $H$ in $G$) by left translation.

This induces a homomorphism $\sigma: G\to S_{G/H}\cong S_p$, where $\sigma_g(xH)=gxH$. Let $g\in\ker\sigma$. Then $gxH=xH$ for all $xH\in G/H$. In particular, when $x=1$, $gH=H$ which implies $g\in H$. So we have $\ker\sigma\subseteq H$.

Let $K=\ker\sigma$. By the First Isomorphism Theorem, $G/K\cong\text{Im}\,\sigma\leq S_p$. Hence $|G/K|$ divides $|S_p|=p!$. But every divisor of $|G/K|=[G:K]$ must divide $|G|=|K|[G:K]$. Since no prime smaller than $p$ divides $|G|$, and $p$ divides $p!$ only to the first power, we must have $|G/K|=p$ or $1$. However $\displaystyle |G/K|=[G:K]=[G:H][H:K]=p[H:K]\geq p.$

Therefore $|G/K|=p$ and $[H:K]=1$, hence $H=K$. But $K=\ker\sigma$ is normal in $G$, so $H$ is normal in $G$.

## Normal Extension

An algebraic field extension $L/K$ is said to be normal if $L$ is the splitting field of a family of polynomials in $K[X]$.

Equivalent Properties
The normality of $L/K$ is equivalent to either of the following properties. Let $K^a$ be an algebraic closure of $K$ containing $L$.

1) Every embedding $\sigma$ of $L$ in $K^a$ that restricts to the identity on $K$ satisfies $\sigma(L)=L$. In other words, $\sigma$ is an automorphism of $L$ over $K$.
2) Every irreducible polynomial in $K[X]$ that has one root in $L$, has all of its roots in $L$, that is, it decomposes into linear factors in $L[X]$. (One says that the polynomial splits in $L$.)

## Some Linear Algebra Theorems


Diagonalizable & Minimal Polynomial:
A matrix or linear map is diagonalizable over the field $F$ if and only if its minimal polynomial is a product of distinct linear factors over $F$.

Characteristic Polynomial:
Let $A$ be an $n\times n$ matrix. The characteristic polynomial of $A$, denoted by $p_A(t)$, is the polynomial defined by $\displaystyle p_A(t)=\det(tI-A).$

Cayley-Hamilton Theorem:
Every square matrix over a commutative ring satisfies its own characteristic equation:

If $A$ is an $n\times n$ matrix, $p(A)=0$ where $p(\lambda)=\det(\lambda I_n-A)$.
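The Cayley–Hamilton theorem is easy to check numerically; here is a quick sketch using NumPy (the specific matrix is an arbitrary choice of mine):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])

# np.poly returns the coefficients of p_A(t) = det(tI - A), highest degree first;
# for this A, p_A(t) = t^2 - 5t - 2 (trace 5, determinant -2)
p = np.poly(A)

# evaluate the characteristic polynomial at the matrix itself
pA = p[0] * A @ A + p[1] * A + p[2] * np.eye(2)
assert np.allclose(pA, 0)  # Cayley-Hamilton: p_A(A) = 0
```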

## Image and Preimage of Sylow p-subgroups under Epimorphism

Suppose $G$ and $H$ are finite groups, and $\phi:G\to H$ is a surjective homomorphism.

Then for any Sylow $p$-subgroup $P$ of $G$, $\phi(P)$ is a Sylow $p$-subgroup of $H$.

Conversely, for any Sylow $p$-subgroup $Q$ of $H$, $Q=\phi(P)$ for some Sylow $p$-subgroup $P$ of $G$.

Proof:

By the First Isomorphism Theorem, $G/\ker\phi\cong\phi(G)=H$. Write $N=\ker\phi$. Then $\phi(P)=\{pN:p\in P\}=PN/N$.

Since $P$ is a Sylow $p$-subgroup of $G$, $[G:P]$ is relatively prime to $p$. Thus, $[G:PN]=[G:P]/[PN:P]$ is also relatively prime to $p$.

Then $\displaystyle [H:\phi(P)]=[G/N:PN/N]=[G:PN]$ is also relatively prime to $p$. Since $\phi(P)\cong P/\ker\phi|_P$, $\phi(P)$ is a $p$-group, so $\phi(P)$ is a Sylow $p$-subgroup of $H$.

Part 2: Let $Q$ be a Sylow $p$-subgroup of $H\cong G/N$. Then by Correspondence Theorem, $Q\cong K/N$ for some subgroup $K$ with $N\subseteq K\subseteq G$.

Then $[G:K]=[H:Q]$ is relatively prime to $p$, so $K$ contains a Sylow $p$-subgroup $P$ of $G$.

Consider $\phi(P)=PN/N\subseteq K/N\cong Q$. By the previous part, $\phi(P)$ is a Sylow $p$-subgroup of $H$, so $\phi(P)=Q$.

## Aut(G)=Aut(H)xAut(K), where H, K are characteristic subgroups of G with trivial intersection

Let $G=HK$, where $H$, $K$ are characteristic subgroups of $G$ with trivial intersection, i.e. $H\cap K=\{e\}$. Then $\text{Aut}(G)\cong\text{Aut}(H)\times\text{Aut}(K)$.

Proof:

Now suppose $G=HK$, where $H$ and $K$ are characteristic subgroups of $G$ with $H\cap K=\{e\}$. Define $\Psi:\text{Aut}(G)\to\text{Aut}(H)\times\text{Aut}(K)$ by $\displaystyle \Psi(\sigma)=(\sigma|_H, \sigma|_K).$

$\sigma|_H:H\to H$ is a homomorphism; it is injective (being a restriction of the injective map $\sigma$) and surjective since $\sigma(H)=H$ ($H$ is characteristic). Thus $\sigma|_H\in\text{Aut}(H)$ and similarly $\sigma|_K\in\text{Aut}(K)$, so that $\Psi$ is well-defined.

Note that $\displaystyle \Psi(\sigma_1\sigma_2)=((\sigma_1\sigma_2)|_H, (\sigma_1\sigma_2)|_K)=(\sigma_1|_H,\sigma_1|_K)(\sigma_2|_H,\sigma_2|_K)=\Psi(\sigma_1)\Psi(\sigma_2)$ so $\Psi$ is a homomorphism.

Suppose $\sigma\in\ker\Psi$. Then $\Psi(\sigma)=(\sigma|_H,\sigma|_K)=(\text{id}_H,\text{id}_K)$. Then for $hk\in G$, $\sigma(hk)=\sigma(h)\sigma(k)=hk$ so that $\sigma=\text{id}_G$. Thus $\Psi$ is injective.

For any $(\phi, \psi)\in\text{Aut}(H)\times\text{Aut}(K)$, define $\sigma:G\to G$ by $\sigma(hk)=\phi(h)\psi(k)$. This is well-defined since every element of $G=HK$ has a unique expression $hk$ with $h\in H$, $k\in K$ (because $H\cap K=\{e\}$). Then
\begin{aligned} \sigma(h_1k_1h_2k_2)&=\sigma(h_1h_2k_1k_2)\\ &\text{(}H, K\ \text{normal and}\ H\cap K=\{e\}\ \text{implies elements of}\ H, K\ \text{commute)}\\ &=\phi(h_1h_2)\psi(k_1k_2)\\ &=\phi(h_1)\phi(h_2)\psi(k_1)\psi(k_2)\\ &=\phi(h_1)\psi(k_1)\phi(h_2)\psi(k_2)\\ &=\sigma(h_1k_1)\sigma(h_2k_2). \end{aligned}
So $\sigma$ is a homomorphism.

If $hk\in\ker\sigma$, then $\phi(h)\psi(k)=e$, so that $\phi(h)=(\psi(k))^{-1}$. Then since $H\cap K=\{e\}$, so $\phi(h)=\psi(k)=e$, so that $h=k=e$. Thus $\ker\sigma=\{e\}$ and $\sigma$ is injective.

Any $h\in H$ can be written as $\phi(h')$ since $\phi$ is bijective. Similarly, any $k\in K$ can be written as $\psi(k')$. Then $\sigma(h'k')=\phi(h')\psi(k')=hk$ so $\sigma$ is surjective.

Thus $\sigma\in\text{Aut}(G)$. Note that $\sigma|_H=\phi$ since $\sigma|_H(h)=\sigma(h\cdot 1)=\phi(h)\psi(1)=\phi(h)$. Similarly, $\sigma|_K=\psi$. So $\Psi(\sigma)=(\phi,\psi)$ and $\Psi$ is surjective.

Hence $\Psi$ is an isomorphism.
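A tiny instance of the theorem (my own sketch, using the standard fact that automorphisms of $\mathbb{Z}_n$ are multiplication by units): take $G=\mathbb{Z}_6$ with characteristic subgroups $H=\langle 3\rangle\cong\mathbb{Z}_2$ and $K=\langle 2\rangle\cong\mathbb{Z}_3$, which intersect trivially.

```python
from math import gcd

def aut_order(n):
    """|Aut(Z_n)|: automorphisms of Z_n are x -> ax with gcd(a, n) = 1."""
    return sum(1 for a in range(1, n + 1) if gcd(a, n) == 1)

# G = Z_6 = HK with H = <3> ~ Z_2 and K = <2> ~ Z_3 characteristic and
# H ∩ K = {0}; the theorem predicts Aut(Z_6) ~ Aut(Z_2) x Aut(Z_3)
assert aut_order(6) == aut_order(2) * aut_order(3) == 2
```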

## How to Remember the 8 Vector Space Axioms

A vector space has a total of 8 axioms, most of which are common sense, but they can still pose a challenge to memorize by heart.

I created a mnemonic “MAD” which helps to remember them.

M for Multiplicative Axioms:

1. $1x=x$ (Scalar Multiplication identity)
2. $(ab)x=a(bx)$ (Associativity of Scalar Multiplication)

A for Additive Axioms: (Note that these are precisely the axioms for an abelian group)

1. $x+y=y+x$ (Commutativity)
2. $(x+y)+z=x+(y+z)$ (Associativity for Vector Addition)
3. $x+(-x)=0$ (Existence of Additive Inverse)
4. $x+0=0+x=x$ (Additive Identity)

D for Distributive Axioms:

1. $a(x+y)=ax+ay$ (Distributivity of vector sums)
2. $(a+b)x=ax+bx$ (Distributivity of scalar sums)

## How to Remember the 10 Field Axioms

There are a total of 10 axioms for a field; it can be quite a challenge to remember all 10 of them offhand.

I created a mnemonic “ACIDI” to remember the 10 axioms. Unfortunately it is not a real word, but is close to the word “acidic”. A picture to remember is “acidic field”, a grass field polluted by acid rain?! 😛

A: Associativity
C: Commutativity
I: Identity
D: Distributivity
I: Inverse

Each of the properties has two parts – Addition and Multiplication. This table from Wolfram summarizes it perfectly:

| name | addition | multiplication |
| --- | --- | --- |
| associativity | $(a+b)+c=a+(b+c)$ | $(ab)c=a(bc)$ |
| commutativity | $a+b=b+a$ | $ab=ba$ |
| distributivity | $a(b+c)=ab+ac$ | $(a+b)c=ac+bc$ |
| identity | $a+0=a=0+a$ | $a\cdot 1=a=1\cdot a$ |
| inverses | $a+(-a)=0$ | $aa^{-1}=1$ if $a\neq 0$ |

## Finite group generated by two elements of order 2 is isomorphic to Dihedral Group

Suppose $G=\langle s,t\rangle$ is finite, where both $s$ and $t$ have order 2. Prove that $G$ is isomorphic to $D_{2m}$ for some integer $m$.

Note that $G=\langle st, t\rangle$ since $(st)t=s$. Since $G$ is finite, $st$ has a finite order, say $m$, so that $(st)^m=1_G$. We also have $[(st)t]^2=s^2=1$.

We claim that there are no other relations, other than $(st)^m=t^2=[(st)t]^2=1$.

Suppose to the contrary that $sts=1$. Then $t=s(sts)s=s\cdot 1\cdot s=1$, a contradiction. Similarly, if $ststs=1$, then $tst=s(ststs)s=1$, and hence $s=t(tst)t=1$, a contradiction. Inductively, $(st)^ks\neq 1$ and $(ts)^kt\neq 1$ for any $k\geq 1$.

Thus $\displaystyle G\cong D_{2m}=\langle a,b|a^m=b^2=(ab)^2=1\rangle.$
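This can be illustrated with SymPy. A sketch: the two permutations below are hypothetical reflections of a square, both of order 2, and the group they generate has order $2m$ where $m$ is the order of $st$:

```python
from sympy.combinatorics import Permutation, PermutationGroup

# Two reflections (both of order 2), chosen as an example
s = Permutation([1, 0, 3, 2])   # (0 1)(2 3)
t = Permutation([0, 3, 2, 1])   # (1 3)
G = PermutationGroup([s, t])

m = (s * t).order()             # st acts like a rotation; here m = 4
assert s.order() == 2 and t.order() == 2
assert G.order() == 2 * m       # |G| = 2m, consistent with G ≅ D_{2m}
```

Here $G$ turns out to be the dihedral group of order 8, the symmetry group of the square.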

## Three Properties of Galois Correspondence

The Fundamental Theorem of Galois Theory states that:

Given a field extension $E/F$ that is finite and Galois, there is a one-to-one correspondence between its intermediate fields and subgroups of its Galois group.
1) $H\leftrightarrow E^H$ where $H\leq\text{Gal}(E/F)$ and $E^H$ is the corresponding fixed field (the set of those elements in $E$ which are fixed by every automorphism in $H$).
2) $K\leftrightarrow\text{Aut}(E/K)$ where $K$ is an intermediate field of $E/F$ and $\text{Aut}(E/K)$ is the set of those automorphisms in $\text{Gal}(E/F)$ which fix every element of $K$.

This correspondence is a one-to-one correspondence if and only if $E/F$ is a Galois extension.

## Three Properties of the Galois Correspondence

1. It is inclusion-reversing: the inclusion of subgroups $H_1\subseteq H_2$ holds iff the inclusion of fields $E^{H_2}\subseteq E^{H_1}$ holds.
2. If $H$ is a subgroup of $\text{Gal}(E/F)$, then $|H|=[E:E^H]$ and $[\text{Gal}(E/F):H]=[E^H:F]$.
3. The field $E^H$ is a normal extension of $F$ (or equivalently, Galois extension, since any subextension of a separable extension is separable) iff $H$ is a normal subgroup of $\text{Gal}(E/F)$.

## Characterization of Galois Extensions

For a finite extension $E/F$, each of the following statements is equivalent to the statement that $E/F$ is Galois:

1) $E/F$ is a normal extension and a separable extension.
2) Every irreducible polynomial in $F[x]$ with at least one root in $E$ splits over $E$ and is separable.
3) $E$ is a splitting field of a separable polynomial with coefficients in $F$.
4) $|\text{Aut}(E/F)|=[E:F]$, that is, the number of automorphisms equals the degree of the extension.
5) $F$ is the fixed field of $\text{Aut}(E/F)$.

## Fundamental Theorem of Galois Theory

Given a field extension $E/F$ that is finite and Galois, there is a one-to-one correspondence between its intermediate fields and subgroups of its Galois group.
$H\leftrightarrow E^H$

where $H\leq\text{Gal}(E/F)$ and $E^H$ is the corresponding fixed field (the set of those elements in $E$ which are fixed by every automorphism in $H$).
$K\leftrightarrow\text{Aut}(E/K)$

where $K$ is an intermediate field of $E/F$ and $\text{Aut}(E/K)$ is the set of those automorphisms in $\text{Gal}(E/F)$ which fix every element of $K$.

This correspondence is a one-to-one correspondence if and only if $E/F$ is a Galois extension.

Examples
1) $E\leftrightarrow\{\text{id}_E\}$, the trivial subgroup of $\text{Gal}(E/F)$.
2) $F\leftrightarrow\text{Gal}(E/F)$.

## Nonzero Prime Ideals in a PID are Maximal (Proof)

Let $R$ be a PID. Let $(x)$ be a nonzero prime ideal. Suppose $(x)\subsetneq (y)$. Then $x=yr$ for some $r\in R$.

Note that $yr\in(x)$ implies $y\in(x)$ or $r\in(x)$. Since $(x)\neq(y)$, we have $r\in(x)$, so $r=xz$ for some $z\in R$. Then, $\displaystyle x=yr=yxz$ which implies $1=yz$, thus $y$ is a unit. Hence $(y)=R$.

## Class Equation of a Group

The class equation of a group is something that looks difficult at first sight, but is actually very straightforward once you understand it. An amazing equation…

## Class Equation of a Group (Proof)

Suppose $G$ is a finite group, $Z(G)$ is the center of $G$, and $c_1, c_2, \dots, c_r$ are all the conjugacy classes in $G$ comprising the elements outside the center. Let $g_i$ be an element in $c_i$ for each $1\leq i\leq r$. Then we have: $\displaystyle |G|=|Z(G)|+\sum_{i=1}^r[G:C_G(g_i)].$

## Proof:

Let $G$ act on itself by conjugation. The orbits of $G$ partition $G$. Note that each conjugacy class $c_i$ is actually $\text{Orb}(g_i)$.

Let $x\in Z(G)$. Then $gxg^{-1}=xgg^{-1}=x$ for all $g\in G$. Hence $\text{Orb}(x)$ consists of a single element $x$ itself.

Let $g_i\in c_i$. Then
\begin{aligned} \text{Stab}(g_i)&=\{h\in G\mid hg_ih^{-1}=g_i\}\\ &=\{h\in G\mid hg_i=g_ih\}\\ &=C_G(g_i). \end{aligned}
By Orbit-Stabilizer Theorem, $\displaystyle |\text{Orb}(g_i)|=[G:\text{Stab}(g_i)]=[G:C_G(g_i)].$

Therefore, $\displaystyle |G|=|Z(G)|+\sum_{i=1}^r[G:C_G(g_i)].$
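The class equation can be verified computationally. A sketch using SymPy with $G=S_3$, whose center is trivial and whose conjugacy classes have sizes 1, 3, and 2:

```python
from sympy.combinatorics.named_groups import SymmetricGroup

G = SymmetricGroup(3)
Z = G.center()                         # trivial for S_3
# conjugacy classes of size > 1 are exactly those outside the center
noncentral = [c for c in G.conjugacy_classes() if len(c) > 1]

lhs = G.order()
rhs = Z.order() + sum(G.order() // G.centralizer(next(iter(c))).order()
                      for c in noncentral)
assert lhs == rhs                      # 6 = 1 + 3 + 2
```

Each index $[G:C_G(g_i)]$ equals the size of the corresponding conjugacy class, exactly as the Orbit-Stabilizer argument above predicts.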

# Orbit-Stabilizer Theorem

Let $G$ be a group which acts on a finite set $X$. Then $\displaystyle |\text{Orb}(x)|=[G:\text{Stab}(x)]=\frac{|G|}{|\text{Stab}(x)|}.$

## Proof

Define $\phi:G/\text{Stab}(x)\to\text{Orb}(x)$ by $\displaystyle \phi(g\text{Stab}(x))=g\cdot x.$

Well-defined:

Note that $\text{Stab}(x)$ is a subgroup of $G$. If $g\text{Stab}(x)=h\text{Stab}(x)$, then $g^{-1}h\in\text{Stab}(x)$. Thus $g^{-1}hx=x$, which implies $hx=gx$, thus $\phi$ is well-defined.

Surjective:

$\phi$ is clearly surjective.

Injective:

If $\phi(g\text{Stab}(x))=\phi(h\text{Stab}(x))$, then $gx=hx$. Thus $g^{-1}hx=x$, so $g^{-1}h\in\text{Stab}(x)$. Thus $g\text{Stab}(x)=h\text{Stab}(x)$.

By Lagrange’s Theorem, $\displaystyle \frac{|G|}{|\text{Stab}(x)|}=|G/\text{Stab}(x)|=|\text{Orb}(x)|.$

Fields Medallist Prof. Gowers has also written a nice post on the Orbit-Stabilizer Theorem and various proofs.
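A quick numerical illustration with SymPy, using the natural action of $S_4$ on $\{0,1,2,3\}$ (a sketch):

```python
from sympy.combinatorics.named_groups import SymmetricGroup

G = SymmetricGroup(4)        # acts naturally on {0, 1, 2, 3}
orb = G.orbit(0)             # the action is transitive, so the orbit is everything
stab = G.stabilizer(0)       # permutations fixing 0, a copy of S_3
assert len(orb) == G.order() // stab.order()   # 4 == 24 // 6
```

The stabilizer of a point in $S_4$ is (isomorphic to) $S_3$, so the theorem gives $|{\rm Orb}(0)| = 24/6 = 4$, matching the transitivity of the action.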

## Necessary and Sufficient Conditions for Semidirect Product to be Abelian (Proof)

This theorem is pretty basic, but it is useful for constructing non-abelian groups. Basically, once either group is non-abelian, or the homomorphism is non-trivial, the end result is non-abelian!

Theorem: The semidirect product $N\rtimes_\varphi H$ is abelian iff $N$, $H$ are both abelian and $\varphi: H\to\text{Aut}(N)$ is trivial.

Proof:
$(\implies)$

Assume $N\rtimes_\varphi H$ is abelian. Then for any $n_1, n_2\in N$, $h_1, h_2\in H$, we have
\begin{aligned} (n_1, h_1)\cdot(n_2,h_2)&=(n_2,h_2)\cdot(n_1, h_1)\\ (n_1\varphi_{h_1}(n_2), h_1h_2)&=(n_2\varphi_{h_2}(n_1), h_2h_1). \end{aligned}
This implies $h_1h_2=h_2h_1$, thus $H$ is abelian.

Consider the case $n_1=n_2=n$. Then for any $n\in N$, $n\varphi_{h_1}(n)=n\varphi_{h_2}(n)$. Multiplying by $n^{-1}$ on the left gives $\varphi_{h_1}(n)=\varphi_{h_2}(n)$ for any $h_1, h_2\in H$. Thus $\varphi_h(n)=\varphi_e(n)=n$ for all $h\in H$, so $\varphi$ is trivial.

Consider the case where $h_1=h_2=e$. Then we have $n_1n_2=n_2n_1$, so $N$ has to be abelian.

($\impliedby$)

This direction is clear.

## Outer Semidirect Product

Given any two groups $N$ and $H$ and a group homomorphism $\phi:H\to\text{Aut}(N)$, we can construct a new group $N\rtimes_\phi H$, called the (outer) semidirect product of $N$ and $H$ with respect to $\phi$, defined as follows.
(i) The underlying set is the Cartesian product $N\times H$.
(ii) The operation, $\bullet$, is determined by the homomorphism $\phi$:

$\bullet: (N\rtimes_\phi H)\times (N\rtimes_\phi H)\to N\rtimes_\phi H$

$(n_1,h_1)\bullet(n_2,h_2)=(n_1\phi_{h_1}(n_2),h_1h_2)$

for $n_1,n_2\in N$ and $h_1,h_2\in H$.

This defines a group in which the identity element is $(e_N, e_H)$ and the inverse of the element $(n,h)$ is $(\phi_{h^{-1}}(n^{-1}), h^{-1})$.
Pairs $(n, e_H)$ form a normal subgroup isomorphic to $N$, while pairs $(e_N, h)$ form a subgroup isomorphic to $H$.
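The construction above can be sketched in plain Python. Assumptions in this sketch: $N=\mathbb{Z}_n$, $H=\mathbb{Z}_2$, and $\phi_1$ the inversion automorphism $k\mapsto -k$; the resulting semidirect product is the dihedral group of order $2n$:

```python
from itertools import product

n = 4
def phi(h, k):
    """phi_h acting on k in Z_n: identity for h = 0, inversion for h = 1."""
    return k if h == 0 else (-k) % n

def op(x, y):
    """(n1, h1) . (n2, h2) = (n1 + phi_{h1}(n2), h1 + h2)."""
    (n1, h1), (n2, h2) = x, y
    return ((n1 + phi(h1, n2)) % n, (h1 + h2) % 2)

G = list(product(range(n), range(2)))   # underlying set N x H
assert len(G) == 2 * n
# phi is nontrivial, so even though Z_4 and Z_2 are abelian, the product is not:
assert any(op(x, y) != op(y, x) for x in G for y in G)
```

This ties in with the previous post: since $\phi$ is non-trivial, the semidirect product is non-abelian despite both factors being abelian.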

## Inner Semidirect Product (Definition)

Given a group $G$ with identity element $e$, a subgroup $H$, and a normal subgroup $N\lhd G$; then the following statements are equivalent:

(i) $G$ is the product of subgroups, $G=NH$, where the subgroups have trivial intersection, $N\cap H=\{e\}$.
(ii) For every $g\in G$, there are unique $n\in N$ and $h\in H$, such that $g=nh$.

If these statements hold, we define $G$ to be the semidirect product of $N$ and $H$, written $G=N\rtimes H$.

## Inner Semidirect Product Implies Outer Semidirect Product

Suppose we have a group $G$ with $N\lhd G$, $H\leq G$ and every element $g\in G$ can be written uniquely as $g=nh$ where $n\in N$, $h\in H$.

Define $\phi: H\to\text{Aut}(N)$ as the homomorphism given by $\phi(h)=\phi_h$, where $\phi_h(n)=hnh^{-1}$ for all $n\in N$, $h\in H$.

Then $G$ is isomorphic to the semidirect product $N\rtimes_{\phi}H$, and applying the isomorphism to the product, $nh$, gives the tuple, $(n,h)$. In $G$, we have
$\displaystyle (n_1h_1)(n_2h_2)=n_1h_1n_2(h_1^{-1}h_1)h_2=(n_1\phi_{h_1}(n_2))(h_1h_2)=(n_1,h_1)\cdot(n_2,h_2)$
which shows that the above map is indeed an isomorphism.

# Sylow Theorems

Let $G$ be a finite group.

## Theorem 1

For every prime factor $p$ with multiplicity $n$ of the order of $G$, there exists a Sylow $p$-subgroup of $G$, of order $p^n$.

## Theorem 2

All Sylow $p$-subgroups of $G$ are conjugate to each other, i.e. if $H$ and $K$ are Sylow $p$-subgroups of $G$, then there exists an element $g\in G$ with $g^{-1}Hg=K$.

## Theorem 3

Let $p$ be a prime such that $|G|=p^nm$, where $p\nmid m$. Let $n_p$ be the number of Sylow $p$-subgroups of $G$. Then:
1) $n_p\mid m$, which is the index of the Sylow $p$-subgroup in $G$.
2) $n_p\equiv 1\pmod p$.

## Theorem 3b (Proof)

We have $n_p=[G:N_G(P)]$, where $P$ is any Sylow $p$-subgroup of $G$ and $N_G$ denotes the normalizer.

### Proof

Let $P$ be a Sylow $p$-subgroup of $G$ and let $G$ act on $\text{Syl}_p(G)$ by conjugation. We have $|\text{Orb}(P)|=n_p$, $\text{Stab}(P)=\{g\in G:gPg^{-1}=P\}=N_G(P)$.

By the Orbit-Stabilizer Theorem, $|\text{Orb}(P)|=[G:\text{Stab}(P)]$, thus $n_p=[G:N_G(P)]$.
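This count can be checked by brute force in SymPy. A sketch: `sylow_subgroup` returns one Sylow $p$-subgroup, and we count its distinct conjugates directly (which, by Theorem 2, is all of them):

```python
from sympy.combinatorics import PermutationGroup
from sympy.combinatorics.named_groups import SymmetricGroup

G = SymmetricGroup(4)        # |G| = 24 = 2^3 * 3
P = G.sylow_subgroup(3)      # one Sylow 3-subgroup (order 3)

# n_3 = number of conjugates of P = [G : N_G(P)], counted by brute force
conjugates = {frozenset(PermutationGroup([g * x * g**-1 for x in P.generators]).elements)
              for g in G.elements}
n3 = len(conjugates)
assert n3 == 4               # the four subgroups generated by 3-cycles
```

Indeed $n_3 = 4$ satisfies both parts of Theorem 3: $4 \mid 8$ and $4 \equiv 1 \pmod 3$.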

## Orbit-Stabilizer Theorem

Let $G$ be a group which acts on a finite set $X$. Then $\displaystyle |\text{Orb}(x)|=[G:\text{Stab}(x)]=\frac{|G|}{|\text{Stab}(x)|}.$

# Fundamental Theorem of Finitely Generated Abelian Groups

## Primary decomposition

Every finitely generated abelian group $G$ is isomorphic to a group of the form $\displaystyle \mathbb{Z}^n\oplus\mathbb{Z}_{q_1}\oplus\dots\oplus\mathbb{Z}_{q_t}$ where $n\geq 0$ and $q_1,\dots,q_t$ are powers of (not necessarily distinct) prime numbers. The values of $n, q_1, \dots, q_t$ are (up to rearrangement) uniquely determined by $G$.

## Invariant factor decomposition

We can also write $G$ as a direct sum of the form $\displaystyle \mathbb{Z}^n\oplus\mathbb{Z}_{k_1}\oplus\dots\oplus\mathbb{Z}_{k_u},$ where $k_1\mid k_2\mid k_3\mid\dots\mid k_u$. Again the rank $n$ and the invariant factors $k_1,\dots,k_u$ are uniquely determined by $G$.
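The invariant factors can be computed mechanically from a relation matrix via the Smith normal form. A sketch using SymPy, taking the hypothetical example $\mathbb{Z}_4\oplus\mathbb{Z}_6$, whose invariant factor form is $\mathbb{Z}_2\oplus\mathbb{Z}_{12}$:

```python
from sympy import Matrix, ZZ
from sympy.matrices.normalforms import smith_normal_form

# Relation matrix presenting Z_4 + Z_6
M = Matrix([[4, 0], [0, 6]])
D = smith_normal_form(M, domain=ZZ)
# Invariant factors 2 | 12, i.e. Z_4 + Z_6 is isomorphic to Z_2 + Z_12
assert sorted(abs(D[i, i]) for i in range(2)) == [2, 12]
```

The diagonal entries of the Smith normal form satisfy $k_1 \mid k_2$, which is exactly the divisibility chain in the theorem.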

## Galois Group of Polynomial

Separable Polynomial
A polynomial over $F$ is said to be separable if it has no multiple roots (i.e., all its roots are distinct).

Galois Group of Polynomial
Let $f(x)$ be a separable polynomial over $F$. Let $K$ be the splitting field over $F$ of $f(x)$. Then the Galois group of $f(x)$ over $F$ is defined to be $\text{Gal}(K/F)$.

## Eisenstein’s Criterion

Let $f(x)=a_nx^n+a_{n-1}x^{n-1}+\dots+a_1x+a_0$ be a polynomial in $\mathbb{Z}[x]$. If there exists a prime $p$ such that:

(i) $p\mid a_i$ for $i\neq n$,
(ii) $p\nmid a_n$, and
(iii) $p^2\nmid a_0$

then $f$ is irreducible over $\mathbb{Q}$.

One way to remember Eisenstein’s Criterion is to remember this classic application to show the irreducibility of the cyclotomic polynomials (after substituting $x+1$ for $x$):

$\displaystyle\frac{(x+1)^p-1}{x}=x^{p-1}+{p\choose{p-1}}x^{p-2}+\dots+{p\choose 2}x+{p\choose 1}$.
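This application can be checked directly in SymPy (a sketch, taking $p=7$ as an example prime): the shifted polynomial satisfies all three Eisenstein conditions and is irreducible.

```python
from sympy import symbols, expand, Poly

x = symbols('x')
p = 7
f = Poly(expand(((x + 1)**p - 1) / x), x)    # the p-th cyclotomic poly at x+1

coeffs = f.all_coeffs()                      # leading coefficient first
assert coeffs[0] == 1                        # (ii): p does not divide a_n
assert all(c % p == 0 for c in coeffs[1:])   # (i): p divides every other a_i
assert coeffs[-1] % p**2 != 0                # (iii): p^2 does not divide a_0
assert f.is_irreducible
```

The lower coefficients are the binomial coefficients $\binom{p}{k}$ for $1\leq k\leq p-1$, each divisible by $p$, while the constant term is $\binom{p}{1}=p$, which is not divisible by $p^2$.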

## Finite extension is Algebraic extension (Proof) + “Converse”

These two are useful lemmas in Galois/Field Theory.

Finite extension is Algebraic extension (Proof)
Let $L/K$ be a finite field extension. Then $L/K$ is an algebraic extension.
Proof:
Let $L/K$ be a finite extension, where $[L:K]=n$. Let $\alpha\in L$. Consider $\{1,\alpha,\alpha^2,\dots,\alpha^n\}$ which has to be linearly dependent over $K$ since there are $n+1$ elements. Thus, there exists $c_i\in K$ (not all zero) such that $\sum_{i=0}^n c_i\alpha^i=0$, so $\alpha$ is algebraic over $K$.

Finitely Generated Algebraic Extension is Finite (Proof)
Let $L/K$ be a finitely generated algebraic extension. Then $L/K$ is a finite extension.

Proof:
Since $L/K$ is finitely generated, $L=K(\alpha_1,\dots,\alpha_n)$ for some $\alpha_1,\dots,\alpha_n\in L$. Since $L/K$ is algebraic, each $\alpha_i$ is algebraic over $K$. Denote $L_i:=K(\alpha_1,\dots,\alpha_i)$ for $1\leq i\leq n$. Then $L_i=L_{i-1}(\alpha_i)$ for each $i$. Since $\alpha_i$ is algebraic over $K$, it is also algebraic over $L_{i-1}$, so there exists a polynomial $g_i$ with coefficients in $L_{i-1}$ such that $g_i(\alpha_i)=0$. Thus $[L_i:L_{i-1}]\leq\deg g_i<\infty$. Similarly $[L_1:K]<\infty$. By the Tower Law, $[L:K]=[L_n:L_{n-1}][L_{n-1}:L_{n-2}]\cdots[L_1:K]<\infty$.

## Subgroup Isomorphic but Quotient Group Not Isomorphic

The following is a slightly shocking counterexample for beginning students of Group Theory: If $G$ is a group, and $H\cong K$ are normal subgroups of $G$, it may be possible that $G/H\not\cong G/K$!

Counter-example: Take $G=\mathbb{Z}/4\times\mathbb{Z}/2$, $H=\langle (0,1)\rangle$, $K=\langle (2,0)\rangle$.

Note that $H\cong K\cong\mathbb{Z}/2$, but $G/H\cong\mathbb{Z}/4$ (with coset representatives $(0,0), (1,0), (2,0), (3,0)$) while $G/K\cong\mathbb{Z}/2\times\mathbb{Z}/2$ (with coset representatives $(0,0), (0,1), (1,0), (1,1)$)!

(By $\mathbb{Z}/n$ we mean $\mathbb{Z}/n\mathbb{Z}$.)
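A quick brute-force check in plain Python (a sketch; `coset_order` is a helper defined here, not a library function) confirms the two quotients are not isomorphic by comparing element orders:

```python
from itertools import product

# G = Z/4 x Z/2 with componentwise addition
G = list(product(range(4), range(2)))
def add(a, b):
    return ((a[0] + b[0]) % 4, (a[1] + b[1]) % 2)

def coset_order(g, S):
    """Order of the coset g + S in G/S: least k >= 1 with k*g in S."""
    k, x = 1, g
    while x not in S:
        x, k = add(x, g), k + 1
    return k

H = {(0, 0), (0, 1)}   # <(0,1)>
K = {(0, 0), (2, 0)}   # <(2,0)>
assert max(coset_order(g, H) for g in G) == 4   # G/H cyclic of order 4
assert max(coset_order(g, K) for g in G) == 2   # G/K has exponent 2: Klein four
```

Since $G/H$ contains an element of order 4 while every non-identity element of $G/K$ has order 2, the quotients cannot be isomorphic.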

## Balanced Product

For a ring $R$, a right $R$-module $M$, a left $R$-module $N$, and an abelian group $G$, a map $\phi:M\times N\to G$ is said to be $R$-balanced, if for all $m,m'\in M$, $n,n'\in N$, and $r\in R$ the following hold:

\begin{aligned} \phi(m,n+n')&=\phi(m,n)+\phi(m,n')\\ \phi(m+m',n)&=\phi(m,n)+\phi(m',n)\\ \phi(m\cdot r,n)&=\phi(m,r\cdot n) \end{aligned}

The first two axioms say that $\phi$ is additive in each variable (biadditivity), while the third is something like associativity.

## Endomorphism ring of Q is a division algebra

We show that $\mathbb{Q}$ is neither semisimple nor simple as a $\mathbb{Z}$-module, but $\text{End}_\mathbb{Z}(\mathbb{Q})$ is a division algebra.

Consider $A=\mathbb{Z}$ (as a $\mathbb{Z}$-algebra). Consider $M=\mathbb{Q}$ as a right $\mathbb{Z}$-module.
Lemma:
$\mathbb{Q}$ is neither semisimple nor simple as a $\mathbb{Z}$-module.

Suppose to the contrary that $\mathbb{Q}=\bigoplus_{i\in I}N_i$, where the $N_i$ are simple $\mathbb{Z}$-modules (i.e. $N_i\cong\mathbb{Z}/p_i\mathbb{Z}$ for primes $p_i$). Then any nonzero element $x$ of a summand $N_i$ has finite order $p_i$, which is impossible in $\mathbb{Q}$. Also, $\mathbb{Q}$ is not simple since $\mathbb{Z}$ is a proper nonzero submodule.
Lemma:
$\text{End}_\mathbb{Z}(\mathbb{Q})\cong\mathbb{Q}$ as $\mathbb{Z}$-algebras.

Define $\Psi:\mathbb{Q}\to\text{End}_\mathbb{Z}(\mathbb{Q})$ where $q\in\mathbb{Q}$ is mapped to $\lambda_q\in\text{End}_\mathbb{Z}(\mathbb{Q})$, where $\lambda_q(x)=qx$. Let $k\in\mathbb{Z}$, $q,q_1,q_2\in\mathbb{Q}$.

We can check that $\Psi$ is a $\mathbb{Z}$-algebra homomorphism.

Let $q\in\ker\Psi$. Then $\Psi(q)=\lambda_q=0$, $\lambda_q(x)=qx=0$ for all $x\in\mathbb{Q}$. This implies $q=q\cdot 1=0$. Hence $\Psi$ is injective.

Let $\phi\in\text{End}_\mathbb{Z}(\mathbb{Q})$. Let $x=\frac{a}{b}\in\mathbb{Q}$, where $a,b\in\mathbb{Z}$. $\phi(x)=a\phi(\frac 1b)=\frac ab\cdot b\phi(\frac 1b)=\frac ab\cdot\phi(1)=\phi(1)\cdot x=\lambda_{\phi(1)}(x)$. Hence $\Psi$ is surjective.

Thus $\text{End}_\mathbb{Z}(\mathbb{Q})\cong\mathbb{Q}$ is a division algebra, but $\mathbb{Q}$ is not simple.

## Irreducible representations

Let $\rho:G\to \text{GL}(V)$ be a linear representation of $G$. We say that it is irreducible or simple if $V$ is not 0 and if no vector subspace of $V$ is stable under $G$, except of course 0 and $V$. This is equivalent to saying $V$ is not the direct sum of two representations, except for the trivial decomposition $V=0\oplus V$.

## Idea for making a map bijective

A technique in algebra to make a homomorphism injective is to “mod out” the kernel.

Dually, to make a homomorphism surjective, one can restrict the codomain to the image.

This can be illustrated in the first isomorphism theorem (for groups) $G/\ker\phi\cong\text{Im}\ \phi$.

## Regular Representation of G

Let $g$ be the order of $G$, and let $V$ be a vector space of dimension $g$, with a basis $(e_t)_{t\in G}$ indexed by the elements $t$ of $G$. For $s\in G$, let $\rho_s$ be the linear map of $V$ into $V$ which sends $e_t$ to $e_{st}$; this defines a linear representation, which is called the regular representation of $G$.

## Artin-Whaples Theorem

There seems to be another version of Artin-Whaples Theorem, called the Artin-Whaples Approximation theorem.

The theorem stated here is Artin-Whaples Theorem for central simple algebras.

Artin-Whaples Theorem: Let $A$ be a central simple algebra over a field $F$. Let $a_1,\dots,a_n\in A$ be linearly independent over $F$ and let $b_1,\dots,b_n$ be any elements in $A$. Then there exist $a_r',a_r''\in A$ for $r=1,\dots,m$ (for some $m$) such that the $F$-linear map $f:A\to A$ defined by $f(x)=\sum_{r=1}^m a_r'xa_r''$ satisfies $f(a_j)=b_j$ for all $j=1,\dots,n$.

Very nice and useful theorem.

## Simple Algebra does not imply Semisimple Algebra

The terminology “semisimple” algebra suggests a generalization of simple algebras, but in fact not all simple algebras are semisimple! (Exercises 1 & 5 in Richard Pierce’s book contain examples)

It is true, though, that every simple module is semisimple.

Proposition: For a simple algebra $A$, the following conditions are equivalent:

(i) $A$ is semisimple;

(ii) $A$ is right Artinian;

(iii) $A$ has a minimal right ideal.

Thus to find an algebra that is simple but not semisimple, one can look for an example that is not right Artinian.

## Direct Sum vs Cartesian Product

Excellent explanation found on Math Stackexchange.

Basically for finite index sets (finite number of factors), the two constructions are the same.

When there are infinitely many factors, however, the direct sum $\bigoplus_{i\in I}G_i$ is the proper subgroup of the Cartesian product consisting of all tuples $(g_i)$ in which only finitely many $g_i$ are nonzero.

## If x^2 is in F, x not in F, then x is a Pure Quaternion

Proposition: If $x^2\in Z(A)=F$ and $x\notin F$, then $x$ is a pure quaternion.

Proof: Let $x=c+z$, with $c\in F$ and $z\in A_+$ ($z$ is a pure quaternion).

Then $x^2=c^2+2cz+z^2=c^2-\nu(z)+2cz$. The key observation is that if $z=c_1i+c_2j+c_3k$, then $z^2=-\nu(z)=ac_1^2+bc_2^2-abc_3^2$.

Since $x^2\in F$ and $c^2-\nu(z)\in F$, this means that $2cz=0$, so (assuming $\text{char}\,F\neq 2$) $c=0$ or $z=0$. If $z=0$ then $x=c\in F$, a contradiction; hence $c=0$ and $x=z$ is a pure quaternion.

## Wedderburn’s Structure Theorem

In abstract algebra, the Artin–Wedderburn theorem is a classification theorem for semisimple rings and semisimple algebras. The theorem states that an (Artinian) semisimple ring $R$ is isomorphic to a product of finitely many $n_i$-by-$n_i$ matrix rings over division rings $D_i$, for some integers $n_i$, both of which are uniquely determined up to permutation of the index $i$. In particular, any simple left or right Artinian ring is isomorphic to an $n$-by-$n$ matrix ring over a division ring $D$, where both $n$ and $D$ are uniquely determined. (Wikipedia)

This is quite a powerful theorem, as it allows semisimple rings/algebras to be “represented” by a finite direct sum of matrix rings over division rings. This is in the spirit of Representation theory, which tries to convert algebraic objects into objects in linear algebra, which is relatively well understood.

## Wedderburn’s Structure Theorem

Let $A$ be a semisimple $R$-algebra.

(i) $A\cong M_{n_1}(D_1)\oplus\dots\oplus M_{n_r}(D_r)$ for some natural numbers $n_1,\dots,n_r$ and $R$-division algebras $D_1,\dots,D_r$.

(ii) The pair $(n_i,D_i)$ is unique up to isomorphism and order of arrangement.

(iii) Conversely, suppose $A=M_{n_1}(D_1)\oplus\dots\oplus M_{n_r}(D_r)$, then $A$ is a right (and left) semisimple $R$-algebra.

## Some Notes

The definition of a semisimple $R$-algebra is: An $R$-algebra $A$ is semisimple if $A$ is semisimple as a right $A$-module.

Example: $R=\mathbb{R}$ and $A=M_2(\mathbb{R})\oplus M_2(\mathbb{R})$. Then $A$ is semisimple by Wedderburn’s theorem.

## Semisimple Modules Equivalent Conditions

Proposition:  For a right $A$-module $M$, the following are equivalent:

(i) $M$ is semisimple.

(ii) $M=\sum\{N\in S(M): N\ \text{is simple}\}$.

(iii) $S(M)$ is a complemented lattice, that is, every submodule of $M$ has a complement in $S(M)$.

We are following the notation of Pierce's book Associative Algebras (Graduate Texts in Mathematics), where $S(M)$ is the set of all submodules of $M$.

This proposition can be used to prove two useful corollaries:

Corollary 1) If $M$ is semisimple and $P$ is a submodule of $M$, then both $P$ and $M/P$ are semisimple. In words, this means that submodules and quotients of a semisimple module are again semisimple.

Proof: Since $M$ is semisimple, by (iii) we have $M\cong P\oplus P'$, where $P'\cong M/P$. Let $N$ be a submodule of $P\leq M$. Then $N$ has a complement $N'$ in $S(M)$: $M=N\oplus N'$.

$P=P\cap(N\oplus N')=N\oplus (N'\cap P)$ with $N'\cap P\in S(P)$. Thus, we have that $P$ is semisimple by condition (iii). Similarly, $M/P\cong P'$ is semisimple.

Corollary 2) A direct sum of semisimple modules is semisimple.

Proof: This is quite clear from the definition of semisimple modules as direct sums of simple modules. A direct sum of (direct sums of simple modules) is again a direct sum of simple modules.