# Zaiku Quantum Formalism
## Lecture 1 - Naive Set Theory | 18-09-2020
**ZFC**
- Zermelo Frankel Set Theory - axiomatic system to formulate a theory of sets. Modern mathematics is built on ZFC.
- Avoids paradoxes like [Russell's Paradox](https://en.wikipedia.org/wiki/Russell%27s_paradox) that plagued naive set theory.
**The empty set** $\emptyset$ - has no elements
- Its existence is guaranteed by ZFC
- Is a proper subset of all sets except itself (_No set is a proper subset of itself_).
**Subsets**
- $A \subseteq X$ if every element in $A$ is in $X$.
- Proper subset: $A \subset X$, i.e. $A \subseteq X$ and $A \neq X$
**Disjoint Sets**: $X \cap Y = \emptyset$ - i.e. no elements in common.
**Maps**
$f \colon X \to Y$ is a map from $X$ to $Y$.
- *Domain* of $f$ = $D_f$ = the elements of $X$ on which $f$ is defined.
- *Image* of $f$ = $Im_f = \{f(\psi) \:\:\vert\:\: \psi \in D_f \}$
- *Injective*: $f(\psi_1) = f(\psi_2) \implies \psi_1 = \psi_2$
	- Each element of the codomain is mapped to by *at most* one element of the domain
- *Surjective*: $Im_f = Y$
	- Each element of the codomain is mapped to by *at least* one element of the domain
- *Bijective*: Both of the above
- Each element of the codomain is mapped to by *exactly* one element of the domain
**Isomorphism**
$X$ is (set)-isomorphic to $Y$, i.e. $X \simeq Y$, if there is a bijection between the two sets.
$X \simeq Y, Y \simeq Z \implies X \simeq Z$
**Cardinality**
A non-empty set $X$ is finite if there exists $k \geq 1$ such that $X \simeq \mathbb{N}_k = \{1\dots k\}$.
$k$ is called the *cardinality* of $X$.
$X$ is *infinite* if it contains a *proper* subset $\Delta$ which is isomorphic to $X$
**Countability**
$X$ is countably infinite if $X \simeq \mathbb{N}$. If $X$ is infinite and not isomorphic to $\mathbb{N}$, we say it is uncountably infinite or just uncountable.
If $X$ is countably infinite, its cardinality $\lvert X\rvert = \aleph_0$ (aleph-null).
The cardinality of the set of real numbers $\mathbb{R}$ is $\mathfrak{c}$ (called the cardinality of the continuum).
- [[**continuum hypothesis**]] - there is no set with cardinality strictly between $\aleph_0$ and $\mathfrak{c}$. Independent of ZFC (Gödel, Cohen): it can neither be proved nor disproved from the axioms.
- Set of rational numbers $\mathbb{Q}$ is *countable*
- $g \colon \mathbb{Z}\times \mathbb{N} \to \mathbb{Q} \colon g(m,n) = \frac{m}{n+1}$ is a surjective function from the countable set $\mathbb{Z}\times \mathbb{N}$ to $\mathbb{Q}$ (sketched below).
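A quick Python sketch (my own illustration, not from the lecture): enumerating $g$ over a small grid of $\mathbb{Z}\times\mathbb{N}$ shows every small rational turning up, with repeats, which is the intuition behind the surjectivity argument.
```python
# Illustration: g(m, n) = m / (n + 1) hits every rational (with repetitions)
# as (m, n) ranges over Z x N; here we only scan a finite grid.
from fractions import Fraction

values = {Fraction(m, n + 1) for m in range(-4, 5) for n in range(0, 4)}
print(sorted(values))  # -4, ..., -1/4, 0, 1/4, 1/3, 1/2, ..., 4
```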
**Power Set**
Set of all subsets of a set. Represented by $P(X)$.
If $X$ is finite, $\lvert P(X)\rvert = 2^{\lvert X\rvert}$.
Interesting: $\lvert P(\mathbb{N})\rvert = \lvert \mathbb{R}\rvert$.
**Probability Measure**
See [lecture notes](https://github.com/quantumformalism/2020-math-lectures/blob/master/foundation-module/lecture-01/lecture-01-slides.pdf).
---
## Lecture 2 - Group Theory 101 | 25-09-2020
### **Binary Operations**
- Takes 2 elements of a non-empty set and generates a third element of the same set.
- Formally - it is a map $X \times X \to X$ where $X \times X$ is the Cartesian product of $X$ with itself.
- **Closure**
- $\forall \psi_1, \psi_2 \in X, \psi_1 * \psi_2 \in X$
- e.g. $\mathbb{N}$ is closed under addition and multiplication.
### **Groups**
A group is a pair consisting of a non-empty set and a binary operation closed on the set, i.e. $(G, *)$. The following axioms must be satisfied:
1. Existence of the identity element $e$.
$$e * \psi = \psi * e = \psi,\;\; \forall \psi \in G$$
2. Associativity
$$\psi_1 * (\psi_2 * \psi_3) = (\psi_1 * \psi_2) * \psi_3$$
3. Existence of an inverse
$$\forall \psi \in G, \ \exists \tilde{\psi} \in G \;\mid\;\psi * \tilde{\psi} = \tilde{\psi}*\psi = e$$
If $(G, *)$ is a group, then:
- the identity element is unique
- the inverse of each element is unique
**Abelian Group**
$(G, *)$ is an abelian (commutative) group if $\forall\:\psi_1, \psi_2 \in G$:
$$\;\psi_1*\psi_2 = \psi_2*\psi_1$$
**Additive Group**
A group whose operation is thought of as addition (in some sense).\
Usually abelian, and denoted by $\left(G, +\right)$.
The identity element is denoted by $0$, and the inverse of $g$ by $-g$.
**Finite Group**
A group is finite if the underlying set is finite.
The cardinality of $G$, i.e. $\lvert G\rvert$, is called the order of $G$.
**Subgroups**
If $(G, *)$ is a group, and $H \subseteq G$, then $H$ is a subgroup of $G$ if $H$ is also a group under $*$.
- $G$ and $\{e\}$ are trivial subgroups of $G$.
**2x1 Complex Additive Matrix Group**
Denoted by $\mathbb{C}^2$, where
$$
\mathbb{C}^2 = \Bigg\{
\begin{pmatrix}
\alpha \\
\beta
\end{pmatrix}
\Big \vert\;
\alpha, \beta \in \mathbb{C}
\Bigg\}
$$
$+$ is defined as follows:
If $\psi_1 = \begin{pmatrix}\alpha_1\\\beta_1\end{pmatrix}$ and $\psi_2 = \begin{pmatrix}\alpha_2\\\beta_2\end{pmatrix}$, $\psi_1 + \psi_2 = \begin{pmatrix}\alpha_1+\alpha_2\\\ \beta_1+\beta_2\end{pmatrix}$
The identity element is $0 = \begin{pmatrix}0\\0\end{pmatrix}$ and the inverse of $\psi = \begin{pmatrix}\alpha\\\beta\end{pmatrix}$ is $\tilde{\psi} = \begin{pmatrix}-\alpha\\-\beta\end{pmatrix}$
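A minimal NumPy sanity check of these axioms (my own sketch, not part of the lecture):
```python
import numpy as np

psi1 = np.array([[1 + 2j], [3 - 1j]])   # element of C^2
psi2 = np.array([[0.5j], [-2 + 1j]])
zero = np.zeros((2, 1))                  # identity element 0

assert np.allclose(psi1 + zero, psi1)          # identity axiom
assert np.allclose(psi1 + (-psi1), zero)       # inverse axiom
assert np.allclose(psi1 + psi2, psi2 + psi1)   # abelian
```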
**2x2 Complex Matrices**
$M_2(\mathbb{C}) = \Bigg\{
\begin{pmatrix}
a&b \\
c&d
\end{pmatrix}
\Big \vert\;
a,b,c,d \in \mathbb{C}
\Bigg\}$
Multiplication ($\times$) is defined as the usual matrix multiplication, but is generally *not commutative*.
**General Linear Group of 2x2 Matrices**
Denoted by $GL_2(\mathbb{C})$ or $GL(2, \mathbb{C})$.
Is described by the set of all $2\times 2$ *invertible* complex matrices and the multiplication operator in $M_2(\mathbb{C})$.
$$
GL_2(\mathbb{C}) = \Bigg\{
\begin{pmatrix}
a&b\\
c&d
\end{pmatrix}
\in M_2(\mathbb{C}) \;
\Big \vert\;
ad - bc \neq 0
\Bigg\}
\textrm{under multiplication in $M_2(\mathbb{C})$}
$$
- Identity element = $I_2$ = $\begin{pmatrix}1&0\\0&1\end{pmatrix}$
- Inverse of $\begin{pmatrix}a&b\\c&d\end{pmatrix}$ = $\frac{1}{ad-bc} \begin{pmatrix}d&-b\\-c&a\end{pmatrix}$
- Is a **non-abelian** group.
- Contains some involutory matrices (i.e. $A^{-1} = A$)
- The Pauli matrices $X, Y, Z$ fall in this category.
- They're part of the unitary group $U(2)$, a subgroup of $GL_2(\mathbb{C})$.
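A quick NumPy check (my sketch) that the Pauli matrices are involutory and unitary:
```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

for P in (X, Y, Z):
    assert np.allclose(P @ P, np.eye(2))           # involutory: P^{-1} = P
    assert np.allclose(P @ P.conj().T, np.eye(2))  # unitary, so P is in U(2)
```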
**$GL_2(\mathbb{C})$ Left-Action**
If $X$ is a non-empty set, a left-action of $GL_2(\mathbb{C})$ on $X$ is a prescription "$\cdot$" that takes an element $A \in GL_2(\mathbb{C})$ and $\psi \in X$ and produces $A \cdot \psi \in X$.
- $A \cdot (B \cdot \psi) = (AB) \cdot \psi \;|\; A,B \in GL_2(\mathbb{C})$ and $\forall \psi \in X$.
- $I \cdot \psi = \psi, \;\;\forall \psi \in X$.
Example: $GL_2(\mathbb{C})$ left acting on $\mathbb{C}^2$:
$$
\begin{pmatrix}
a&b\\c&d
\end{pmatrix}
\cdot
\begin{pmatrix}
\alpha\\\beta
\end{pmatrix} =
\begin{pmatrix}
a\alpha+b\beta\\c\alpha+d\beta
\end{pmatrix}
$$
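The two action axioms can be spot-checked numerically; a sketch of mine using random matrices (invertible with probability 1, so this is only a plausibility check):
```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
B = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
psi = rng.normal(size=(2, 1)) + 1j * rng.normal(size=(2, 1))

assert np.allclose(A @ (B @ psi), (A @ B) @ psi)  # compatibility: A.(B.psi) = (AB).psi
assert np.allclose(np.eye(2) @ psi, psi)          # identity: I.psi = psi
```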
**$GL_2(\mathbb{C})$ Group Commutator**
If $A,B \in GL_2(\mathbb{C})$, the commutator between $A$ and $B$ is defined as
$$\left[A, B\right] = A^{-1}B^{-1}AB$$
- $\left[A, B\right] = I$ if and only if $A$ and $B$ commute
- $\left[A, B\right] = \left[B, A\right]^{-1}$
- $\left[A, BC\right] = \left[A, C\right]\left[A, B\right]\left[\left[A, B\right], C\right]$
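These identities can be spot-checked with NumPy (my sketch; random matrices are invertible almost surely):
```python
import numpy as np

def comm(P, Q):
    """Group commutator [P, Q] = P^{-1} Q^{-1} P Q."""
    return np.linalg.inv(P) @ np.linalg.inv(Q) @ P @ Q

rng = np.random.default_rng(1)
A, B, C = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2)) for _ in range(3))

assert np.allclose(comm(A, B), np.linalg.inv(comm(B, A)))   # [A,B] = [B,A]^{-1}
assert np.allclose(comm(A, B @ C),
                   comm(A, C) @ comm(A, B) @ comm(comm(A, B), C))
```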
---
## Lecture 3 - Rings and Fields 101 | 02-10-2020
### **Rings**
A ring is a triple $\left(R, +, \times\right)$, where $R$ is a non-empty set, and $+$ and $\times$ are closed binary operations on $R$ satisfying:
- $\left(R, +\right)$ is an additive group.
- Associativity of multiplication: $A\times\left(B\times C\right) = \left(A\times B\right)\times C$
	- Not strictly required. If not satisfied, it is called a non-associative ring.
- Distributivity of multiplication over addition: $A\times\left(B + C\right) = \left(A \times B\right) + \left(A \times C \right)$.
**Commutative / Abelian Ring**
$AB = BA, \:\: \forall A, B \in R$
Note: $A\times B$ is just written as $AB$.
**Ring with multiplicative identity**
A ring $\left(R, +, \times\right)$ containing an element $1$ in $R$ such that $1A = A1 = A, \:\: \forall A \in R$.
**Identities**
- $A \times 0 = 0, \:\: \forall A \in R$
- $A \times (-B) = -(A \times B), \:\: \forall A,B \in R$
**Finite Ring**
If the underlying set is finite.
**Characteristic of a Ring**
$Char(R) = n$ where $n$ is the smallest positive integer such that $n \times 1 = 0$, where $n \times 1 = 1 + 1 + 1 \dots + 1$ ($n$ times). If no such $n$ exists, $Char(R) = 0$.
**Subring**
If $R$ is a ring, $S \subseteq R$ and $S$ is also a ring under the same operations as $R$, then $S$ is a subring of $R$.
**Integral Domain**
Let $R$ be an abelian ring with multiplicative identity $1$. $R$ is an integral domain if $ab = 0 \implies a = 0 \:\:\text{or}\:\: b = 0$ for all $a, b \in R$.
**2x2 Complex Matrix Ring**
$M_2(\mathbb{C}) = \Bigg\{
\begin{pmatrix}
a&b \\
c&d
\end{pmatrix}
\Big \vert\;
a,b,c,d \in \mathbb{C}
\Bigg\}$
Multiplication ($\times$) is defined as the usual matrix multiplication.
Is a non-abelian ring.
The Pauli Matrices are 3 special elements of $M_2(\mathbb{C})$.
Is an example of a $C^*$-algebra.
### **Fields**
A field is a triple $(F, +, \times)$ satisfying
- $(F, +, \times)$ is an integral domain.
- Every non-zero element of $F$ has a multiplicative inverse ($aa^{-1} = a^{-1}a = 1$)
- $F_p = \{0, 1 \dots p-1\}$, with addition and multiplication modulo $p$, is a field if $p$ is prime (brute-force check below).
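A brute-force sketch of the inverse axiom for $F_p$ (my own illustration):
```python
# Every non-zero element of F_p has a multiplicative inverse iff p is prime.
def has_all_inverses(p):
    return all(any((a * b) % p == 1 for b in range(1, p)) for a in range(1, p))

print(has_all_inverses(7))   # True:  7 is prime, so F_7 is a field
print(has_all_inverses(6))   # False: 2 and 3 have no inverses mod 6
```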
---
## Lecture 4 - Vector Spaces 101 | 09-10-2020
### **Complex Vector Space**
A complex vector space over $\mathbb{C}$ is a triple $(V, +, \cdot)$ where $(V, +)$ is an additive group and $\cdot$ is called scalar multiplication in $V$ such that:
- $\alpha \cdot \psi \in V \;\;\forall\; \psi \in V, \alpha \in \mathbb{C}$
- $\alpha \cdot (\psi_1 + \psi_2) = \alpha\cdot\psi_1 + \alpha\cdot\psi_2$
- $(\alpha + \beta)\cdot\psi = \alpha\cdot\psi + \beta\cdot\psi$
- $\alpha\cdot(\beta\cdot\psi) = \alpha\beta\cdot\psi$
- $1\cdot\psi = \psi$ where $1$ is the multiplicative identity in $\mathbb{C}$.
If $V$ is a vector space over $\mathbb{C}$:
- Every $\psi$ has a unique additive inverse $-\psi$.
- $0\cdot\psi = 0_V$
- $\alpha\cdot 0_V = 0_V$
- $(-1)\cdot\psi = -\psi$
e.g. $\mathbb{C}^2$ is a vector space over $\mathbb{C}$ (usual $+$ and $\cdot$)
### **Linear Subspace**
If $V$ is a vector space over $\mathbb{C}$ and $L \subseteq V$, $L$ is a linear subspace of $V$ if $L$ is also a vector space over $\mathbb{C}$ under the same addition and scalar multiplication ($\cdot$).
- Necessary and sufficient conditions: addition and scalar multiplication should be closed in $L$.
- $L$ and $W$ are subspaces of $V$ $\implies$ $L \cap W$ is a subspace.
### **Linear Combination**
If $\psi_1, \dots ,\psi_n \in V$ then their linear combination is $\alpha_1\psi_1 + \dots +\alpha_n\psi_n$, where $\alpha_1 \dots \alpha_n \in \mathbb{C}$
**Span**
Span of $(\psi_1, \dots ,\psi_n)$ = Set of all possible linear combinations of the vectors $\psi_1 \dots \psi_n$.
The span is a linear subspace of $V$.
**Dirac Notation**
$\lvert\psi\rangle$ to represent vectors.
---
## Lecture 5 - Vector Spaces 101 - Part 2 | 16-10-2020
### **Linear Independence**
Vectors $\psi_1, \dots ,\psi_n$ are linearly independent if $\sum_{i=1}^n\alpha_i\psi_i = 0 \iff \alpha_1 = \alpha_2 = \dots = \alpha_n = 0$.
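Numerically, independence can be tested by stacking the vectors as columns and comparing the matrix rank with the number of vectors (a sketch, not from the lecture):
```python
import numpy as np

# Columns are the candidate vectors in C^3
vecs = np.array([[1, 0, 1],
                 [0, 1, 1],
                 [2, 3, 5]], dtype=complex)

independent = np.linalg.matrix_rank(vecs) == vecs.shape[1]
print(independent)  # False: column 3 = column 1 + column 2
```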
### **Basis**
A subset $B = \{\lvert e_1\rangle, \lvert e_2\rangle, \dots , \lvert e_n\rangle\}$ forms a (Hamel) basis if:
1. $\lvert e_1\rangle \dots \lvert e_n\rangle$ are linearly independent.
2. The elements of $B$ span $V$ i.e. any element in $V$ can be written as a linear combination of the elements of $B$.
The cardinality of a basis set defines the dimension of the vector space, i.e. $dim(V) = \lvert B \rvert$.
- All bases have same cardinality, so dimension does not depend on choice of basis.
- $dim(V_1) = dim(V_2) \iff V_1 \simeq V_2$
### **Direct Sum of Subspaces**
If $L_1, L_2, \dots ,L_n$ are subspaces of $V$, their sum is defined by $L_1 + \dots + L_n = \{\lvert\psi_1\rangle + \dots + \lvert\psi_n\rangle \;\vert\; \lvert\psi_i\rangle \in L_i\}$.
**Internal Direct Sum**
If each $\psi \in L_1 + \dots + L_n$ can be uniquely written as $\sum_{i=1}^n \psi_i, \:\: \psi_i \in L_i$, we replace $+$ by $\oplus$ (i.e. $L_1 \oplus\dots\oplus L_n$) and call it the internal direct sum.
---
## Lecture 6 - Linear Operators 101 | 23-10-2020
### **Linear Operators**
If $V$ and $W$ are vector spaces over $\mathbb{C}$, a map $T: V \to W$ is a linear operator if it satisfies:
1. $T(\lvert\psi_1\rangle + \lvert\psi_2\rangle) = T\lvert\psi_1\rangle + T\lvert\psi_2\rangle, \:\: \forall \psi_1, \psi_2 \in V$
2. $T(\alpha\lvert\psi\rangle) = \alpha T\lvert\psi\rangle, \:\:\forall \psi \in V, \alpha \in \mathbb{C}$
If $T$ is a bijective operator, and $B = (e_1 \dots e_n)$ is a basis of $V$, then $B^\prime = \big(T\lvert e_1\rangle \dots T\lvert e_n\rangle\big)$ is a basis of $W$.
- $T\lvert 0_V\rangle = \lvert 0_W\rangle$
- Range of $T = ran(T) = \{T\lvert\psi\rangle \:\:\vert\:\: \psi \in V\}$
- Is a linear subspace of $W$.
- Dimension of the range is called the **_rank_**
### **Kernel**
Kernel of $T = \{\psi \in V \:\:\vert\:\: T\lvert\psi\rangle = 0_W\}$, a.k.a **_null space_**.
- Is a subspace of $V$.
- Dimension of the kernel is called the **_nullity_**.
### **Rank-Nullity Theorem**
$dim(V) = rank + nullity$
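A quick numerical illustration (my sketch):
```python
import numpy as np

T = np.array([[1, 2, 3],
              [2, 4, 6],
              [1, 0, 1]], dtype=complex)  # operator on V = C^3

rank = np.linalg.matrix_rank(T)           # dim ran(T) = 2 (row 2 = 2 * row 1)
nullity = T.shape[1] - rank               # dim ker(T) = 1
print(rank, nullity)                      # 2 1, and 2 + 1 = dim(V) = 3
```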
---
## Lecture 7 - Linear Operators 101 - Part 2 | 30-10-2020
### **Isomorphic Vector Spaces**
Two vector spaces are isomorphic if there exists at least one bijective linear map $T: V \to W$ between them, where a bijective map is:
1. Injective: $T\lvert\psi_1\rangle = T\lvert\psi_2\rangle \implies \psi_1 = \psi_2$
and
2. Surjective: $Range(T) = W$
Also, injectivity is equivalent to $Ker(T) = \{0_V\}$, so $T$ is bijective iff it is surjective and $Ker(T) = \{0_V\}$.
**A few theorems**
- $V \simeq W \iff dim(V) = dim(W)$
- If $V$ is a vector space over $\mathbb{C}$ and $dim(V) = n$, then $V \simeq \mathbb{C}^n$.
**Identity Operator**
$I\lvert\psi\rangle = \lvert\psi\rangle$
**Product of Operators**
If $T_1: V \to V$ and $T_2: V \to V$, then $T_2 \circ T_1: V \to V$ is defined as $T_2 \circ T_1 (\lvert\psi\rangle) = T_2T_1\lvert\psi\rangle$.
- Is also a linear operator
**Invertible Operators**
$T$ is invertible if there exists $T^{-1}$ such that $TT^{-1} = T^{-1}T = I_V$.
- $T$ is invertible $\iff Ker(T) = \{0\}$
- If $T_1$ and $T_2$ are invertible, $T_1T_2$ is invertible.
### **Abstract General Linear Group**
$GL(V) = \{T: V \to V \:\:\vert\:\: T$ is invertible$\}$
- if $dim(V) = n$, then $GL(V) \simeq GL(n, \mathbb{C})$
### **Eigenvectors and Eigenvalues**
Same definition as always: $T\lvert\psi\rangle = \lambda\lvert\psi\rangle$ for some scalar $\lambda \in \mathbb{C}$ and non-zero $\lvert\psi\rangle \in V$.
Set of all eigenvalues of $T$ is called the **_Spectrum_** of $T$.
If $T$ has $n$ distinct eigenvalues $\lambda_1 \dots \lambda_n$, then the corresponding eigenvectors $\lvert\psi_1\rangle \dots \lvert\psi_n\rangle$ are linearly independent.
- If $dim(V) = n$, then the eigenvectors form a basis.
---
## Lecture 8 - Matrix Algebra 1 | 06-11-2020
### **Complex Matrices**
$A_{mn} = \begin{pmatrix}a_{11} & a_{12} & \dots & a_{1n}\\ \vdots & & & \vdots \\ a_{m1} & a_{m2} & \dots & a_{mn}\end{pmatrix}$, $a_{ij} \in \mathbb{C}$
Diagonal elements: $i=j$.
**Square Matrix**: $m=n$
Matrices in $M_n(\mathbb{C})$ can act as operators on $\mathbb{C}^n$ (defined according to the usual rules of matrix multiplication).
**Addition in $M_n(\mathbb{C})$**
Add the corresponding elements of the matrices.
Properties:
1. $A + B = B + A$
2. $A + (B + C) = (A + B) + C$
**Scalar Multiplication in $M_n(\mathbb{C})$**
Multiply each element of the matrix by the scalar.
Since this scalar multiplication satisfies the scalar-multiplication axioms of a vector space, $M_n(\mathbb{C})$ is a vector space over $\mathbb{C}$ with dimension $n^2$.
Behind the scenes, the density matrix formalism of quantum mechanics uses $M_n(\mathbb{C})$ as a $C^*$-algebra.
**Multiplication in $M_n(\mathbb{C})$**
Usual matrix multiplication.
Generally:
- $AB \neq BA$
- $A(BC) = (AB)C$
- $A(B+C) = AB + AC$
- $M_n(\mathbb{C})$ is a non-abelian ring with Identity element $I_n$.
### **Invertible Matrices**
$A \in M_n(\mathbb{C})$ is invertible if there exists $A^{-1} \in M_n(\mathbb{C})$ such that $AA^{-1} = A^{-1}A = I$.
If $A$ and $B$ are invertible,
- $AB$ is invertible
- $(AB)^{-1} = B^{-1}A^{-1}$
### **Diagonal Matrices**
All non-diagonal elements are $0$, i.e. $a_{ij} = 0$ if $i \neq j$.
If $A$ and $B$ are diagonal, $AB$ is diagonal and equal to $BA$.
A diagonal matrix is **invertible** if and only if *all* the diagonal elements are non-zero. The inverse is found by taking the reciprocal of every diagonal element, as sketched below.
- Also called a *non-singular* matrix.
- Non-invertible matrices are called *singular*.
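A sketch of the reciprocal rule:
```python
import numpy as np

D = np.diag([2.0, -1.0, 0.5])             # diagonal, all entries non-zero
D_inv = np.diag(1.0 / np.diag(D))         # reciprocal of each diagonal entry

assert np.allclose(D @ D_inv, np.eye(3))  # D_inv really is the inverse
```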
---
## Lecture 9 - Matrix Algebra 2 | 13-11-2020
### **Matrix Transpose**
If $A = [A_{ij}]$, then the transpose of $A$, denoted by $A^T$, is $[A_{ji}]$, i.e. rows and columns are swapped.
**Properties**
1. $(A^T)^T = A$
2. $(A+B)^T = A^T + B^T$
3. $(\lambda A)^T = \lambda A^T$
4. $(AB)^T = B^TA^T$
### **Symmetric Matrices**
$A^T = A$
If $A, B \in M_n(\mathbb{C})$ are symmetric, and $\lambda \in \mathbb{C}$, then\
1. If $A$ is invertible, $A^{-1}$ is symmetric
2. $A+B$ is symmetric
3. $\lambda A$ is symmetric
4. if $AB$ is symmetric, $[A,B]=0$ (they commute)
### **Skew-Symmetric Matrices**
$A^T = -A$
If $A, B \in M_n(\mathbb{C})$ are skew-symmetric, and $\lambda \in \mathbb{C}$, then
1. If $A$ is invertible, $A^{-1}$ is skew-symmetric
2. $A+B$ is skew-symmetric and so is $[A,B]$
3. $A^T$ and $\lambda A$ are skew-symmetric
4. if $AB$ is symmetric, $[A,B]=0$ (they commute)
### **Trace**
Trace of $A$: $Tr(A) = \sum_{i=1}^n a_{ii}$, for $A \in M_n(\mathbb{C})$
If $A, B \in M_n(\mathbb{C})$ and $\lambda \in \mathbb{C}$, then\
1. $Tr(A^T) = Tr(A)$
2. $Tr(A+B) = Tr(A) + Tr(B)$
3. $Tr(\lambda A) = \lambda Tr(A)$
4. $Tr(AB) = Tr(BA)$
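Property 4 is worth a numerical spot-check, since $AB \neq BA$ in general (my sketch):
```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

assert not np.allclose(A @ B, B @ A)                 # AB != BA in general
assert np.isclose(np.trace(A @ B), np.trace(B @ A))  # but the traces agree
```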
---
## Lecture 10 - Matrix Groups - Part 1 | 27-11-2020
### **Multiplicative Group**
$(G, \times)$, where $\times$ is an operator called multiplication. Should satisfy the usual group axioms.
- The inverse element is unique.
### **Subgroups**
$H \subseteq G$ is a subgroup of $G$ iff:
- $e \in H$ where $e$ is the identity element of $G$
- $AB \in H, \:\:\: \forall A, B \in H$
- If $A \in H$ then $A^{-1} \in H$
### **Group Homomorphisms**
Let $G_1, G_2$ be groups. $f: G_1 \to G_2$ is a homomorphism if $f(AB) = f(A)f(B) \;\;\;\forall A, B \in G_1$
If a homomorphism $f$ is a bijection, $f$ is called a *group isomorphism*, and we write $G_1 \simeq G_2$.
**Image of a homomorphism**: $Im_f = \{f(A) | A \in G_1\}$
**Kernel of a homomorphism**: $Ker_f = \{A \in G_1 | f(A) = e_2\}$
### **Group Commutators**
Let $G$ be a group. The commutator $[A,B]$ where $A, B \in G$ is defined as $[A, B] = A^{-1}B^{-1}AB$.
If $[,]$ is a commutator on $G$, then
- $G$ is abelian if $[A, B] = e \;\;\forall A,B \in G$
- $[B, A] = [A, B]^{-1}$
- $[A, BC] = [A, C][A, B][[A, B], C]$
The commutator is a measure of how abelian a group is. For interesting applications of group theory, the less abelian the better.
### **Group Center**
Let $G$ be a group. The center of $G$ is defined as $Z(G) = \{A \in G \;|\; [A,B]=e \;\forall B \in G\}$
- Saying $[A,B]=e$ is equivalent to saying $AB=BA$.
- The center $Z(G)$ is a subgroup of $G$.
- The smaller the center, the less abelian the group.
### **Matrix Determinants**
Denoted by $det(A)$ or $|A|$.
For $2\times 2$ matrices, $|A| = a_{11}a_{22} - a_{12}a_{21}$.
For diagonal matrices, the determinant is the product of the diagonal elements.
**Properties**\
If $A, B \in M_2(\mathbb{C})$ and $\lambda \in \mathbb{C}$:
- $|A^T| = |A|$
- $|\lambda A| = \lambda^2 |A|$ (since $A \in M_2(\mathbb{C})$)
- $|AB| = |A||B|$
- If $A$ is invertible, $|A^{-1}| = \frac{1}{|A|}$
The following statements are equivalent:
- $|A| \neq 0$
- $A$ is invertible
- $Rank(A) = n$ if $A \in M_n(\mathbb{C})$
- The rows of A are linearly independent
### **Matrix Inverse**
$A^{-1} = \frac{1}{|A|}Adj(A)$\
If $A \in M_2(\mathbb{C})$, then $A^{-1} = \frac{1}{|A|}\begin{pmatrix}a_{22}&-a_{12}\\-a_{21}&a_{11}\end{pmatrix}$
### **General Linear Group**
$GL(n, \mathbb{C}) = \bigg\{A \in M_n(\mathbb{C}) \;\vert\; det(A) \neq 0\bigg\}$
- Is an example of a Lie Group.
$GL_2(\mathbb{C})$ is a group under matrix multiplication in $M_2(\mathbb{C})$
**Comments**
- $AB \in GL_2(\mathbb{C}) \;\forall\; A,B \in GL_2(\mathbb{C})$. i.e if $det(A)$ and $det(B)$ are non-zero, then $det(AB)$ is non-zero.
- Identity Matrix $I \in GL_2(\mathbb{C})$ is the group identity element.
- $A(BC) = (AB)C \;\forall\; A,B,C \in GL_2(\mathbb{C})$.
- $\forall A \in GL_2(\mathbb{C})$, there exists $A^{-1} \in GL_2(\mathbb{C})$ such that $AA^{-1} = A^{-1}A = I$.
**Center of $GL_2(\mathbb{C})$**\
$Z(GL_2(\mathbb{C})) = \bigg\{\begin{pmatrix}\lambda & 0 \\ 0 & \lambda\end{pmatrix} \;\vert\; \lambda \in \mathbb{C}^*\bigg\}$ where $\mathbb{C}^* = \mathbb{C} \setminus \{0\}$ (the set of non-zero complex numbers).\
The center is clearly very small, so the group is highly non-abelian.
### **Special Linear Group**
$SL_n(\mathbb{C}) = \bigg\{A \in GL_n(\mathbb{C}) \;\vert\; det(A) = 1\bigg\}$
Is a subgroup of $GL_n(\mathbb{C})$.
### **Determinant as a Homomorphism**
The map $det : GL_n(\mathbb{C}) \to \mathbb{C}^*$ is a group homomorphism, i.e. $det(AB) = det(A)det(B) \;\forall\; A,B \in G$.
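A numerical spot-check of the homomorphism property (sketch):
```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
B = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))

# det(AB) = det(A) det(B)
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
```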
---
## Lecture 11 - Matrix Groups - Part 2 | 04-12-2020
### **Matrix Conjugate**
For a matrix $A \in M_n(\mathbb{C}), A^*$ is formed by taking the complex conjugate of every element.
**Conjugate Transpose**\
Take the conjugate and then transpose the matrix. Represented by $A^\dagger$. Also known as the *Hermitian Transpose* of the matrix.\
$A$ is called *Hermitian* if $A = A^\dagger$, and *skew-Hermitian* if $A = -A^\dagger$
**Identities**
- $(A^\dagger)^\dagger = A$
- $(\lambda A)^\dagger = \lambda^* A^\dagger$
- $(A+B)^\dagger = A^\dagger + B^\dagger$
- $(AB)^\dagger = B^\dagger A^\dagger$
- $det(A^\dagger) = det(A)^*$
- If $A$ is invertible, $A^\dagger$ is invertible
Let $A, B \in M_n(\mathbb{C})$
- $A + A^\dagger$ is Hermitian, $A - A^\dagger$ is skew-Hermitian
- $AA^\dagger$ and $A^\dagger A$ are Hermitian
- If $A$ is Hermitian and invertible, $A^{-1}$ is Hermitian
- If $A$ and $B$ are Hermitian / skew-Hermitian, then $\alpha A + \beta B$ is Hermitian / skew-Hermitian $\;\forall\; \alpha,\beta \in \mathbb{R}$
- If $A$ is Hermitian, the entries on the main diagonal are all real
- If $A$ is skew-Hermitian, $iA$ is Hermitian and vice-versa
### **Hermitian and skew-Hermitian decomposition**
Let $A \in M_n(\mathbb{C})$. Then $A = \frac{1}{2}(A + A^\dagger) + \frac{1}{2}(A - A^\dagger)$, where $\frac{1}{2}(A + A^\dagger) = H(A)$ is called the Hermitian part of $A$, and $\frac{1}{2}(A - A^\dagger) = S(A)$ is called the skew-Hermitian part of $A$.
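A NumPy sketch of the decomposition (my own illustration; `.conj().T` plays the role of $\dagger$):
```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))

H = (A + A.conj().T) / 2   # Hermitian part H(A)
S = (A - A.conj().T) / 2   # skew-Hermitian part S(A)

assert np.allclose(H, H.conj().T)    # H is Hermitian
assert np.allclose(S, -S.conj().T)   # S is skew-Hermitian
assert np.allclose(H + S, A)         # they recover A
```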
### **Circle Group**
Let $\mathbb{C}^* = \bigg\{z \in \mathbb{C} \;\vert\; z \neq 0\bigg\}$. $\mathbb{C}^*$ is an abelian group under multiplication in $\mathbb{C}$.
The circle group $U(1)$ is defined as $U(1) = \bigg\{\lambda \in \mathbb{C}^* \;\vert\; |\lambda | = 1\bigg\}$.\
It is a subgroup of $\mathbb{C}^*$ and can be identified with the unit circle in the complex plane.\
The elements of this group are of the form $e^{i\theta}$.\
Group multiplication in $U(1)$ can be defined for 2 elements $A = e^{i\theta_1}$ and $B = e^{i\theta_2}$ as $AB = e^{i(\theta_1 + \theta_2)}$.
$U(1)$ is one of the simplest examples of a Lie Group. It is important in particle physics because it is an example of what theoretical physicists call [[Gauge Symmetry]]. It is part of the gauge group of the Standard Model of particle physics.
### **Unitary Group**
$U(n) = \bigg\{ A \in GL(n,\mathbb{C}) \;|\; AA^\dagger = A^\dagger A = I, \text{ i.e. } A^\dagger = A^{-1}\bigg\}$\
The Pauli Matrices $X, Y, Z$ are part of $U(2)$.
Elements of $U(n)$ are called Unitary Matrices/Operators when they act on $\mathbb{C}^n$.\
$U(n)$ is a subgroup of $GL(n, \mathbb{C})$.
**Center of $U(n)$**
$Z(U(n)) = \bigg\{ \lambda I_n \;|\; \lambda \in U(1) \bigg\}$
### **Special Unitary Group**
$SU(n) = \bigg\{ A \in U(n) \;|\; det(A) = 1\bigg\}$\
$SU(n)$ is a subgroup of $U(n)$.\
$SU(n) = U(n) \cap SL(n, \mathbb{C})$
**Center of $SU(n)$**
$Z(SU(n)) = \bigg\{ \lambda I_n \;|\; \lambda \in U(1) : \lambda^n = 1 \bigg\}$, i.e. $\lambda$ ranges over the $n$-th roots of unity (so that $det(\lambda I_n) = \lambda^n = 1$).
---
## Lecture 12 - Finite Dimensional Hilbert Spaces - Part 1 | 11-12-2020
### **Linear Independence**
Let $V$ be a vector space over $\mathbb{C}$. The vectors $|\psi_1\rangle \dots |\psi_n\rangle$ are linearly independent if $\sum_{i=1}^n \alpha_i|\psi_i\rangle = 0 \implies \alpha_1 = \alpha_2 = \dots = \alpha_n = 0$.
### **Basis**
A subset $B = \big\{|e_1\rangle \dots |e_n\rangle\big\}$ of $V$ forms a Hamel basis if
1. $|e_1\rangle \dots |e_n\rangle$ are linearly independent
2. The elements of $B$ span $V$, i.e. any element in $V$ can be written as a linear combination of elements in $B$.
**Dimension**\
The cardinality of a basis set ($B$) is called the dimension of $V$ and is denoted by $dim(V)$ or $dimV$.
Any 2 bases of $V$ will always have the same number of elements, i.e. the dimension is independent of choice of basis.
### **Isomorphism Theorems**
1. If $V$ and $W$ are 2 vector spaces over $\mathbb{C}$, then $V \simeq W \iff dim(V) = dim(W)$
2. If $V$ is a vector space over $\mathbb{C}$ and $dimV = n$ then $V \simeq \mathbb{C}^n$
### **Axioms of Quantum Mechanics (Non-Relativistic)**
1. States of quantum systems are modeled by normalised vectors on separable complex Hilbert spaces.
2. The observables of quantum systems are modeled by self-adjoint operators on separable complex Hilbert spaces.
- In finite-dimensional Hilbert spaces, self-adjoint operators correspond to Hermitian matrices.
3. Given a state vector $\lvert\psi\rangle \in \mathscr{H}$ that encodes a state of a quantum system, and a self-adjoint operator $A: \mathscr{H} \to \mathscr{H}$ that encodes an observable of the same system, the expected measurement value of the observable encoded in $A$ in the state encoded in $\lvert\psi\rangle$ is given by $\langle A\rangle = \langle\psi,A\psi\rangle$, also represented in Dirac Notation as $\langle\psi\rvert A \lvert\psi\rangle$
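A concrete finite-dimensional sketch of axiom 3 (my own example, using the Pauli-$Z$ observable and the state $\lvert+\rangle = \frac{1}{\sqrt 2}(\lvert 0\rangle + \lvert 1\rangle)$):
```python
import numpy as np

psi = np.array([1, 1]) / np.sqrt(2)             # normalised state |+> in C^2
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli-Z, a Hermitian observable

expectation = np.vdot(psi, Z @ psi)             # <psi, Z psi> = <psi|Z|psi>
print(expectation.real)                         # 0.0: +1 and -1 equally likely
```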
**Why Hilbert Spaces?**\
There used to be 2 competing theories of Quantum Mechanics -
1. **Schrödinger's Wave Mechanics**, which emphasised the wave function. These wave functions were elements of $L^2(\mathbb{R}^3)$, the space of square integrable functions.
2. **Heisenberg's Matrix Mechanics** which emphasised observables that were encoded as matrices built from $l^2(\mathbb{N})$, the space of square summable sequences.
It was later found, by von Neumann, that both these spaces are complex Hilbert spaces and are isomorphic.
### **Inner Product Space**
Let $V$ be a vector space over $\mathbb{C}$. An inner product on $V$ is a map $\langle\cdot,\cdot\rangle : V \times V \to \mathbb{C}$ such that
1. $\langle\psi , \psi\rangle \geq 0$ and $\langle\psi , \psi\rangle = 0 \iff \psi = 0_V$
2. $\langle\psi_1 , \psi_2\rangle = \langle\psi_2 , \psi_1\rangle^*$
3. $\langle\psi_1 , \alpha\psi_2 + \beta\psi_3\rangle = \alpha\langle\psi_1 , \psi_2\rangle + \beta\langle\psi_1 , \psi_3\rangle$ (linearity in the second argument; the physicists' convention, consistent with the Dirac notation and the polarization identity below)
where all $\psi_i \in V$.
The pair $\big(V, \langle\cdot,\cdot\rangle\big)$ is called an inner product space and is often abbreviated to $V$ if the inner product is understood from context.
### **Induced Normed Spaces**
Given an inner product $\langle\cdot,\cdot\rangle$ on $V$, the induced norm for any $\psi \in V$ is defined as $\Vert\psi\Vert = \sqrt{\langle\psi,\psi\rangle}$
**Properties of norms**
1. $\Vert\psi\Vert \geq 0$ and $\Vert\psi\Vert = 0 \iff \psi = 0_V$
2. $\Vert\alpha\psi\Vert = \vert\alpha\vert\Vert\psi\Vert$ where $\alpha \in \mathbb{C}$
3. $\Vert\psi_1 + \psi_2\Vert \leq \Vert\psi_1\Vert + \Vert\psi_2\Vert$
where all $\psi_i \in V$
$V$ together with the norm $\Vert\cdot\Vert$ is called a normed vector space.
**More Properties of norms**
1. Let $\Vert\cdot\Vert$ be a norm induced by an inner product. Then:
- $\vert\langle\psi_1,\psi_2\rangle\vert \leq \Vert\psi_1\Vert\Vert\psi_2\Vert$ with equality only when $\psi_1$ and $\psi_2$ are linearly dependent.
2. A norm $\Vert\cdot\Vert$ on $V$ is induced by an inner product $\langle\cdot,\cdot\rangle \iff \;\forall\; \psi_1, \psi_2 \in V, \Vert\psi_1+\psi_2\Vert^2 + \Vert\psi_1-\psi_2\Vert^2 = 2(\Vert\psi_1\Vert^2 + \Vert\psi_2\Vert^2)$.
- This is called the **parallelogram identity**
3. If a norm $\Vert\cdot\Vert$ on $V$ satisfies the parallelogram identity, then the inner product inducing it is recovered by $\langle\psi_1,\psi_2\rangle = \frac{1}{4}\bigg(\Vert\psi_1+\psi_2\Vert^2 - \Vert\psi_1-\psi_2\Vert^2 + i\Vert\psi_1-i\psi_2\Vert^2 -i \Vert\psi_1+i\psi_2\Vert^2\bigg)$
- This is called the **polarization identity**
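Both identities are easy to verify numerically for the standard inner product on $\mathbb{C}^n$ (sketch; note `np.vdot` conjugates its first argument, matching the convention above):
```python
import numpy as np

rng = np.random.default_rng(5)
p1 = rng.normal(size=3) + 1j * rng.normal(size=3)
p2 = rng.normal(size=3) + 1j * rng.normal(size=3)
n = np.linalg.norm

# parallelogram identity
assert np.isclose(n(p1 + p2)**2 + n(p1 - p2)**2, 2 * (n(p1)**2 + n(p2)**2))

# polarization identity recovers <p1, p2>
rec = (n(p1 + p2)**2 - n(p1 - p2)**2
       + 1j * n(p1 - 1j * p2)**2 - 1j * n(p1 + 1j * p2)**2) / 4
assert np.isclose(np.vdot(p1, p2), rec)
```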
### **Complete Normed Spaces**
A sequence $\phi : \mathbb{N} \to V$ is called a **Cauchy Sequence** if $\;\forall\;\epsilon > 0 \;\exists\; \kappa\in\mathbb{N}$ such that $\Vert\phi(n_1) - \phi(n_2)\Vert \lt \epsilon \;\forall\; n_1,n_2 \geq \kappa$.
A norm $\Vert\cdot\Vert$ on $V$ is said to be complete if every Cauchy sequence in $V$ converges to a limit in $V$.
$V$ is called a **Banach Space** if its norm is complete.
$\mathbb{C}^n$ and $M_n(\mathbb{C})$ are Banach spaces.
### **Complex Hilbert Spaces**
A complex Hilbert space is a vector space over $\mathbb{C}$, denoted by $\mathscr{H}$, with an inner product $\langle\cdot,\cdot\rangle$ that induces a complete norm $\Vert\cdot\Vert$ on $\mathscr{H}$.
It is a Banach space with respect to the induced norm.
$\mathbb{C}^n$ and $M_n(\mathbb{C})$ are complex Hilbert spaces.
Any finite dimensional complex Hilbert space of dimension $n$ is isomorphic to $\mathbb{C}^n$.
---
## Lecture 13 - Finite Dimensional Hilbert Spaces - Part 2 | 18-12-2020
### **Normalized Vectors**
A vector $\psi \in \mathscr{H}$ is normalized if $\Vert\psi\Vert = 1$.
### **Orthogonal Vectors**
$\psi_1, \psi_2 \in \mathscr{H}$ are orthogonal if $\langle\psi_1,\psi_2\rangle = 0$
### **Orthogonal Complement**
If $W$ is a linear subspace of $\mathscr{H}$, then its orthogonal complement is defined as $W^\perp = \big\{\psi \in \mathscr{H} \;|\; \langle\psi,\phi\rangle = 0 \;\forall\;\phi \in W\big\}$
- $W^\perp$ is also a linear subspace of $\mathscr{H}$
- $(W^\perp)^\perp = W$
### **Orthonormal Basis**
A basis $B = \big\{e_1 \dots e_n\big\}$ of $\mathscr{H}$ is an orthonormal basis if $\Vert e_i\Vert = 1 \;\forall\; i=1\dots n$ and $\langle e_i,e_j \rangle = 0 \;\forall\; i \neq j$
Since a Hilbert space has more than just the linear structure, the bases are called **Schauder Bases** instead of **Hamel Bases**.
### **Separable Hilbert Space**
$\mathscr{H}$ is separable if it has a countable orthonormal basis $B = \big\{e_1 \dots e_n\big\}$
If $B$ above is an orthonormal basis of $\mathscr{H}$, then
1. $\psi = \sum_{i=1}^n \langle e_i,\psi \rangle e_i \;\forall\; \psi \in \mathscr{H}$
2. $\langle e_i,\psi \rangle = 0 \;\forall\; i \iff \psi = 0_\mathscr{H}$
3. $\Vert\psi\Vert^2 = \sum_{i=1}^n \vert\langle e_i,\psi \rangle\vert^2 \;\forall\; \psi \in \mathscr{H}$ (Parseval's identity)
### **Linear Functionals Induced by the Inner Product**
For any $\psi \in \mathscr{H}$ we can construct a map $L_\psi : \mathscr{H} \to \mathbb{C}$ induced by the inner product as $L_\psi \phi = \langle \psi,\phi \rangle \;\forall\; \phi \in \mathscr{H}$. This map is a linear map (linear functional).
- $L_\psi$ is sometimes denoted as $\langle\psi|$ (usually by physicists) and called the **bra**.
- Using the Dirac Notation, $\phi$ is placed inside the ket to get $|\phi\rangle$. The action of $L_\psi = \langle\psi|$ on $|\phi\rangle$ is then denoted as $\langle\psi\Vert\phi\rangle$ which is abbreviated to $\langle\psi\vert\phi\rangle$.
### **Concrete Representation of Bras**
If $\mathscr{H} = \mathbb{C}^n$ and $|\psi\rangle = \begin{pmatrix}x_1\\x_2\\\vdots\\x_n\end{pmatrix}$, then the bra $\langle\psi|$ just becomes the row matrix $\big(x_1^* x_2^* \dots x_n^* \big)$
$\langle\psi\Vert\psi\rangle$ is then the same as multiplying the row vector (bra) by the column vector (ket).
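In NumPy the bra is literally the conjugate transpose of the ket (my sketch):
```python
import numpy as np

ket = np.array([[1 + 1j], [2 - 1j]])   # |psi> as a column vector in C^2
bra = ket.conj().T                      # <psi| as a row vector

print((bra @ ket).item())               # <psi|psi> = 2 + 5 = 7 (real and >= 0)
```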
### **Dual Space**
The set of all linear functionals $L : \mathscr{H} \to \mathbb{C}$ is called the dual space of $\mathscr{H}$ and is denoted by $\mathscr{H}^*$.
### **Riesz Representation**
Let $\mathscr{H}$ be a Hilbert space and $L \in \mathscr{H}^*$. Then, there exists a unique $\psi \in \mathscr{H}$ such that $L = \langle\psi, \cdot\rangle$ i.e. $L\phi = \langle\psi,\phi\rangle \;\forall\; \phi \in \mathscr{H}$.
---
## Lecture 14 - Quantum Axioms and Operators - Part 1 | 08-01-2021
### **Unitary Operators**
$A \in M_n(\mathbb{C})$ is unitary if $AA^\dagger = A^\dagger A = I$
### **Matrix Similarity**
Let $A, B \in M_n(\mathbb{C})$. $B$ is similar to $A$ if there exists a matrix $S \in GL(n, \mathbb{C})$ such that $B = S^{-1}AS$. It is denoted as $B \sim A$.
The similarity relationship $\sim$ is an equivalence relation in $M_n(\mathbb{C})$.
---
## Lecture 15 - Quantum Axioms and Operators - Part 2 | 15-01-2021
- For systems modelled with finite or infinite dimensional Hilbert spaces, the observables are encoded in self-adjoint operators.
- But in the case of finite-dimensional Hilbert spaces, self-adjoint operators are just Hermitian matrices.
- Alternate definition for Hermitian matrices: $\langle A\psi_1 , \psi_2\rangle = \langle\psi_1, A\psi_2\rangle \;\forall\; \lvert\psi_1\rangle, \lvert\psi_2\rangle \in \mathscr{H} = \mathbb{C}^n$, where $\langle\cdot,\cdot\rangle$ is the inner product in $\mathscr{H}$.
### **Eigenvectors and Eigenvalues**
Let $A: \mathscr{H} \to \mathscr{H}$ be a linear operator. A non-zero $\lvert\psi\rangle \in \mathscr{H}$ is an eigenvector of $A$ if there exists $\lambda \in \mathbb{C}$ such that $A\lvert\psi\rangle = \lambda\lvert\psi\rangle$. $\lambda$ is called an eigenvalue of $A$.
- If $\lvert\psi\rangle$ is an eigenvector of $A$ with eigenvalue $\lambda$, then $\lvert\psi\rangle$ is also an eigenvector of $A^n$ with eigenvalue $\lambda^n$.
- If $A \in GL(n,\mathbb{C})$ then $\lvert\psi\rangle$ is also an eigenvector of $A^{-1}$, with eigenvalue $\frac{1}{\lambda}$.
### **Spectrum**
Is the set of all eigenvalues of an operator.
Denoted by $\sigma$; $\sigma(A) = \{\lambda \in \mathbb{C} \;\vert\; A\lvert\psi\rangle = \lambda\lvert\psi\rangle \text{ for some eigenvector } \lvert\psi\rangle \in \mathscr{H}\}$.
This definition is technically known as the *point spectrum* or *discrete spectrum*.
If $A \in GL(n,\mathbb{C})$ and we know $\sigma(A)$, then we also know $\sigma(A^{-1})$, since the eigenvalues of $A^{-1}$ are just the reciprocals of the eigenvalues of $A$.
**Theorem**\
If $A$ is Hermitian, then $\sigma(A) \subseteq \mathbb{R}$, i.e. the eigenvalues are all real-valued.
### **Characteristic Polynomial**
Denoted by $p_A(z)$; $p_A(z) = det(z\mathbb{I}-A)$ where $A \in M_n(\mathbb{C})$ and $z \in \mathbb{C}$. It is a polynomial of degree $n$ over $\mathbb{C}$.
Let $A \in M_n(\mathbb{C})$. Then $\lambda \in \sigma(A) \iff p_A(\lambda) = 0$ i.e. $\lambda$ is a zero of the characteristic polynomial.
**Interesting property**: $p_A(z) = p_B(z)$ for $A,B \in M_n(\mathbb{C})$ such that $A \sim B$. If $A \sim B$, they have the same eigenvalues.
### **$\lambda$-eigenspace**
For an operator $A$ acting on $\mathscr{H}$, the set of all its eigenvectors $\lvert\psi\rangle$ for which $\lambda$ is an eigenvalue (together with the zero vector) is called the $\lambda$-eigenspace and is denoted by $E_\lambda$.
$E_\lambda = \{\lvert\psi\rangle \in \mathscr{H} \;\vert\; A\lvert\psi\rangle = \lambda\lvert\psi\rangle\}$
$E_\lambda$ is a subspace of $\mathscr{H}$ and its dimension is called the *geometric multiplicity* of $\lambda$.
The number of times $\lambda$ appears as a root of the characteristic polynomial is called its *algebraic multiplicity*.
### **Matrix Diagonalization**
$A \in M_n(\mathbb{C})$ is diagonalizable if $A \sim D$ where $D \in M_n(\mathbb{C})$ is a diagonal matrix.
**The following statements are equivalent**:
- $A$ is diagonalizable
- There is a linearly independent set of vectors $B = \{\lvert b_1 \rangle \dots \lvert b_n\rangle\}$ such that $\lvert b_i\rangle$ is an eigenvector of $A$ for all $i = 1 \dots n$
If $A$ has $n$ distinct eigenvalues $\lambda_1 \dots \lambda_n$, then $A$ is diagonalizable (sufficient but not necessary; e.g. $I_n$ is diagonal with a single eigenvalue).
### **Spectral Theorem**
$A \in M_n(\mathbb{C})$ is Hermitian iff there exists a unitary matrix $U \in M_n(\mathbb{C})$ and a real diagonal matrix $D \in M_n(\mathbb{C})$ such that $A = UDU^\dagger$
This means that *Hermitian matrices are diagonalizable*.
The eigenvectors of any Hermitian $A \in M_n(\mathbb{C})$ can be chosen to form an orthonormal basis of $\mathscr{H} = \mathbb{C}^n$
- The eigenvectors of $A$ coincide with the columns of $U$
- The eigenvalues of $A$ coincide with the diagonal elements of $D$
From the 3rd [axiom of Quantum Mechanics](#axioms-of-quantum-mechanics-non-relativistic) and some mathematical creativity, it can be deduced that if $\sigma(A)$ is discrete, then the only possible measurement values for an observable encoded by a matrix $A$ are its eigenvalues.
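NumPy's `np.linalg.eigh` computes exactly this factorisation for Hermitian input; a sketch tying the pieces together:
```python
import numpy as np

A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])                # Hermitian: A = A^dagger
assert np.allclose(A, A.conj().T)

evals, U = np.linalg.eigh(A)               # real eigenvalues, unitary U
D = np.diag(evals)

assert np.allclose(U @ D @ U.conj().T, A)      # A = U D U^dagger
assert np.allclose(U.conj().T @ U, np.eye(2))  # columns of U: orthonormal basis
print(evals)                                   # the possible measurement values
```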
---
## **Resources**
1. [Official GitHub repo for the course](https://github.com/quantumformalism/2020-math-lectures)
2. [Charles Pinter's Book](http://www2.math.umd.edu/~jcohen/402/Pinter%20Algebra.pdf)
3. [Jim Coykendall's Notes](http://jcoyken.people.clemson.edu/conference.html)