# Intuitively Understanding Matrix Multiplication and Positive (Semi) Definite Matrices
## Introduction
This article provides an intuitive grasp of matrix multiplication as a linear transformation (or change of basis/coordinate system) and the concept of positive (semi) definite matrices. Using simple examples and visualizations, we’ll uncover the geometric meaning behind these core linear algebra ideas.
## Matrix Multiplication as Change of Basis
Multiplying a vector by a matrix performs a **linear transformation**, which can be viewed as a **change of basis** or **change of coordinate system**. The matrix’s columns become the new basis vectors. In 2D, the first column defines the new $\vec{i}$ basis (x-coordinate), and the second defines the new $\vec{j}$ basis (y-coordinate).
### Example: Identity Matrix
Consider the identity matrix:
$$
I = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}.
$$
Its columns are $\vec{i} = [1, 0]$ and $\vec{j} = [0, 1]$, the standard basis. Take a vector $\mathbf{v} = [1, 1]$:
$$
I \mathbf{v} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \end{bmatrix}.
$$
The vector remains unchanged because the basis remains the familiar coordinate system.
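This is easy to check numerically; a minimal NumPy sketch:

```python
import numpy as np

# The 2x2 identity matrix: its columns are the standard basis vectors i and j.
I = np.eye(2)
v = np.array([1.0, 1.0])

# Multiplying by I leaves the vector unchanged.
assert np.allclose(I @ v, v)
```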

### Rotating the Basis
What if we want to rotate $\mathbf{v} = [1, 1]$ by $90^{\circ}$ counterclockwise?
We can redefine the basis by rotating $\vec{i} = [1, 0]$ to $[0, 1]$ and $\vec{j} = [0, 1]$ to $[-1, 0]$. The resulting matrix is:
$$
A = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}.
$$
This aligns with the 2D rotation matrix:
$$
\begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{bmatrix},
$$
where $\theta = 90^{\circ}$ ($\cos 90^\circ = 0$, $\sin 90^\circ = 1$). Compute:
$$
A \mathbf{v} = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} -1 \\ 1 \end{bmatrix}.
$$
The result, $[-1, 1]$, is $\mathbf{v}$ rotated $90^{\circ}$ counterclockwise. Notice that while the basis rotates, the vector keeps the same coordinates, $[1, 1]$, relative to the new basis vectors.
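The same rotation can be reproduced directly from the general 2D rotation matrix; a short NumPy sketch:

```python
import numpy as np

theta = np.pi / 2  # 90 degrees counterclockwise
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
v = np.array([1.0, 1.0])

# Rotating [1, 1] by 90 degrees counterclockwise gives [-1, 1]
# (up to floating-point round-off in cos(pi/2)).
print(A @ v)
```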

### Linear Transformation Perspective
Expanding the multiplication:
$$
A \mathbf{v} = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = v_1 \begin{bmatrix} 0 \\ 1 \end{bmatrix} + v_2 \begin{bmatrix} -1 \\ 0 \end{bmatrix}.
$$
This is a linear combination of $A$’s columns, scaled by $\mathbf{v}$’s components, illustrating why matrix multiplication is a **linear transformation**.
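The column-combination view can be verified directly: the matrix-vector product equals the sum of the columns scaled by the vector's components.

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
v = np.array([1.0, 1.0])

# A @ v equals v1 * (first column of A) + v2 * (second column of A).
combo = v[0] * A[:, 0] + v[1] * A[:, 1]
assert np.allclose(combo, A @ v)
```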
### Summary
Matrix multiplication reinterprets a vector within a new coordinate system, defined by the matrix’s columns. This process can be viewed as a linear transformation, a basis change, or a coordinate shift.
## Positive (Semi) Definite Matrices
### Intuition from Scalars
In 1D, multiplying by a scalar adjusts a number's magnitude and possibly flips its direction. Start with the number $1$, viewed as a 1D vector (an arrow pointing right).

Multiply by a positive number, say $2$:
$$
(2) \cdot 1 = 2.
$$
The vector stretches to $2$, still pointing right.

Now multiply by a negative number, $-3$:
$$
(-3) \cdot 1 = -3.
$$
The vector stretches to length $3$ and flips to point left (the opposite direction).

Positive scalars preserve direction; negative scalars flip it.
### Extending to Matrices
For a 2D vector, what matrix mimics a positive scalar? A **positive definite matrix**! A matrix $A$ is positive definite if:
$$
\mathbf{x}^T A \mathbf{x} > 0
$$
for all non-zero $\mathbf{x}$. (The standard definition also requires $A$ to be symmetric; here we focus on the quadratic-form condition, which makes sense for any square matrix.) Writing $A \mathbf{x} = \mathbf{x}'$ for the transformed vector, $\mathbf{x}^T A \mathbf{x} = \mathbf{x}^T \mathbf{x}'$ is the inner (dot) product of the original and transformed vectors.
### Inner Product as Projection
The inner product $\mathbf{x}^T A \mathbf{x}$ can be understood as a projection: it measures how much the transformed vector aligns with the original. If the matrix is positive definite, this projection always results in a positive value, meaning the transformed vector retains some alignment with the original in the same direction.
In 2D, this is:
$$
\mathbf{x}^T A \mathbf{x} = \mathbf{x}^T \mathbf{x}' = |\mathbf{x}| |\mathbf{x}'| \cos \theta,
$$
where $\theta$ is the angle between $\mathbf{x}$ and $\mathbf{x}'$. For $\mathbf{x}^T \mathbf{x}' > 0$, we need $\cos \theta > 0$, i.e., $\theta < 90^\circ$: the transformed vector must point to the same side as the original, just like a positive scalar in 1D.
#### Example
Consider:
$$
A = \begin{bmatrix} 2 & 2 \\ 1 & 3 \end{bmatrix}, \quad \mathbf{x} = [1, 0].
$$
Transform:
$$
A \mathbf{x} = \begin{bmatrix} 2 & 2 \\ 1 & 3 \end{bmatrix} \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 2 \\ 1 \end{bmatrix} = \mathbf{x}'.
$$
Inner product:
$$
\mathbf{x}^T \mathbf{x}' = [1, 0] \begin{bmatrix} 2 \\ 1 \end{bmatrix} = 2 > 0.
$$
The angle $\theta$ between $[1, 0]$ and $[2, 1]$ is less than $90^{\circ}$.

In fact, this property holds for any non-zero $\mathbf{x}$, confirming that $A$ is positive definite.
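We can verify both the single example and the general claim numerically. A standard fact (an added detail, not from the article): $\mathbf{x}^T A \mathbf{x} = \mathbf{x}^T S \mathbf{x}$ where $S = (A + A^T)/2$ is the symmetric part of $A$, so the quadratic form is positive for all non-zero $\mathbf{x}$ exactly when $S$'s eigenvalues are all positive.

```python
import numpy as np

A = np.array([[2.0, 2.0],
              [1.0, 3.0]])

# The example from the text: x = [1, 0] gives x^T A x = 2 > 0.
x = np.array([1.0, 0.0])
print(x @ A @ x)  # 2.0

# For any square A, x^T A x = x^T S x with S the symmetric part of A.
# The quadratic form is positive for all nonzero x iff S's eigenvalues are positive.
S = (A + A.T) / 2
print(np.linalg.eigvalsh(S))  # both eigenvalues are positive
```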
### Positive Semidefinite Case
A matrix is **positive semidefinite** if:
$$
\mathbf{x}^T A \mathbf{x} \geq 0,
$$
allowing the inner product to be zero ($\theta = 90^\circ$). The intuition mirrors the positive definite case, except the transformation may send some non-zero vectors perpendicular to the original (or to the zero vector).
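To make the boundary case concrete, here is a small sketch with an illustrative rank-1 symmetric matrix (my choice, not from the article) for which the quadratic form can hit exactly zero:

```python
import numpy as np

# A rank-1 symmetric matrix: x^T B x = (x1 + x2)^2 >= 0 for every x,
# so B is positive semidefinite but not positive definite.
B = np.array([[1.0, 1.0],
              [1.0, 1.0]])

# B maps x = [1, -1] to the zero vector, so the quadratic form is exactly 0.
x = np.array([1.0, -1.0])
print(x @ B @ x)  # 0.0
```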
## Conclusion
Matrix multiplication reinterprets vectors in new coordinate systems through linear transformations or basis changes, as seen in rotations. Positive (semi) definite matrices extend scalar intuition, ensuring transformed vectors retain directional alignment with the original. For further exploration, check out **3Blue1Brown's** [*Essence of Linear Algebra*](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab) or **Gilbert Strang's** [lecture](https://www.youtube.com/watch?v=xsP-S7yKaRA&t=189s).
## References
- [3Blue1Brown: Essence of Linear Algebra](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab)
- [MIT OpenCourseWare: Positive Definite and Semidefinite Matrices by Gilbert Strang](https://www.youtube.com/watch?v=xsP-S7yKaRA&t=189s)