# $$\textbf{Orthogonal Sets}$$
When working with a subspace $W$ of $\mathbb{R}^n$, one of the more interesting ideas is that we can $\textbf{generate}$ the entire space $W$ using only a finite number of vectors. We call a set of vectors that accomplishes this a $\textit{basis}$ for $W$. Of course, there is a multitude of sets of vectors that can accomplish this task, so one might wonder whether there is a preferred set of vectors to work with. As it turns out, there is a very interesting type of set that makes calculations much easier and is thus an ideal pick for a basis. We look at this set and its properties below.
Let $S= \{\vec u_1, \vec u_2, \cdots , \vec u_p \}$ be a set of vectors in $\mathbb{R}^n$. Then, we say that $S$ is an $\textbf{Orthogonal Set}$ if the vectors in $S$ are pairwise $\textit{orthogonal}$. That is, $S$ is an $\textbf{Orthogonal Set}$ provided that for all $1 \leq i, j \leq p$ with $i \neq j$,
$$\vec u_i \cdot \vec u_j = 0.$$
In essence, a set of vectors is considered orthogonal if each distinct pair of vectors in it is $\textit{orthogonal}$.
$\textbf{Example 1: Determining if a set of Vectors is Orthogonal}$
Let $S = \{\vec u_1, \vec u_2, \vec u_3\}$. Determine whether $S$ is an Orthogonal Set given that
$$\vec u_1 = \begin{bmatrix} \ 1 \\ 1 \\ 1 \\ \end{bmatrix}, \vec u_2 = \begin{bmatrix} \ -1 \\ 0 \\ 1 \\ \end{bmatrix}, \vec u_3 = \begin{bmatrix} \ 1 \\ -2 \\ 1 \\ \end{bmatrix}.$$
To see whether $S$ is orthogonal, we need to check whether each distinct pair of vectors is orthogonal. We see then that
$$a) \vec u_1 \cdot \vec u_2 = (1)(-1) +(1)(0) + (1)(1).$$
$$\vec u_1 \cdot \vec u_2 = (-1) +(0) + (1).$$
$$\vec u_1 \cdot \vec u_2 = 0.$$
Thus, vectors $\vec u_1$ and $\vec u_2$ are orthogonal to each other.
$$b) \vec u_1 \cdot \vec u_3 = (1)(1) +(1)(-2) + (1)(1).$$
$$\vec u_1 \cdot \vec u_3 = (1) +(-2) + (1).$$
$$\vec u_1 \cdot \vec u_3 = 0.$$
Thus, vectors $\vec u_1$ and $\vec u_3$ are orthogonal to each other.
$$c) \vec u_2 \cdot \vec u_3 = (-1)(1) +(0)(-2) + (1)(1).$$
$$\vec u_2 \cdot \vec u_3 = (-1) +(0) + (1).$$
$$\vec u_2 \cdot \vec u_3 = 0.$$
Thus, vectors $\vec u_2$ and $\vec u_3$ are orthogonal to each other. Since every distinct pair is orthogonal, $S$ is an Orthogonal Set.
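The arithmetic above is easy to verify numerically. Below is a minimal sketch using NumPy (not part of the original notes) that checks every distinct pair of vectors from Example 1:

```python
import numpy as np
from itertools import combinations

# Vectors from Example 1.
u1 = np.array([1, 1, 1])
u2 = np.array([-1, 0, 1])
u3 = np.array([1, -2, 1])

# A set is orthogonal when every distinct pair has dot product zero.
vectors = [u1, u2, u3]
pairwise = [int(np.dot(a, b)) for a, b in combinations(vectors, 2)]

print(pairwise)                       # [0, 0, 0]
print(all(d == 0 for d in pairwise))  # True: S is an Orthogonal Set
```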
What is fascinating about this type of set is that it immediately satisfies the condition necessary to be a basis for a subspace: linear independence. This fact is easily demonstrated in the proof of the following theorem.
$\textbf{Theorem 1}$: If the set $S= \{\vec u_1, \vec u_2, \cdots , \vec u_p\}$ consists of nonzero vectors in $\mathbb{R}^n$ and is Orthogonal, then $S$ is linearly independent.
$\textit{Proof}$:
Suppose that $S= \{\vec u_1, \vec u_2, \cdots , \vec u_p\}$ is a subset of $\mathbb{R}^n$ consisting of nonzero vectors and that $S$ is Orthogonal. The latter fact means that for every pair of vectors $\vec u_i$ and $\vec u_j$ in $S$
$$\vec u_i \cdot \vec u_j = 0$$
where $i \neq j$.
Now, to show that $S$ is linearly independent, suppose that $c_1, c_2, \cdots, c_p$ are scalars such that
$$c_1 \vec u_1 + c_2 \vec u_2 + \cdots + c_p \vec u_p = \vec 0.$$
It follows then that
$$(c_1 \vec u_1 + c_2 \vec u_2 + \cdots + c_p \vec u_p) \cdot \vec u_1 = (\vec 0) \cdot \vec u_1.$$
$$(c_1 \vec u_1) \cdot \vec u_1 + (c_2 \vec u_2) \cdot \vec u_1 + \cdots + (c_p \vec u_p) \cdot \vec u_1 = (\vec 0) \cdot \vec u_1.$$
$$c_1 (\vec u_1 \cdot \vec u_1) + c_2 (\vec u_2 \cdot \vec u_1) + \cdots + c_p (\vec u_p \cdot \vec u_1) = (\vec 0) \cdot \vec u_1.$$
$$c_1 (\vec u_1 \cdot \vec u_1) + c_2 (0) + \cdots + c_p (0) = (0).$$
$$c_1(\vec u_1 \cdot \vec u_1) = 0.$$
We observe that the product $\vec u_1 \cdot \vec u_1$ must be strictly positive. This is justified by the fact that the dot product of a vector with itself is the sum of the squares of its entries, and this sum is nonzero because $\vec u_1 \neq \vec 0$. Consequently, we see that $c_1 = 0$. A similar argument, dotting both sides with $\vec u_i$ instead of $\vec u_1$, establishes that $c_i = 0$ for all $\vec u_i \in S$, where $1 \leq i \leq p$.
This proves that $S$ is a linearly independent set.
Of course, since $S$ is linearly independent, it follows that it forms a basis for $Span\{\vec u_1, \vec u_2, \cdots , \vec u_p \}$. We call such a basis an $\textbf{Orthogonal Basis}$.
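One payoff of an Orthogonal Basis is that coordinates can be recovered with the same dot-product trick used in the proof: dotting $\vec v = c_1 \vec u_1 + \cdots + c_p \vec u_p$ with $\vec u_i$ kills every term except $c_i (\vec u_i \cdot \vec u_i)$, so $c_i = (\vec v \cdot \vec u_i)/(\vec u_i \cdot \vec u_i)$. A small NumPy sketch, using Example 1's basis and coefficients chosen purely for illustration:

```python
import numpy as np

# Orthogonal basis from Example 1.
basis = [np.array([1, 1, 1]),
         np.array([-1, 0, 1]),
         np.array([1, -2, 1])]

# Build a vector in the span with known (illustrative) coefficients.
v = 2 * basis[0] - 3 * basis[1] + 0.5 * basis[2]

# Dotting v with u_i isolates c_i, exactly as in the proof:
#   c_i = (v . u_i) / (u_i . u_i)
coeffs = [float(np.dot(v, u) / np.dot(u, u)) for u in basis]
print(coeffs)  # [2.0, -3.0, 0.5]
```

No linear system needs to be solved; each coefficient comes from two dot products, which is precisely why orthogonal bases are so convenient.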
It should be noted that the condition from $\textbf{Theorem 1}$ that $S$ contain $\textit{nonzero}$ vectors is essential. First, it is clear that a set containing the zero vector alongside other nonzero vectors will NOT be linearly independent. It turns out, however, that the set containing just the zero vector meets the remaining conditions of the theorem trivially. This strange fact comes from the observation that the set consisting of the zero vector is actually an $\textit{Orthogonal Set}$. We can see this below.
Let $\vec 0 \in \mathbb{R}^n$ and $\vec x \in \mathbb{R}^n$ where
$$\vec x = \begin{bmatrix} \ x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}.$$
We see then that
$$\vec x \cdot \vec 0 = \begin{bmatrix} \ x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix} \cdot \begin{bmatrix} \ 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}.$$
$$\vec x \cdot \vec 0 = (x_1 \cdot 0) + (x_2 \cdot 0) + \cdots + (x_n \cdot 0).$$
$$\vec x \cdot \vec 0 = 0.$$
Thus, $\vec 0$ is orthogonal to all vectors $\vec x \in \mathbb{R}^n$. Consequently, $\{\vec 0 \}$ is an Orthogonal Set.
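This computation can be spot-checked in NumPy; the vector $\vec x$ below is an arbitrary, randomly generated choice, not taken from the notes:

```python
import numpy as np

# The zero vector is orthogonal to every vector in R^n:
# each term of the dot product is x_i * 0 = 0.
rng = np.random.default_rng(0)
x = rng.standard_normal(4)     # an arbitrary vector in R^4
zero = np.zeros(4)

print(float(np.dot(x, zero)))  # 0.0
```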
# Exam Question:
Determine whether the following set forms an Orthogonal Basis for $\mathbb{R}^3$. $S = \{\vec v_1, \vec v_2, \vec v_3 \}$ where
$$\vec v_1 = \begin{bmatrix} \ 2 \\ 0 \\ 2 \\ \end{bmatrix}, \vec v_2 = \begin{bmatrix} \ 0 \\ 1 \\ -9 \\ \end{bmatrix}, \vec v_3 = \begin{bmatrix} \ 1 \\ -2 \\ -1 \\ \end{bmatrix}.$$