Chapter 5 extra note 5

Selected lecture notes

Rank–Nullity Theorem
Four fundamental subspaces
Least squares solution
Solvability condition

Rank–nullity theorem

Remarks:
It should be clear that, given a linear transformation $T : V \to W$,
$$\operatorname{Ker}(T) \oplus \operatorname{Ker}(T)^{\perp} = V, \qquad \operatorname{Ran}(T) \oplus \operatorname{Ran}(T)^{\perp} = W.$$

Proposition:
Let $V = W_1 \oplus W_2$, where $W_1$ and $W_2$ are finite-dimensional subspaces. Then
$$\dim(V) = \dim(W_1) + \dim(W_2).$$

  • Proof:

    (Hint of the proof)
    Let $\{u_1, \ldots, u_m\}$ be a basis of $W_1$ and $\{v_1, \ldots, v_n\}$ be a basis of $W_2$.
    Then show that $\{u_1, \ldots, u_m, v_1, \ldots, v_n\}$ is a basis of $V$.
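
As a quick numerical check of this proposition (a minimal sketch; the choice of $V = \mathbb{R}^3$ and the bases below are hypothetical examples, not from the notes), stacking a basis of $W_1$ next to a basis of $W_2$ should give a matrix of full rank $\dim(W_1) + \dim(W_2)$:

```python
import numpy as np

# Hypothetical direct sum in V = R^3: W1 = span{e1, e2}, W2 = span{(1,1,1)}.
B1 = np.array([[1., 0., 0.],
               [0., 1., 0.]]).T        # basis of W1 as columns (dim 2)
B2 = np.array([[1., 1., 1.]]).T        # basis of W2 as a column (dim 1)

# Following the hint: the union of the two bases should be a basis of V,
# i.e. the stacked matrix has rank dim(W1) + dim(W2) = 3 = dim(V).
B = np.hstack([B1, B2])
print(np.linalg.matrix_rank(B))        # 3
```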

Proposition:
Let $T : V \to W$ be linear. Then
$$\dim(V) = \dim(\operatorname{Ker}(T)) + \dim(\operatorname{Ker}(T)^{\perp}),$$

and
$$\dim(W) = \dim(\operatorname{Ran}(T)) + \dim(\operatorname{Ran}(T)^{\perp}).$$

The 'essential' part of a linear transformation

Let us define the map $\tilde{T}$ that is the restriction of $T$ to the domain $\operatorname{Ker}(T)^{\perp}$, with range $\operatorname{Ran}(T)$:
$$\tilde{T} : \operatorname{Ker}(T)^{\perp} \subset V \to \operatorname{Ran}(T) \subset W.$$

Lemma:

$\tilde{T}$ is an isomorphism.

  • Proof:

    (linearity) Trivial.

    (1-1) Claim: $\operatorname{Ker}(\tilde{T}) = \{0\}$.

    Since $\operatorname{Ker}(\tilde{T})$ is a vector subspace, $0 \in \operatorname{Ker}(\tilde{T})$.
    So $\{0\} \subseteq \operatorname{Ker}(\tilde{T})$.

    Suppose $u \in \operatorname{Ker}(\tilde{T})$. Then we must have $u \in \operatorname{Ker}(T)^{\perp}$ and $\tilde{T}(u) = 0$.
    So $T(u) = 0$, and hence $u \in \operatorname{Ker}(T)$.
    Therefore, $u \in \operatorname{Ker}(T)^{\perp} \cap \operatorname{Ker}(T) = \{0\}$.
    That gives $\operatorname{Ker}(\tilde{T}) \subseteq \{0\}$.

    (onto) Claim: Given $w \in \operatorname{Ran}(T)$, there exists $u \in \operatorname{Ker}(T)^{\perp}$ such that $\tilde{T}(u) = w$.

    Given $w \in \operatorname{Ran}(T)$, there exists $v \in V$ such that $T(v) = w$.
    Since $v \in V = \operatorname{Ker}(T)^{\perp} \oplus \operatorname{Ker}(T)$, there exist $u \in \operatorname{Ker}(T)^{\perp}$ and $v_k \in \operatorname{Ker}(T)$ such that $v = u + v_k$.
    We then have
    $$w = T(v) = T(u + v_k) = T(u) + T(v_k) = T(u) = \tilde{T}(u).$$

Corollary:

  1. $\operatorname{Ker}(T)^{\perp}$ and $\operatorname{Ran}(T)$ are isomorphic.
  2. $\dim(\operatorname{Ker}(T)^{\perp}) = \dim(\operatorname{Ran}(T))$.
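
As a concrete illustration of the lemma (a NumPy sketch with a hypothetical rank-2 matrix; for a matrix $A$, $\operatorname{Ker}(A)^{\perp}$ is the row space): for $w \in \operatorname{Ran}(T)$, the Moore–Penrose pseudoinverse returns a preimage lying in $\operatorname{Ker}(T)^{\perp}$, i.e. it realizes $\tilde{T}^{-1}$:

```python
import numpy as np

# Hypothetical example: A is 3x4 with rank 2 (row 3 = row 1 + row 2).
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])
w = A @ np.array([1., -1., 2., 0.])    # some w in Ran(T) by construction

u = np.linalg.pinv(A) @ w              # minimum-norm preimage of w
print(np.allclose(A @ u, w))           # True: T~(u) = w (onto)

# u lies in Ker(T)^⊥: it is orthogonal to a basis of Ker(A) from the SVD.
_, s, Vt = np.linalg.svd(A)
ker_T = Vt[np.sum(s > 1e-10):, :].T    # columns span Ker(A)
print(np.allclose(ker_T.T @ u, 0))     # True: u ⟂ Ker(T)
```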

Rank–Nullity Theorem:
Let $T : V \to W$ be linear. Then
$$\dim(V) = \dim(\operatorname{Ker}(T)) + \dim(\operatorname{Ran}(T)).$$

  • Proof:

    $\dim(V) = \dim(\operatorname{Ker}(T)) + \dim(\operatorname{Ker}(T)^{\perp}) = \dim(\operatorname{Ker}(T)) + \dim(\operatorname{Ran}(T)).$
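
A minimal NumPy sketch verifying the theorem on a concrete matrix (the $3 \times 4$ example below is hypothetical): the number of nonzero singular values gives $\dim(\operatorname{Ran}(T))$, and the remaining columns count $\dim(\operatorname{Ker}(T))$:

```python
import numpy as np

# Hypothetical T: R^4 -> R^3 given by a rank-2 matrix.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])

_, s, _ = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))          # dim(Ran(T))
nullity = A.shape[1] - rank            # dim(Ker(T))

print(rank, nullity, rank + nullity)   # 2 2 4, and dim(V) = 4
```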

Four fundamental subspaces

Theorem:

  1. $\operatorname{Ker}(T^{*}) = \operatorname{Ran}(T)^{\perp}$.
    • Proof:

      Let $x \in \operatorname{Ran}(T)^{\perp}$. Then $\langle x, w \rangle = 0$ for all $w \in \operatorname{Ran}(T)$, that is, $\langle x, T(v) \rangle = 0$ for all $v \in V$. By the definition of the adjoint, $\langle T^{*}(x), v \rangle = 0$ for all $v \in V$, so $T^{*}(x) = 0$, i.e. $x \in \operatorname{Ker}(T^{*})$. Since each step is reversible, the reverse inclusion also holds.

  2. $\operatorname{Ker}(T) = \operatorname{Ran}(T^{*})^{\perp}$.
  3. $\operatorname{Ran}(T) = \operatorname{Ker}(T^{*})^{\perp}$.
  4. $\operatorname{Ran}(T^{*}) = \operatorname{Ker}(T)^{\perp}$.

Corollary:
Let $T : V \to W$ be linear. Then
$$\operatorname{Ker}(T) \oplus \operatorname{Ran}(T^{*}) = V, \qquad \operatorname{Ran}(T) \oplus \operatorname{Ker}(T^{*}) = W.$$
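
The SVD makes this corollary concrete. In the real matrix case $T^{*}$ corresponds to $A^{T}$, and if $A = U \Sigma V^{T}$ with $r = \operatorname{rank}(A)$, the first $r$ columns of $U$ span $\operatorname{Ran}(T)$, the remaining ones span $\operatorname{Ker}(T^{*})$, and the columns of $V$ split analogously between $\operatorname{Ran}(T^{*})$ and $\operatorname{Ker}(T)$. A sketch with a hypothetical matrix:

```python
import numpy as np

# Hypothetical rank-2 example; real case, so T* corresponds to A.T.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))             # rank(A)

ran_T      = U[:, :r]                  # orthonormal basis of Ran(T)
ker_T_star = U[:, r:]                  # basis of Ker(T^*) = Ran(T)^⊥
ran_T_star = Vt[:r, :].T               # basis of Ran(T^*) = Ker(T)^⊥
ker_T      = Vt[r:, :].T               # basis of Ker(T)

print(np.allclose(A.T @ ker_T_star, 0))   # T^* kills Ker(T^*)
print(np.allclose(A @ ker_T, 0))          # T kills Ker(T)
# Ran(T) ⊕ Ker(T^*) = W: together the columns are an orthonormal basis.
print(np.allclose(np.hstack([ran_T, ker_T_star]).T
                  @ np.hstack([ran_T, ker_T_star]), np.eye(3)))
```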

Least squares solution

Let $T : V \to W$ be a linear transformation. We consider solving the problem $T(v) = w$, where $w$ is a given vector.

  • If $w \in \operatorname{Ran}(T)$, solutions exist.
    • If $w \in \operatorname{Ran}(T)$ and $\operatorname{Ker}(T) = \{0\}$, there exists a unique solution.
  • If $w \notin \operatorname{Ran}(T)$, there does not exist any solution.
    • We can define the least squares solution as
      $$v^{*} = \arg\min_{v \in V} \|T(v) - w\|_2^2.$$
      • The minimum is achieved when $w$ is projected onto $\operatorname{Ran}(T)$, and hence the least squares solution satisfies
        $$T(v^{*}) = P_{\operatorname{Ran}(T)}\, w,$$
        where $P_{\operatorname{Ran}(T)}$ denotes the orthogonal projection onto $\operatorname{Ran}(T)$.
      • Equivalently, the residual is orthogonal to the range:
        $$(w - T(v^{*})) \perp \operatorname{Ran}(T).$$

        In other words,
        $$\langle w - T(v^{*}), T(v) \rangle = 0, \quad \forall v \in V.$$

        We can also rewrite this as
        $$\langle T(v^{*}), T(v) \rangle = \langle w, T(v) \rangle, \quad \forall v \in V.$$

        Using the adjoint transformation, we obtain
        $$\langle T^{*}T(v^{*}), v \rangle = \langle T^{*}(w), v \rangle, \quad \forall v \in V.$$

        That is,
        $$T^{*}T(v^{*}) = T^{*}(w),$$

        which is often called the normal equation for the least squares solution.
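
A short NumPy sketch of the normal equation (the overdetermined system below is a hypothetical example with $\operatorname{Ker}(T) = \{0\}$, so the least squares solution is unique); it also checks the residual orthogonality $(w - T(v^{*})) \perp \operatorname{Ran}(T)$:

```python
import numpy as np

# Hypothetical overdetermined system: A is 4x2 with full column rank.
A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.],
              [1., 3.]])
w = np.array([0., 1., 1., 3.])               # generally not in Ran(T)

v_star = np.linalg.solve(A.T @ A, A.T @ w)   # normal equation A^T A v = A^T w
v_ref, *_ = np.linalg.lstsq(A, w, rcond=None)
print(np.allclose(v_star, v_ref))            # True: agrees with lstsq

# Residual orthogonality: A^T (w - A v*) = 0.
print(np.allclose(A.T @ (w - A @ v_star), 0))  # True
```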

Solvability condition

Notice that $\operatorname{Ran}(T) = \operatorname{Ker}(T^{*})^{\perp}$, which means that if $w \in \operatorname{Ran}(T)$, then $w \perp \operatorname{Ker}(T^{*})$. We then obtain the solvability condition
$$\langle w, u \rangle = 0, \quad \forall u \in \operatorname{Ker}(T^{*}).$$
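
A sketch of this check in NumPy (the rank-deficient example below is hypothetical): compute a basis of $\operatorname{Ker}(T^{*})$ from the SVD and test $\langle w, u \rangle = 0$ against candidate right-hand sides:

```python
import numpy as np

# Hypothetical example: Ran(A) is a 2-dimensional plane in R^3.
A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
U, s, _ = np.linalg.svd(A)
ker_T_star = U[:, int(np.sum(s > 1e-10)):]    # basis of Ker(T^*)

w_good = A @ np.array([2., -1.])              # in Ran(T) by construction
w_bad  = np.array([1., 1., 0.])               # not in Ran(T)

print(np.allclose(ker_T_star.T @ w_good, 0))  # True: T(v) = w_good solvable
print(np.allclose(ker_T_star.T @ w_bad, 0))   # False: no solution exists
```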