[TOC]
# Differential Equations with Boundary Value Problems
## Chapter 2 First Order Differential Equations
If the functions $p$ and $g$ are continuous on an open interval $I : \alpha<t<\beta$ containing the
point $t=t_{0},$ then there exists a unique function $y=\phi(t)$ that satisfies the differential
equation
$$
y^{\prime}+p(t) y=g(t)
$$
for each $t$ in $I,$ and that also satisfies the initial condition
$$
y\left(t_{0}\right)=y_{0}
$$
where $y_{0}$ is an arbitrary prescribed initial value.
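As a quick sanity check of the theorem, the sketch below solves a hypothetical instance with SymPy (the choices $p(t)=2$, $g(t)=t$, $y(0)=1$ are mine, not from the text) and verifies both the equation and the initial condition:

```python
import sympy as sp

t = sp.symbols('t')
y = sp.Function('y')

# Hypothetical instance of y' + p(t) y = g(t): p(t) = 2, g(t) = t, y(0) = 1
sol = sp.dsolve(sp.Eq(y(t).diff(t) + 2*y(t), t), y(t), ics={y(0): 1})
phi = sol.rhs

# phi satisfies the ODE identically and the initial condition exactly
assert sp.simplify(phi.diff(t) + 2*phi - t) == 0
assert phi.subs(t, 0) == 1
```

Here $\phi(t)=t/2-1/4+\frac{5}{4}e^{-2t}$, and since $p$ and $g$ are continuous everywhere, the theorem guarantees this is the only solution on $-\infty<t<\infty$.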
## Chapter 3 Second Order Linear Equations
### 3.1 Homogeneous Equations with Constant Coefficients
$ay''+by'+cy=0$
Characteristic equation $ar^2+br+c=0$
If the two roots $r_1, r_2$ of the characteristic equation are real and distinct, the general solution is $y=c_1e^{r_1t}+c_2e^{r_2t}$
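A minimal numeric sketch of this case (the coefficients $a=1$, $b=-1$, $c=-6$ are assumed for illustration): the roots of the characteristic polynomial generate exponentials whose combination satisfies the ODE for any constants.

```python
import numpy as np

# Assumed example: y'' - y' - 6y = 0, characteristic equation r^2 - r - 6 = 0
a, b, c = 1.0, -1.0, -6.0
r1, r2 = np.roots([a, b, c])          # roots 3 and -2

# y = c1 e^{r1 t} + c2 e^{r2 t} satisfies the equation for any c1, c2
c1, c2 = 2.0, -1.0
t = np.linspace(0.0, 1.0, 50)
y   = c1*np.exp(r1*t) + c2*np.exp(r2*t)
yp  = c1*r1*np.exp(r1*t) + c2*r2*np.exp(r2*t)
ypp = c1*r1**2*np.exp(r1*t) + c2*r2**2*np.exp(r2*t)
assert np.allclose(a*ypp + b*yp + c*y, 0.0)
```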
**Equations with the Dependent Variable Missing**
*Example*
$t^{2} y^{\prime \prime}+2 t y^{\prime}-1=0, \quad t>0$
$y^{\prime \prime}+t\left(y^{\prime}\right)^{2}=0$
$y^{\prime \prime}+y^{\prime}=e^{-t}$
$t y^{\prime \prime}+y^{\prime}=1, \quad t>0$
$2 t^{2} y^{\prime \prime}+\left(y^{\prime}\right)^{3}=2 t y^{\prime}, \quad t>0$
$t^{2} y^{\prime \prime}=\left(y^{\prime}\right)^{2}, \quad t>0$
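For equations with $y$ missing, the substitution $v=y'$ reduces the order. A sketch for $ty''+y'=1$ from the list above (my verification, not the textbook's worked steps): with $v=y'$ the equation becomes $(tv)'=1$, so $v=1+c_1/t$ and $y=t+c_1\ln t+c_2$.

```python
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2', positive=True)

# t y'' + y' = 1: with v = y' this is (t v)' = 1, so v = 1 + c1/t,
# and integrating once more gives y = t + c1 ln t + c2.
y = t + c1*sp.log(t) + c2
assert sp.simplify(t*y.diff(t, 2) + y.diff(t) - 1) == 0
```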
**Equations with the Independent Variable Missing**
*Example*
$y y^{\prime \prime}+\left(y^{\prime}\right)^{2}=0$
$y^{\prime \prime}+y=0$
$y^{\prime \prime}+y\left(y^{\prime}\right)^{3}=0$
$2 y^{2} y^{\prime \prime}+2 y\left(y^{\prime}\right)^{2}=1$
$y y^{\prime \prime}-\left(y^{\prime}\right)^{3}=0$
$y^{\prime \prime}+\left(y^{\prime}\right)^{2}=2 e^{-y}$
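For equations with $t$ missing, set $v=y'$ and treat $v$ as a function of $y$, so $y''=v\,dv/dy$. A check for the first equation above: $yy''+(y')^2=(yy')'=0$, so $yy'$ is constant and $y^2=c_1t+c_2$.

```python
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2', positive=True)

# y y'' + (y')^2 = (y y')' = 0, so y y' = const and y^2 = c1 t + c2
y = sp.sqrt(c1*t + c2)
assert sp.simplify(y*y.diff(t, 2) + y.diff(t)**2) == 0
```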
### 3.2 Fundamental Solutions of Linear Homogeneous Equations
**Existence and Uniqueness Theorem**
Consider the initial value problem
$$
y^{\prime \prime}+p(t) y^{\prime}+q(t) y=g(t), \quad y\left(t_{0}\right)=y_{0}, \quad y^{\prime}\left(t_{0}\right)=y_{0}^{\prime}
$$
where $p, q,$ and $g$ are continuous on an open interval $I$ . Then there is exactly one
solution $y=\phi(t)$ of this problem, and the solution exists throughout the interval $I$ .
The solution $\phi$ is defined throughout the interval $I$ where the coefficients are continuous and is at least twice differentiable there.
*Example* Find the longest interval in which the solution of the following initial value problem is certain to exist
$$
\left(t^{2}-3 t\right) y^{\prime \prime}+t y^{\prime}-(t+3) y=0, \quad y(1)=2, \quad y^{\prime}(1)=1
$$
*Solution* In standard form $p(t)=1/(t-3)$ and $q(t)=-(t+3)/t(t-3)$, which are continuous except at $t=0$ and $t=3$. Since $t_{0}=1$ lies in $(0,3)$, the longest interval is $0<t<3$.
*Example* Find the unique solution of the initial value problem
$$
y^{\prime \prime}+p(t) y^{\prime}+q(t) y=0, \quad y\left(t_{0}\right)=0, \quad y^{\prime}\left(t_{0}\right)=0
$$
*Solution* $y=0$ is clearly a solution, and by the *Uniqueness Theorem* it is the only solution.
**Exact Equations** The equation $P(x) y^{\prime \prime}+Q(x) y^{\prime}+R(x) y=0$ is said to be exact if it
can be written in the form $\left[P(x) y^{\prime}\right]^{\prime}+[f(x) y]^{\prime}=0$
The equation is exact if and only if $P^{\prime \prime}(x)-Q^{\prime}(x)+R(x)=0$
*Examples*
$y^{\prime \prime}+x y^{\prime}+y=0$
$y^{\prime \prime}+3 x^{2} y^{\prime}+x y=0$
$x y^{\prime \prime}-(\cos x) y^{\prime}+(\sin x) y=0$
$x^{2} y^{\prime \prime}+x y^{\prime}-y=0, \quad x>0$
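The exactness condition is easy to check mechanically; a sketch over the four examples above:

```python
import sympy as sp

x = sp.symbols('x')

def is_exact(P, Q, R):
    """P y'' + Q y' + R y = 0 is exact iff P'' - Q' + R == 0."""
    return sp.simplify(P.diff(x, 2) - Q.diff(x) + R) == 0

assert is_exact(sp.S(1), x, sp.S(1))             # y'' + x y' + y = 0: exact
assert not is_exact(sp.S(1), 3*x**2, x)          # y'' + 3x^2 y' + x y = 0: not exact
assert is_exact(x, -sp.cos(x), sp.sin(x))        # x y'' - (cos x) y' + (sin x) y = 0: exact
assert is_exact(x**2, x, sp.S(-1))               # x^2 y'' + x y' - y = 0: exact
```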
**The Adjoint Equation** If a second order linear homogeneous equation is not exact, it can be made exact by multiplying by an appropriate integrating factor $\mu(x) .$ Thus we require that $\mu(x)$ be such that $\mu(x) P(x) y^{\prime \prime}+\mu(x) Q(x) y^{\prime}+\mu(x) R(x) y=0$ can be written in the form $\left[\mu(x) P(x) y^{\prime}\right]^{\prime}+[f(x) y]^{\prime}=0 .$ By equating coefficients in these two equations and eliminating $f(x),$ show that the function $\mu$ must satisfy
$$
P \mu^{\prime \prime}+\left(2 P^{\prime}-Q\right) \mu^{\prime}+\left(P^{\prime \prime}-Q^{\prime}+R\right) \mu=0
$$
The equation is known as the adjoint of the original equation
*Example* Find the adjoint equation of the following equations
$x^{2} y^{\prime \prime}+x y^{\prime}+\left(x^{2}-\nu^{2}\right) y=0, \quad$ Bessel's equation
$\left(1-x^{2}\right) y^{\prime \prime}-2 x y^{\prime}+\alpha(\alpha+1) y=0, \quad$ Legendre's equation
$y^{\prime \prime}-x y=0, \quad$ Airy's equation
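Computing the adjoint is a direct application of the formula above; a sketch checking that Airy's and Legendre's equations are self-adjoint while Bessel's is not:

```python
import sympy as sp

x, alpha, nu = sp.symbols('x alpha nu')

def adjoint(P, Q, R):
    """Coefficients of the adjoint P mu'' + (2P' - Q) mu' + (P'' - Q' + R) mu = 0."""
    return P, sp.expand(2*P.diff(x) - Q), sp.expand(P.diff(x, 2) - Q.diff(x) + R)

# Airy's equation y'' - x y = 0: its adjoint is mu'' - x mu = 0 (self-adjoint)
assert adjoint(sp.S(1), sp.S(0), -x) == (1, 0, -x)

# Legendre's equation is self-adjoint as well
P, Q, R = 1 - x**2, -2*x, alpha*(alpha + 1)
assert all(sp.simplify(u - v) == 0 for u, v in zip(adjoint(P, Q, R), (P, Q, R)))

# Bessel's equation is not: 2P' - Q = 4x - x = 3x differs from Q = x
assert adjoint(x**2, x, x**2 - nu**2)[1] == 3*x
```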
**Self-Adjoint** An equation is self-adjoint if its adjoint is the equation itself; for $P(x) y^{\prime \prime}+Q(x) y^{\prime}+R(x) y=0$, a necessary and sufficient condition is that $P^{\prime}(x)=Q(x)$
*Exercise* For the second order linear equation $P(x) y^{\prime \prime}+Q(x) y^{\prime}+R(x) y=0,$ show that the adjoint of the adjoint equation is the original equation.
### 3.3 Linear Independence and the Wronskian
**Linear Independence** If $f$ and $g$ are differentiable functions on an open interval $I$ and if $W(f, g)\left(t_{0}\right) \neq 0$ for some point $t_{0}$ in $I,$ then $f$ and $g$ are linearly independent on $I .$ Moreover, if $f$ and $g$ are linearly dependent on $I,$ then $W(f, g)(t)=0$ for every $t$ in $I .$
*Proof* Consider the system of equations in $k_1, k_2$
$\begin{aligned} k_{1} f\left(t_{0}\right)+k_{2} g\left(t_{0}\right) &=0 \\ k_{1} f^{\prime}\left(t_{0}\right)+k_{2} g^{\prime}\left(t_{0}\right) &=0 \end{aligned}$
**Abel's Theorem** If $y_1$ and $y_2$ are solutions of the differential equation
$$
L[y]=y^{\prime \prime}+p(t) y^{\prime}+q(t) y=0
$$
where $p$ and $q$ are continuous on an open interval $I,$ then the Wronskian $W\left(y_{1}, y_{2}\right)(t)$ is given by
$$
W\left(y_{1}, y_{2}\right)(t)=c \exp \left[-\int p(t) d t\right]
$$
where $c$ is a certain constant that depends on $y_{1}$ and $y_{2},$ but not on $t .$ Further, $W\left(y_{1}, y_{2}\right)(t)$ is either zero for all $t$ in $I(\text { if } c=0)$ or else is never zero in $I$ (if $c \neq 0 )$
*Proof*
$y_{1}^{\prime \prime}+p(t) y_{1}^{\prime}+q(t) y_{1}=0$
$y_{2}^{\prime \prime}+p(t) y_{2}^{\prime}+q(t) y_{2}=0$
$\left(y_{1} y_{2}^{\prime \prime}-y_{1}^{\prime \prime} y_{2}\right)+p(t)\left(y_{1} y_{2}^{\prime}-y_{1}^{\prime} y_{2}\right)=0$
$W^{\prime}+p(t) W=0$
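A concrete check of Abel's formula (the equation $y''+(1/t)\,y'=0$ with solutions $1$ and $\ln t$ is my choice, not from the text):

```python
import sympy as sp

t = sp.symbols('t', positive=True)

# y'' + (1/t) y' = 0 has solutions y1 = 1, y2 = ln t (here p(t) = 1/t)
y1, y2 = sp.S(1), sp.log(t)
W = sp.simplify(y1*y2.diff(t) - y1.diff(t)*y2)   # Wronskian = 1/t
abel = sp.exp(-sp.integrate(1/t, t))             # exp(-∫ p dt) = 1/t
assert sp.simplify(W - abel) == 0                # so c = 1 for this pair
```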
### 3.4 Complex Roots of the Characteristic Equation
**Euler's Formula** $e^{it}=\cos t+i \sin t$. For complex roots $r=\lambda \pm i \mu$ of the characteristic equation, the general solution is $y=c_{1} e^{\lambda t} \cos \mu t+c_{2} e^{\lambda t} \sin \mu t$
**Change of Variables**
*Examples*
$y^{\prime \prime}+t y^{\prime}+e^{-t^{2}} y=0, \quad-\infty<t<\infty$
$y^{\prime \prime}+3 t y^{\prime}+t^{2} y=0, \quad-\infty<t<\infty$
$t y^{\prime \prime}+\left(t^{2}-1\right) y^{\prime}+t^{3} y=0, \quad 0<t<\infty$
**Euler Equations** $t^{2} y^{\prime \prime}+\alpha t y^{\prime}+\beta y=0, \quad t>0$
*Solution* Substitute $x=\ln t$ to obtain an equation with constant coefficients
*Examples*
$\begin{array}{ll}{t^{2} y^{\prime \prime}-3 t y^{\prime}+4 y=0,} & {t>0} \\ {t^{2} y^{\prime \prime}+2 t y^{\prime}+0.25 y=0,} & {t>0}\end{array}$
### 3.5 Repeated Roots; Reduction of Order
Suppose we know one solution $y_1(t)$, not everywhere zero, of
$$
y^{\prime \prime}+p(t) y^{\prime}+q(t) y=0
$$
To find a second solution, let
$$
y=v(t) y_{1}(t)
$$
Then we get
$$
y_{1} v^{\prime \prime}+\left(2 y_{1}^{\prime}+p y_{1}\right) v^{\prime}=0
$$
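A sketch of reduction of order on an assumed repeated-root example, $y''-2y'+y=0$ with known $y_1=e^t$: here $2y_1'+py_1=2e^t-2e^t=0$, so the equation above reduces to $v''=0$, and $v=t$ gives the second solution $y_2=te^t$.

```python
import sympy as sp

t = sp.symbols('t')

# y'' - 2y' + y = 0 with known solution y1 = e^t; reduction of order
# gives v'' = 0 (since 2 y1' + p y1 = 0), hence v = t and y2 = t e^t
y1 = sp.exp(t)
y2 = t*sp.exp(t)
assert sp.simplify(y2.diff(t, 2) - 2*y2.diff(t) + y2) == 0

# The pair is fundamental: the Wronskian e^{2t} never vanishes
W = sp.simplify(y1*y2.diff(t) - y1.diff(t)*y2)
assert sp.simplify(W - sp.exp(2*t)) == 0
```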
### 3.6 Nonhomogeneous Equations; Method of Undetermined Coefficients
| $g_{i}(t)$ | $Y_{i}(t)$ |
| ------------------------------------------------------------ | ------------------------------------------------------------ |
| $P_{n}(t)=a_{0} t^{n}+a_{1} t^{n-1}+\cdots+a_{n}$ | $t^{s}\left(A_{0} t^{n}+A_{1} t^{n-1}+\cdots+A_{n}\right)$ |
| $P_{n}(t) e^{\alpha t}$ | $t^{s}\left(A_{0} t^{n}+A_{1} t^{n-1}+\cdots+A_{n}\right) e^{\alpha t}$ |
| $P_{n}(t) e^{\alpha t}\left\{\begin{array}{l}{\sin \beta t} \\ {\cos \beta t}\end{array}\right.$ | $t^{s}\left[\left(A_{0} t^{n}+A_{1} t^{n-1}+\cdots+A_{n}\right) e^{\alpha t} \cos \beta t\right.\\+\left(B_{0} t^{n}+B_{1} t^{n-1}+\cdots+B_{n}\right) e^{\alpha t} \sin \beta t ]$ |
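A sketch of the $P_{n}(t)e^{\alpha t}$ row in action, for the assumed equation $y''-3y'-4y=3e^{2t}$ (so $s=0$, since $2$ is not a root of $r^2-3r-4=0$):

```python
import sympy as sp

t, A = sp.symbols('t A')

# Trial form from the table: Y = A e^{2t} with s = 0
Y = A*sp.exp(2*t)
residual = Y.diff(t, 2) - 3*Y.diff(t) - 4*Y - 3*sp.exp(2*t)

# (4A - 6A - 4A - 3) e^{2t} = 0  =>  A = -1/2
sol = sp.solve(sp.simplify(residual/sp.exp(2*t)), A)
assert sol == [sp.Rational(-1, 2)]
```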
### 3.7 Variation of Parameters
*Example* Find a particular solution of
$$
y^{\prime \prime}+4 y=3 \csc t
$$
*Solution* $y=u_{1}(t) \cos 2 t+u_{2}(t) \sin 2 t$
$u_{1}^{\prime}(t) \cos 2 t+u_{2}^{\prime}(t) \sin 2 t=0$
$-2 u_{1}^{\prime}(t) \sin 2 t+2 u_{2}^{\prime}(t) \cos 2 t=3 \csc t$
$y=3 \sin t+\frac{3}{2} \ln |\csc t-\cot t| \sin 2 t+c_{1} \cos 2 t+c_{2} \sin 2 t$
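A numerical spot-check that the particular solution above satisfies $y''+4y=3\csc t$ on $0<t<\pi$ (where $\csc t-\cot t>0$, so the absolute value can be dropped):

```python
import sympy as sp

t = sp.symbols('t')

# Particular solution from the example, valid on 0 < t < pi
y = 3*sp.sin(t) + sp.Rational(3, 2)*sp.log(sp.csc(t) - sp.cot(t))*sp.sin(2*t)
expr = y.diff(t, 2) + 4*y - 3*sp.csc(t)

# The residual vanishes at several sample points
for tv in (0.3, 0.7, 1.2, 2.0):
    assert abs(float(expr.evalf(subs={t: tv}))) < 1e-8
```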
**Variation of Parameters**
$$
y^{\prime \prime}+p(t) y^{\prime}+q(t) y=g(t)
\\y=u_{1}(t) y_{1}(t)+u_{2}(t) y_{2}(t)
$$
$$
u_{1}^{\prime}(t) y_{1}(t)+u_{2}^{\prime}(t) y_{2}(t)=0
\\u_{1}^{\prime}(t) y_{1}^{\prime}(t)+u_{2}^{\prime}(t) y_{2}^{\prime}(t)=g(t)
$$
$$
u_{1}^{\prime}(t)=-\frac{y_{2}(t) g(t)}{W\left(y_{1}, y_{2}\right)(t)}, \quad u_{2}^{\prime}(t)=\frac{y_{1}(t) g(t)}{W\left(y_{1}, y_{2}\right)(t)}
$$
## Chapter 4 Higher Order Linear Equations
### 4.1 General Theory of *n*th Order Linear Equations
**Theorem** If the functions $p_{1}, p_{2}, \ldots, p_{n},$ and $g$ are continuous on the open interval $I,$ then there exists exactly one solution $y=\phi(t)$ of the differential equation $L[y]=\frac{d^{n} y}{d t^{n}}+p_{1}(t) \frac{d^{n-1} y}{d t^{n-1}}+\cdots+p_{n-1}(t) \frac{d y}{d t}+p_{n}(t) y=g(t)$ that also satisfies the initial conditions below. This solution exists throughout the interval $I$.
$y\left(t_{0}\right)=y_{0}, \quad y^{\prime}\left(t_{0}\right)=y_{0}^{\prime}, \quad \ldots, \quad y^{(n-1)}\left(t_{0}\right)=y_{0}^{(n-1)}$
Wronskian
$$
W\left(y_{1}, \ldots, y_{n}\right)=\left|\begin{array}{cccc}{y_{1}} & {y_{2}} & {\cdots} & {y_{n}} \\ {y_{1}^{\prime}} & {y_{2}^{\prime}} & {\cdots} & {y_{n}^{\prime}} \\ {\vdots} & {\vdots} & {} & {\vdots} \\ {y_{1}^{(n-1)}} & {y_{2}^{(n-1)}} & {\cdots} & {y_{n}^{(n-1)}}\end{array}\right|
$$
The characteristic equation, the method of undetermined coefficients, and variation of parameters all carry over from second order linear equations.
## Chapter 5 Series Solutions of Second Order Linear Equations
### 5.1 Review of Power Series
skip
### 5.2 Series Solutions near an Ordinary Point, part I
$$
P(x) \frac{d^{2} y}{d x^{2}}+Q(x) \frac{d y}{d x}+R(x) y=0
$$
*Ordinary Points* $x_0$ is an ordinary point if $Q/P$ and $R/P$ are analytic at $x_0$
*Singular Points* otherwise
*Example* Find a series solution of the equation
$$
y''+y=0\quad \quad -\infty<x<\infty
$$
*Recurrence Relation*
$$
(n+2)(n+1) a_{n+2}+a_{n}=0, \quad n=0,1,2,3, \ldots
$$
*Example* Find a series solution in powers of $x$ of Airy's equation
$$
y^{\prime \prime}-x y=0, \quad-\infty<x<\infty
$$
*Recurrence Relation*
$$
(n+2)(n+1) a_{n+2}=a_{n-1} \quad \text { for } \quad n=1,2,3, \ldots
$$
$$
y=a_{0}\left[1+\sum_{n=1}^{\infty} \frac{x^{3 n}}{2 \cdot 3 \cdots(3 n-1)(3 n)}\right]+\\a_{1}\left[x+\sum_{n=1}^{\infty} \frac{x^{3 n+1}}{3 \cdot 4 \cdots(3 n)(3 n+1)}\right]
$$
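A numeric check of the Airy recurrence: building coefficients from $(n+2)(n+1)a_{n+2}=a_{n-1}$ (with $a_2=0$; here $a_0=1$, $a_1=0$ are my chosen initial data) and evaluating a truncated series shows the residual $y''-xy$ is negligible.

```python
import numpy as np

# Coefficients from (n+2)(n+1) a_{n+2} = a_{n-1}; take a0 = 1, a1 = 0 (so a2 = 0)
N = 30
a = np.zeros(N + 3)
a[0] = 1.0
for n in range(1, N + 1):
    a[n + 2] = a[n - 1] / ((n + 2)*(n + 1))

x = 0.5
y   = sum(a[k]*x**k for k in range(N + 3))
ypp = sum(k*(k - 1)*a[k]*x**(k - 2) for k in range(2, N + 3))
assert abs(ypp - x*y) < 1e-12   # truncated series nearly satisfies y'' - x y = 0
```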
### 5.3 Series Solutions near an Ordinary Point, Part II
*Example* Find the radii of convergence of the Taylor series for $\left(1+x^{2}\right)^{-1}$ about $x=0$ and for $\left(x^{2}-2 x+2\right)^{-1}$ about $x=0$.
*Answer* $1$, $\sqrt{2}$
*Example* Determine a lower bound for the radius of convergence of series solutions of the
differential equation
$$
\left(1+x^{2}\right) y^{\prime \prime}+2 x y^{\prime}+4 x^{2} y=0
$$
about the point $x=0 ;$ about the point $x=-\frac{1}{2}$ .
*Solution* The zeros of $1+x^{2}$ are $x=\pm i$, at distance $1$ from $x=0$ and $\sqrt{1/4+1}=\sqrt{5}/2$ from $x=-\frac{1}{2}$, so the series converge at least for $|x|<1$ and $\left|x+\frac{1}{2}\right|<\sqrt{5}/2$
### 5.4 Regular Singular Points
**Regular Singular Points** A singular point $x_0$ is regular if
$\lim\limits _{x \rightarrow x_{0}}\left(x-x_{0}\right) \frac{Q(x)}{P(x)}$ is finite and
$\lim\limits _{x \rightarrow x_{0}}\left(x-x_{0}\right)^{2} \frac{R(x)}{P(x)}$ is finite.
### 5.5 Euler Equations
*Euler Equation* $L[y]=x^{2} y^{\prime \prime}+\alpha x y^{\prime}+\beta y=0$
$x=0$ is a regular singular point
$y=x^{r}$
$\begin{aligned} L\left[x^{r}\right] &=x^{2}\left(x^{r}\right)^{\prime \prime}+\alpha x\left(x^{r}\right)^{\prime}+\beta x^{r} \\ &=x^{r}[r(r-1)+\alpha r+\beta] \end{aligned}$
**Real Distinct Roots** $y=c_{1} x^{r_{1}}+c_{2} x^{r_{2}}$
*Example* Solve
$$
2 x^{2} y^{\prime \prime}+3 x y^{\prime}-y=0, \quad x>0
$$
*Solution* $y=c_{1} x^{1 / 2}+c_{2} x^{-1}, \quad x>0$
**Equal Roots** $y=\left(c_{1}+c_{2} \ln x\right) x^{r_{1}}, \quad x>0$
*Example* Solve
$$
x^{2} y^{\prime \prime}+5 x y^{\prime}+4 y=0, \quad x>0
$$
*Solution* $y=x^{-2}\left(c_{1}+c_{2} \ln x\right), \quad x>0$
**Complex Roots** $y=c_{1} x^{\lambda} \cos (\mu \ln x)+c_{2} x^{\lambda} \sin (\mu \ln x), \quad x>0$
*Example* Solve
$$
x^{2} y^{\prime \prime}+x y^{\prime}+y=0
$$
*Solution* $y=c_{1} \cos (\ln x)+c_{2} \sin (\ln x), \quad x>0$
For solutions valid for $x<0$, replace $x$ with $-x$
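The three Euler cases can be verified directly with SymPy by substituting each stated general solution back into its equation:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
c1, c2 = sp.symbols('c1 c2')

# distinct real roots: 2x^2 y'' + 3x y' - y = 0  ->  y = c1 x^{1/2} + c2 x^{-1}
y = c1*sp.sqrt(x) + c2/x
assert sp.simplify(2*x**2*y.diff(x, 2) + 3*x*y.diff(x) - y) == 0

# equal roots: x^2 y'' + 5x y' + 4y = 0  ->  y = x^{-2}(c1 + c2 ln x)
y = (c1 + c2*sp.log(x))/x**2
assert sp.simplify(x**2*y.diff(x, 2) + 5*x*y.diff(x) + 4*y) == 0

# complex roots: x^2 y'' + x y' + y = 0  ->  y = c1 cos(ln x) + c2 sin(ln x)
y = c1*sp.cos(sp.log(x)) + c2*sp.sin(sp.log(x))
assert sp.simplify(x**2*y.diff(x, 2) + x*y.diff(x) + y) == 0
```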
### 5.6 Series Solutions near a Regular Singular Point, Part I
*Example* Solve the differential equation
$$
2 x^{2} y^{\prime \prime}-x y^{\prime}+(1+x) y=0
$$
*Solution* Suppose $y=\sum_{n=0}^{\infty} a_{n} x^{r+n}$
**Indicial equation** $(r-1)(2 r-1)=0$
$$
\begin{aligned} a_{n} &=-\frac{a_{n-1}}{2(r+n)^{2}-3(r+n)+1} \\ &=-\frac{a_{n-1}}{[(r+n)-1][2(r+n)-1]}, \quad n \geq 1 \end{aligned}
$$
$$
y_{1}(x)=x\left[1+\sum_{n=1}^{\infty} \frac{(-1)^{n} x^{n}}{[3 \cdot 5 \cdot 7 \cdots(2 n+1)] n !}\right], \quad x>0
$$
$$
y_{2}(x)=x^{1 / 2}\left[1+\sum_{n=1}^{\infty} \frac{(-1)^{n} x^{n}}{n ![1 \cdot 3 \cdot 5 \cdots(2 n-1)]}\right], \quad x>0
$$
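A numeric check of $y_1$: its coefficients $a_n=(-1)^n/\big([3\cdot5\cdots(2n+1)]\,n!\big)$ follow from the recurrence with $r=1$, and the truncated series nearly annihilates $2x^2y''-xy'+(1+x)y$.

```python
from math import factorial

# a_n = (-1)^n / ([3*5*...*(2n+1)] * n!) from the recurrence with r = 1
def coeff(n):
    odd = 1
    for k in range(1, n + 1):
        odd *= 2*k + 1
    return (-1)**n / (odd * factorial(n))

N = 20
c = [1.0] + [coeff(n) for n in range(1, N + 1)]   # y1 = sum c_n x^{n+1}
x = 0.4
y   = sum(c[n]*x**(n + 1) for n in range(N + 1))
yp  = sum((n + 1)*c[n]*x**n for n in range(N + 1))
ypp = sum((n + 1)*n*c[n]*x**(n - 1) for n in range(1, N + 1))
assert abs(2*x**2*ypp - x*yp + (1 + x)*y) < 1e-12
```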
### 5.7 Series Solutions near a Regular Singular Point, Part II
$$
L[y]=x^{2} y^{\prime \prime}+x[x p(x)] y^{\prime}+\left[x^{2} q(x)\right] y=0
$$
$$
x p(x)=\sum_{n=0}^{\infty} p_{n} x^{n}, \quad x^{2} q(x)=\sum_{n=0}^{\infty} q_{n} x^{n}
$$
**Indicial equation** $F(r)=r(r-1)+p_{0} r+q_{0}=0$
$$
p_{0}=\lim _{x \rightarrow 0} x p(x), \quad q_{0}=\lim _{x \rightarrow 0} x^{2} q(x)
$$
**exponents at the singularity** $r=r_1,r_2$
### 5.8 Bessel Equations
skip
## Chapter 6 The Laplace Transform
skip
## Chapter 7 Systems of First Order Linear Equations
### 7.4 Basic Theory of Systems of First Order Linear Equations
$$
\mathbf{x}^{\prime}=\mathbf{P}(t) \mathbf{x}+\mathbf{g}(t)
$$
*Wronskian* $W\left[\mathbf{x}^{(1)}, \ldots, \mathbf{x}^{(n)}\right](t)=\operatorname{det} \mathbf{X}(t)$
**Theorem** If $\mathbf{x}^{(1)}, \ldots, \mathbf{x}^{(n)}$ are solutions on the interval $\alpha<t<\beta$, then in this interval $W\left[\mathbf{x}^{(1)}, \ldots, \mathbf{x}^{(n)}\right]$ either is identically zero or else never vanishes.
### 7.5 Homogeneous Linear Systems with Constant Coefficients
$$
\mathbf{x}^{\prime}=\mathbf{A} \mathbf{x}
$$
Assume
$$
\mathbf{x}=\boldsymbol{\xi} e^{r t}
$$
$$
(\mathbf{A}-r \mathbf{I}) \boldsymbol{\xi}=\mathbf{0}
$$
*Example*
$$
\mathbf{x}^{\prime}=\left(\begin{array}{ll}{1} & {1} \\ {4} & {1}\end{array}\right) \mathbf{x}
$$
$r_1=3$, $r_2=-1$
$$
\mathbf{x}^{(1)}(t)=\left(\begin{array}{l}{1} \\ {2}\end{array}\right) e^{3 t}, \quad \mathbf{x}^{(2)}(t)=\left(\begin{array}{r}{1} \\ {-2}\end{array}\right) e^{-t}
$$
$$
W=-4e^{2t}
$$
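The eigenvalue computation and the Wronskian above can be confirmed numerically:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [4.0, 1.0]])
vals, vecs = np.linalg.eig(A)
assert np.allclose(sorted(vals), [-1.0, 3.0])

# x^{(1)}(t) = (1, 2)^T e^{3t} solves x' = A x since A (1,2)^T = 3 (1,2)^T
assert np.allclose(A @ np.array([1.0, 2.0]), 3*np.array([1.0, 2.0]))

# Wronskian: det of the fundamental matrix at any t equals -4 e^{2t}
t = 0.7
X = np.column_stack([np.array([1.0, 2.0])*np.exp(3*t),
                     np.array([1.0, -2.0])*np.exp(-t)])
assert np.isclose(np.linalg.det(X), -4*np.exp(2*t))
```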