# Notes on equilibrium and stability of dynamical systems
#### Author: [Sharath](https://sharathraparthy.github.io/)
## What is a dynamical system?
A dynamical system is any system that evolves through time according to a set of rules. Using dynamical systems we can study the long-term behavior of an evolving process.
Formally, it is a triplet $(X, T, \phi)$ where $X$ denotes the state space, $T$ denotes the time space and $\phi: X \times T \rightarrow X$ is the flow (this is the rule that governs the evolution).
There are a few properties of the flow (checked numerically in the sketch after this list):
1. Identity: $\phi(x, 0) = x$
1. Principle of compositionality: $\phi(\phi(x, t), s) = \phi(x, t+s)$
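As a quick sanity check, here is a minimal Python sketch (not part of the original notes) that verifies both properties for the explicit flow $\phi(x, t) = x e^{-t}$ of the scalar ODE $\dot{x} = -x$; the choice of ODE is purely an illustrative assumption.

```python
import numpy as np

def phi(x, t):
    """Flow of the scalar ODE x' = -x (illustrative example)."""
    return x * np.exp(-t)

x0, t, s = 2.0, 0.7, 1.3

# Property 1: phi(x, 0) = x (identity at time zero)
assert np.isclose(phi(x0, 0.0), x0)

# Property 2: phi(phi(x, t), s) = phi(x, t + s) (compositionality)
assert np.isclose(phi(phi(x0, t), s), phi(x0, t + s))
print("both flow properties hold for phi(x, t) = x * exp(-t)")
```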
There are two kinds of system evolution we can think of depending on the temporal domain: discrete or continuous.
**Discrete-time dynamical system:** A discrete-time dynamical system consists of a non-empty set $X$ and a map $f: X \rightarrow X$. For $n \in \mathbb{N}$, the $n^{th}$ iterate of $f$ is the $n$-fold composition $f^n = f \circ \cdots \circ f$.
**Continuous-time dynamical system:**
In this formulation, the set of possible times is dense: time takes values in the real numbers $\mathbb{R}$. The solutions are therefore continuous curves rather than sequences of points (between any two times we can always find another time $t \in \mathbb{R}$).
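To make the distinction concrete, here is a minimal sketch of both kinds of evolution. The logistic map for the discrete case and logistic growth for the continuous case are my illustrative choices, not examples from the notes, and `scipy` is assumed to be available.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Discrete time: the n-th iterate of the logistic map f(x) = r*x*(1 - x).
def f(x, r=2.5):
    return r * x * (1.0 - x)

x = 0.1
for _ in range(50):          # f^50(x0): 50-fold composition of f
    x = f(x)
print("discrete iterate f^50(0.1) =", x)

# Continuous time: t ranges over the reals, so we integrate an ODE,
# here x' = x*(1 - x) (logistic growth), with an ODE solver.
sol = solve_ivp(lambda t, x: x * (1.0 - x), t_span=(0.0, 10.0),
                y0=[0.1], dense_output=True)
print("continuous solution at t = 3.7:", sol.sol(3.7)[0])
```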
## Notion of equilibrium
For the rest of these notes we will only consider continuous-time dynamics; however, the results can be ported to the discrete case as well.
Let's consider the following ODE:
$$
\dot{x} = f(x), \hspace{5mm} x \in \mathbb{R}^n
$$
Solving this equation would give us a dynamical system whose flow is $\phi(x, t)$.
We will define an equilibrium solution, or fixed point, as a "root" of the vector field: $f(\tilde{x}) = 0$. This implies that the derivative at $\tilde{x}$ is zero, which in turn means there is no motion.
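Numerically, finding equilibria therefore amounts to root-finding on $f$. Below is a minimal sketch, assuming a damped pendulum as the example vector field (my choice, not one from the notes) and using `scipy.optimize.fsolve`.

```python
import numpy as np
from scipy.optimize import fsolve

# Illustrative vector field: damped pendulum written as a 2D system
#   x1' = x2,  x2' = -sin(x1) - 0.5*x2
def f(x):
    x1, x2 = x
    return [x2, -np.sin(x1) - 0.5 * x2]

# Root-find from two different initial guesses; the equilibria are
# (k*pi, 0): pendulum hanging down (k even) or inverted (k odd).
for guess in ([0.5, 0.5], [3.0, 0.0]):
    x_eq = fsolve(f, guess)
    print("equilibrium near", guess, "->", np.round(x_eq, 6))
```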
## Stability of the system
There is a notion of stability when we talk about dynamical systems. In general it says: "if we start close enough, we will stay close enough."
There is also a notion of *asymptotic stability*, which is quite strong. The intuition is: "if we start close enough, we will be attracted to $\tilde{x}$."
Now we will generalize and formalize these ideas to general solutions $\tilde{x}(t)$.
**Lyapunov stability**: $\tilde{x}(t)$ is said to be stable (or Lyapunov stable) if, given $\epsilon > 0$, there exists a $\delta = \delta(\epsilon) > 0$ such that for any other solution $y(t)$ satisfying $|\tilde{x}(t_0) - y(t_0)| < \delta$, we have $|\tilde{x}(t) - y(t)| < \epsilon$ for all $t > t_0$.
This definition basically says that if we start within distance $\delta$ of the initial condition at $t_0$ ($|\tilde{x}(t_0) - y(t_0)| < \delta$), then we remain, for all $t > t_0$, in the $\epsilon$-tube or $\epsilon$-neighborhood of the solution ($|\tilde{x}(t) - y(t)| < \epsilon$). This is shown in the following picture.
![](https://i.imgur.com/vngtn4r.png)
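A minimal numerical illustration of this definition (my example, not from the notes): the undamped oscillator $\ddot{x} = -x$ is Lyapunov stable at the origin, so the separation between two $\delta$-close solutions stays inside an $\epsilon$-tube but never decays.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Undamped oscillator x'' = -x as a 2D system: Lyapunov stable
# but not asymptotically stable (illustrative assumption).
rhs = lambda t, z: [z[1], -z[0]]
t = np.linspace(0.0, 50.0, 2000)

ref = solve_ivp(rhs, (0, 50), [1.0, 0.0], t_eval=t).y     # reference solution
pert = solve_ivp(rhs, (0, 50), [1.01, 0.0], t_eval=t).y   # delta-close start

sep = np.linalg.norm(ref - pert, axis=0)
print("max separation:", sep.max())   # stays ~0.01: inside the epsilon tube
print("final separation:", sep[-1])   # does not shrink to zero
```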
Now let's also try to define asymptotic stability formally.
**Asymptotic stability**: $\tilde{x}(t)$ is said to be asymptotically stable if it is Lyapunov stable and there exists a constant $b > 0$ such that, for any other solution $y(t)$, if $|\tilde{x}(t_0) - y(t_0)| < b$ then $\lim_{t \rightarrow \infty}|\tilde{x}(t) - y(t)| = 0$.
![](https://i.imgur.com/t6yMGui.png)
This intuitively means that if we start within the $b$-neighborhood, then in the limit the distance between $\tilde{x}(t)$ and $y(t)$ shrinks to $0$. If we observe carefully, we also require the solution to be Lyapunov stable. The reason is that there are some weird solutions where we can go arbitrarily far away before actually converging asymptotically. These arise, for example, near "homoclinic orbits", and in such cases it is difficult to guarantee the boundedness of the solution even though the condition $\lim _{t \rightarrow \infty}|\tilde{x}(t) - y(t)| = 0$ holds. Hence, for asymptotic stability, we also require Lyapunov stability.
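Continuing the previous sketch, adding damping (again an illustrative assumption, not an example from the notes) makes the origin asymptotically stable: the separation now shrinks to zero.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Damped oscillator x'' = -x - 0.3*x': nearby solutions not only stay
# close (Lyapunov stability) but are attracted to each other in the limit.
rhs = lambda t, z: [z[1], -z[0] - 0.3 * z[1]]
t = np.linspace(0.0, 60.0, 2000)

ref = solve_ivp(rhs, (0, 60), [1.0, 0.0], t_eval=t).y
pert = solve_ivp(rhs, (0, 60), [1.01, 0.05], t_eval=t).y

sep = np.linalg.norm(ref - pert, axis=0)
print("initial separation:", sep[0])    # within the b-neighborhood
print("final separation:", sep[-1])     # -> 0 as t grows
```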
## Orbital stability vs. trajectory stability
In dynamical systems, when we talk about stability, we often talk either in terms of orbits or trajectories. The distinction is subtle but important. So, let's try to define these:
1. **Trajectory:** A trajectory is a solution to a differential equation and is "parameterized" by time $t$.
1. **Orbit:** An orbit is the set of all points visited by a trajectory: $O(x_0) = \{x \in X \mid x = \phi(x_0, t) \text{ for some } t \in T \}$. It is not parameterized by time.
The notions of stability discussed above are defined with trajectories in mind. This is because the comparison is made at a given time $t$, and we check whether the norm $|\tilde{x}(t) - y(t)|$ either lies in the $\epsilon$-neighborhood or asymptotically converges to zero. Orbital stability is widely used in scenarios where time is "stretched" or "squeezed"; it is more applicable there because we remove the dependence on $t$. In other words, if we don't care about *when* we reach a particular point, then orbital stability makes more sense. Let's formally define it.
To define orbital stability, we first need to define the "distance" with respect to a set.
**Distance to a set:** We define the distance between a point $p \in X$ and a set $S \subset X$ by $d(p, S) = \inf_{x \in S} |p - x|$
This is basically the smallest distance between the point $p$ and any point in the set $S$.
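As a small sketch (the unit circle as $S$ is my illustrative assumption), the infimum can be approximated by a minimum over a finite sample of the set.

```python
import numpy as np

# d(p, S) = inf over x in S of |p - x|, approximated by a minimum
# over a finite sample of S (here, points on the unit circle).
theta = np.linspace(0.0, 2.0 * np.pi, 1000)
S = np.stack([np.cos(theta), np.sin(theta)], axis=1)   # sampled set S

p = np.array([2.0, 0.0])
d = np.min(np.linalg.norm(S - p, axis=1))
print("d(p, S) ~", d)   # ~1.0: the closest circle point to (2, 0) is (1, 0)
```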
**Orbital stability:** A trajectory $\tilde{x}(t)$ is said to be orbitally stable if, given $\epsilon > 0$, there exists a $\delta = \delta(\epsilon) > 0$ such that for any other solution $y(t)$ satisfying $|\tilde{x}(t_0) - y(t_0)| < \delta$, we have $d(y(t), O^{+}(x_0, t_0)) < \epsilon$ for all $t > t_0$, where $O^{+}(x_0, t_0)$ is the positive semi-orbit of $\tilde{x}$ starting from $x_0$ at $t_0$.
We can define asymptotic orbital stability analogously to asymptotic stability, where the "set" distance asymptotically shrinks to zero.
Trajectory stability is the stronger requirement, whereas orbital stability is the weaker one.
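To see the difference concretely, here is a sketch of a classic kind of example (the specific system $\dot{r} = 0$, $\dot{\theta} = r$ in polar coordinates is my illustrative choice, not one from the notes): every orbit is a circle, but the angular speed depends on the radius, so nearby solutions stay on nearby circles while drifting apart in phase.

```python
import numpy as np

# Illustrative system: r' = 0, theta' = r. Every orbit is a circle of
# radius r0, traversed at angular speed r0, so nearby circles are
# traversed at slightly different rates.
def solution(r0, t):
    theta = r0 * t                      # theta(t) = r0 * t, r(t) = r0
    return np.stack([r0 * np.cos(theta), r0 * np.sin(theta)])

t = np.linspace(0.0, 1000.0, 20000)
x = solution(1.00, t)                   # reference trajectory
y = solution(1.01, t)                   # starts delta-close to x

# Pointwise (trajectory) distance grows to ~2 as the phases drift apart,
# so the trajectory is not Lyapunov stable ...
print("max |x(t) - y(t)|:", np.linalg.norm(x - y, axis=0).max())

# ... yet y stays within 0.01 of the *orbit* of x (the unit circle),
# so the motion is orbitally stable.
dist_to_orbit = np.abs(np.linalg.norm(y, axis=0) - 1.0)
print("max d(y(t), O(x0)):", dist_to_orbit.max())
```

The phase drift is what trajectory stability penalizes and orbital stability forgives: we compare against the whole set of visited points rather than against the position at the same instant.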