# Material for 'Parameter estimation for ODE models'
The main notebook and data are available at:
[github.com/SteffenPL/SeminarParameterEstimation](https://github.com/SteffenPL/SeminarParameterEstimation).
We will consider the following models:
## _Lotka-Volterra Predator-Prey_ model
$$
\begin{aligned}
\frac{\mathrm{d} x}{\mathrm{d} t} &= \alpha x - \beta x y, \newline
\frac{\mathrm{d} y}{\mathrm{d} t} &= -\gamma y + \delta xy
\end{aligned}
$$
with $x(0) = 1, y(0) = 1$ for $t \in [0,10]$ and $\alpha = \frac{3}{2}, \beta = 1, \gamma = 3, \delta = 1$.
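This model can be simulated directly. A minimal sketch with the parameters above (the function name `lotka_volterra!` is our choice):

```julia
using OrdinaryDiffEq, Plots

# Lotka-Volterra right-hand side; u = (x, y), p = (α, β, γ, δ)
function lotka_volterra!(du, u, p, t)
    x, y = u
    α, β, γ, δ = p
    du[1] = α * x - β * x * y
    du[2] = -γ * y + δ * x * y
end

u0 = [1.0, 1.0]
tspan = (0.0, 10.0)
p = [1.5, 1.0, 3.0, 1.0]   # (α, β, γ, δ)
prob = ODEProblem(lotka_volterra!, u0, tspan, p)
sol = solve(prob, Tsit5())
plot(sol, labels = ["prey x" "predator y"])
```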
### Datasets
- `data_1.csv`
- `data_2.csv`
- `data_3.csv`
- `data_4.csv`
### Tasks:
- Try to estimate the correct parameters for all four datasets.
- Try different initial values for the parameters. What do you notice?
- Plot a heatmap of the cost function.
Bonus:
- If you are interested in Bayesian inference, follow this tutorial:
https://turing.ml/dev/tutorials/10-bayesian-differential-equations/
## _FitzHugh–Nagumo_ neuron spike model
We consider the model
$$
\begin{aligned}
\frac{\mathrm{d} v}{\mathrm{d} t} &= \frac{1}{\varepsilon} ( f(v,\alpha) - w + I_{\mathrm{app}} ), \newline
\frac{\mathrm{d} w}{\mathrm{d} t} &= -v + \gamma w
\end{aligned}
$$
with $v(0) = 1, w(0) = 0$ and $t \in [0,10]$. The parameters are $\varepsilon = 0.01$, $\alpha = 0.2$, $\gamma = 0.3$, and $I_{\mathrm{app}} = 0.8$, where
$$
f(v,\alpha) = v (1-v) (v - \alpha).
$$
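A simulation sketch of this model (the function name `fhn!` is our choice; the applied current is assumed to enter additively, as in the standard FitzHugh–Nagumo form):

```julia
using OrdinaryDiffEq

f(v, α) = v * (1 - v) * (v - α)

# FitzHugh-Nagumo right-hand side; u = (v, w), p = (ε, α, γ, I_app)
function fhn!(du, u, p, t)
    v, w = u
    ε, α, γ, I_app = p
    du[1] = (f(v, α) - w + I_app) / ε   # applied current assumed additive
    du[2] = -v + γ * w
end

prob = ODEProblem(fhn!, [1.0, 0.0], (0.0, 10.0), [0.01, 0.2, 0.3, 0.8])
sol = solve(prob, Tsit5())   # small ε makes the dynamics fast; adaptive stepping helps
```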
### Datasets
- `spikes.csv` (real data)
### Tasks:
- First, try an $L^2$ loss function. What do you observe?
- Modify the code to keep the parameter $\varepsilon$ constant.
- Develop ideas for a new `loss` function to improve the results.
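For the second task, one option is to optimise only the remaining parameters and reinsert the constant $\varepsilon$ when building the ODE parameter vector. A sketch, reusing the `convert_to_ode_params` helper from the code snippets below:

```julia
# Optimise only (α, γ, I_app) and keep ε = 0.01 fixed:
# prepend the constant before passing the parameters to the ODE solver.
function convert_to_ode_params(opt_params)
    ε = 0.01                 # held constant, not optimised
    return [ε; opt_params]   # ode_params = (ε, α, γ, I_app)
end
```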
---
# Useful code snippets
## Defining an ODE model
```julia
using OrdinaryDiffEq, Plots

# In-place right-hand side: fills `du` with the derivatives
function rhs!(du, u, p, t)
    du[1] = -p[1] * u[2]
    du[2] = p[2] * u[1]
end

u0 = [1.0, 0.0]        # initial condition
tspan = (0.0, 10.0)    # time span as a tuple
p = [1.0, 1.0]         # parameters
prob = ODEProblem(rhs!, u0, tspan, p)
sol = solve(prob, Tsit5())
plot(sol)
```
## Loading a CSV dataset
```julia
using CSV, DataFrames
df = CSV.read("data_1.csv", DataFrame)
df.t # access column 't'
df.x # access column 'x'
df.y # access column 'y'
```
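It can help to round-trip a small table once to see how CSV.jl and DataFrames.jl interact. A sketch with a hypothetical mini-dataset using the same column names as `data_1.csv`:

```julia
using CSV, DataFrames

# Hypothetical mini-dataset with columns t, x, y
df0 = DataFrame(t = [0.0, 0.5, 1.0], x = [1.0, 1.2, 1.5], y = [1.0, 0.8, 0.5])
CSV.write("mini.csv", df0)

df = CSV.read("mini.csv", DataFrame)
df.t == df0.t   # columns survive the round trip
```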
## Defining an $L^2$ cost function (without bounds)
```julia
using Optimization, OptimizationOptimJL, DiffEqParamEstim
# `build_loss_objective` and `L2Loss` come from DiffEqParamEstim.jl
cost_function = build_loss_objective(prob, Tsit5(),
    L2Loss(df.t, [df.x'; df.y']),
    Optimization.AutoForwardDiff(),
    maxiters = 10000)
p_init = [1.0, 1.0, 2.0, 1.0]
optprob = OptimizationProblem(cost_function, p_init)
optsol = solve(optprob, BFGS())
```
## Defining an $L^2$ cost function (with bounds)
```julia
using Optimization, OptimizationOptimJL, OptimizationMultistartOptimization, DiffEqParamEstim
cost_function = build_loss_objective(prob, Tsit5(),
    L2Loss(df.t, [df.x'; df.y']),
    Optimization.AutoForwardDiff(),
    maxiters = 10000)
p_init = [1.0, 1.0, 2.0, 1.0]
lb = [0.0, 0.0, 0.0, 0.0]
ub = [3.0, 3.0, 4.0, 4.0]
optprob = OptimizationProblem(cost_function, p_init, lb = lb, ub = ub)
optsol = solve(optprob, MultistartOptimization.TikTak(100), LBFGS())
```
## Plot heatmap of `cost_function`
Given a model with four parameters, we plot the cost for $0 < p_1, p_2 < 2$ with $p_3 = p_4 = 1$ held fixed.
```julia
P1 = LinRange(0.0, 2.0, 60)
P2 = LinRange(0.0, 2.0, 60)
costs = [cost_function([p1, p2, 1.0, 1.0]) for p1 in P1, p2 in P2]
heatmap(P1, P2, costs')   # transpose: heatmap's z is indexed as (y, x)
title!("Cost as a function of p₁ and p₂")
```
## Define custom loss function
Here, we only measure the distance from the data to the first component of the solution `sol[1,:]`.
```julia
using CSV, DataFrames, LinearAlgebra, Optimization, OptimizationOptimJL, OrdinaryDiffEq
# Distance between model output and data: only the first
# component of the solution is compared to the data column `x`.
function compute_distance(sol, data)
    return norm(sol[1, :] - data.x)
end

function loss(opt_params, data)
    # solve the ODE with the candidate parameters
    ode_params = convert_to_ode_params(opt_params)
    sol = solve(prob, Tsit5(), p = ode_params, saveat = data.t)

    # penalise parameters for which the solver fails
    if sol.retcode != ReturnCode.Success
        return Inf64
    end

    # compute the distance to the data
    return compute_distance(sol, data)
end

# Map the optimisation variables to the ODE parameters
# (identity here; adapt if some parameters are fixed or transformed)
function convert_to_ode_params(opt_params)
    return opt_params
end
df = CSV.read("data.csv", DataFrame)
adtype = Optimization.AutoForwardDiff()
optfnc = Optimization.OptimizationFunction((x, p) -> loss(x, df), adtype)
p_init = [0.01, 0.01, 0.1]
optprob = Optimization.OptimizationProblem(optfnc, p_init,
    lb = [0.0, 0.0, 0.0],
    ub = [0.5, 0.5, 1.0])
optsol = solve(optprob, Fminbox(BFGS()))   # Fminbox enforces the box constraints
```
---
# Preparation for 'Parameter estimation for ODE models'
In this lecture, we will learn how to fit ODE models to data.
## Reading material
If you are interested in practical aspects of parameter optimisation, then I recommend this article:
- [Lessons Learned from Quantitative Dynamical Modeling in Systems Biology](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0074335).
A more detailed textbook is:
- [A Guide to Numerical Modelling in Systems Biology](https://link-springer-com.kyoto-u.idm.oclc.org/book/10.1007/978-3-319-20059-0) (see **Chapter 3, Sections 3.4 and 3.5**)
## Requirements for attending the course
1. You need a laptop with a working installation of **Julia** (or **Python**). I recommend using Julia.
2. You should know how to solve simple ordinary differential equations with [DifferentialEquations.jl](https://docs.sciml.ai/DiffEqDocs/stable/getting_started/#Example-1-:-Solving-Scalar-Equations) (or [scipy.odeint](https://docs.scipy.org/doc/scipy/reference/generated/scipy.integrate.odeint.html)).
You can find installation instructions under the following links:
- **Julia**: [Modern Julia (Installation Instructions)](https://modernjuliaworkflows.github.io/pages/writing/writing/#installation)
- Python: [Anaconda (Installation Instructions)](https://docs.anaconda.com/free/anaconda/install/index.html)
As an editor I recommend [Visual Studio Code](https://code.visualstudio.com/).
## Task
1. Please prepare yourself by simulating the famous _Lotka-Volterra Predator-Prey_ model
$$
\begin{aligned}
\frac{\mathrm{d} x}{\mathrm{d} t} &= \alpha x - \beta x y, \newline
\frac{\mathrm{d} y}{\mathrm{d} t} &= -\gamma y + \delta xy
\end{aligned}
$$
with $x(0) = 1, y(0) = 1$ for $t \in [0,10]$ and $\alpha = \frac{3}{2}, \beta = 1, \gamma = 3, \delta = 1$.
If you are looking for hints, you can follow this tutorial: [SciML: Getting started](https://docs.sciml.ai/Overview/stable/getting_started/first_simulation/)
(or in Python: [SciPy: Lotka-Volterra equations](https://scientific-python.readthedocs.io/en/latest/notebooks_rst/3_Ordinary_Differential_Equations/02_Examples/Lotka_Volterra_model.html)).
2. Please read up on or recall **[Newton's method](https://en.wikipedia.org/wiki/Newton%27s_method)** for optimisation problems such as $\min f(x)$ for $f: \mathbb{R}^n \to \mathbb{R}$.
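For optimisation, Newton's method iterates $x_{k+1} = x_k - (\nabla^2 f(x_k))^{-1} \nabla f(x_k)$. A minimal sketch with a hypothetical quadratic test function whose gradient and Hessian are hand-coded:

```julia
using LinearAlgebra

# Hypothetical test function with minimiser (1, -0.5)
f(x) = (x[1] - 1)^2 + 2 * (x[2] + 0.5)^2
grad(x) = [2 * (x[1] - 1), 4 * (x[2] + 0.5)]
hess(x) = [2.0 0.0; 0.0 4.0]

function newton(x0; tol = 1e-10, maxiter = 50)
    x = copy(x0)
    for _ in 1:maxiter
        g = grad(x)
        norm(g) < tol && break
        x -= hess(x) \ g   # Newton step: solve H Δx = ∇f
    end
    return x
end

newton([0.0, 0.0])   # for a quadratic, one Newton step reaches [1.0, -0.5]
```

Since $f$ is quadratic, the very first Newton step lands exactly on the minimiser.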