
Parameter finding proposal

Parameter finding in modmesh: an application that estimates the parameters of defined functions from numerical data. We use the maximum likelihood method for estimation, and finding the maximum/minimum is an optimization problem.

The log-maximum-likelihood method can be represented as minimizing the negative log-likelihood

$$F(\boldsymbol{\theta}) = -\sum_{k=1}^{K} \ln f(x = x_k;\, \boldsymbol{\theta}),$$

where $x_k$ is a data sample and $\boldsymbol{\theta}$ is the parameter vector.
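
As a concrete illustration, here is a minimal Python sketch of this minimization using scipy; the exponential sample and the variable names are made up for the example:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import expon

# Synthetic data sample x_1..x_K drawn from an exponential distribution.
rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=1000)

def negative_log_likelihood(theta):
    """F(theta) = -sum_k ln f(x_k; theta) for an exponential PDF."""
    (scale,) = theta
    if scale <= 0.0:
        return np.inf  # keep the optimizer inside the valid domain
    return -np.sum(expon.logpdf(data, scale=scale))

result = minimize(negative_log_likelihood, x0=[1.0], method="Nelder-Mead")
print(result.x)  # estimated scale parameter, close to the true value 2.0
```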

Code information

Parameter finding is part of modmesh. The gradient calculator and the iterator/stopping parts will be implemented in C++, and plotting will be implemented with PySide6.
The source code of modmesh is here:
https://github.com/solvcon/modmesh

  1. Drawing plots of the results
  2. Iterator and stopping criteria for minimization
  3. Gradient calculator

Problem to solve

Optimization problems have many solution methods, and no universal method can handle all situations. From our perspective, gradient methods are a good choice because they cover many of the defined functions and extend to higher-dimensional data.

Since we use defined PDFs, their first-order derivatives are also analytically defined. For example, radioactive decay follows an exponential law,

$$N(t) = a e^{-\lambda t}.$$

Its derivative with respect to the parameter $\lambda$ is

$$\frac{\partial N(t)}{\partial \lambda} = -a\, t\, e^{-\lambda t}.$$
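
To make the analytic derivative concrete, here is a small sketch (illustrative names only) that cross-checks it against a central finite difference:

```python
import numpy as np

def N(t, a, lam):
    """Exponential decay N(t) = a * exp(-lam * t)."""
    return a * np.exp(-lam * t)

def dN_dlam(t, a, lam):
    """Analytic derivative of N with respect to the parameter lam."""
    return -a * t * np.exp(-lam * t)

t, a, lam, h = 1.5, 3.0, 0.7, 1e-6
numeric = (N(t, a, lam + h) - N(t, a, lam - h)) / (2.0 * h)
print(dN_dlam(t, a, lam), numeric)  # the two values agree closely
```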

Some gradient methods, such as Newton's method and conjugate directions, use the second-derivative matrix $G$ (the Hessian matrix) and its inverse $V$ (the covariance matrix).
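
For illustration, once the minimum of $F$ is found, the parameter covariance follows by inverting the Hessian there; a minimal numpy sketch with made-up Hessian values for a two-parameter fit:

```python
import numpy as np

# Hessian G of the negative log-likelihood at the minimum
# (made-up values for a two-parameter fit, e.g. mu and sigma).
G = np.array([[400.0,  10.0],
              [ 10.0, 800.0]])

V = np.linalg.inv(G)            # covariance matrix V = G^{-1}
errors = np.sqrt(np.diag(V))    # 1-sigma parameter uncertainties
print(V, errors)
```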

Here are some gradient methods that are used in scipy.optimize.minimize (a usage sketch follows the list):

  • Unconstrained minimization

    • BFGS: the quasi-Newton method of Broyden, Fletcher, Goldfarb, and Shanno (BFGS). It uses first derivatives only and is the default method. Ref. 3, p. 136.
    • CG: a nonlinear conjugate gradient algorithm. Ref. 3, pp. 120-122.
    • Newton-CG: the truncated Newton method, suitable for large-scale problems. Ref. 3, p. 168.
    • trust-ncg: the Newton conjugate gradient trust-region algorithm for unconstrained minimization. This algorithm requires the gradient and either the Hessian or a function that computes the product of the Hessian with a given vector.
  • Bound-constrained minimization

    • L-BFGS-B: the L-BFGS-B algorithm; the default method when bounds are supplied. Ref. the Fortran 77 code and paper; I saw someone wrap the Fortran code with C++ here.
    • Powell: a modification of Powell's method, which is a conjugate direction method. Ref. 4, Section 10.7, "Direction Set (Powell's) Methods in Multidimensions".
  • MIGRAD: its strategy for finding the local minimum is simply to vary the set of parameters $p$ by small (variable-sized) steps in a direction that causes $F$ to decrease, until it finds the point $\hat{p}$ from which $F$ increases in all allowed directions. Although not needed, if the numerical values of the derivative $\partial F(p)/\partial p$ at any point $p$ are known, they can be provided to MIGRAD to help in the minimization. (From the Belle II documentation.)
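
A minimal sketch of selecting these methods through scipy.optimize.minimize, using the Rosenbrock test function and its gradient that ship with scipy:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])

# Unconstrained minimization with BFGS, supplying the analytic gradient.
res_bfgs = minimize(rosen, x0, method="BFGS", jac=rosen_der)

# Bound-constrained minimization with L-BFGS-B.
bounds = [(0.0, 2.0)] * len(x0)
res_lbfgsb = minimize(rosen, x0, method="L-BFGS-B", jac=rosen_der,
                      bounds=bounds)

print(res_bfgs.x, res_lbfgsb.x)  # both converge to [1, 1, 1, 1, 1]
```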

Prospective Users

Statistical analysts (users who fit defined functions to numerical data).

System Architecture

C++ at the bottom.
Python API on top.

On the coding side, I have little experience with programming architecture.
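
A rough sketch of how the split might look from the Python side. The module name `_parameter_core` is a hypothetical placeholder, not an existing modmesh API; the idea is that the hot loop lives in a compiled C++ extension and Python falls back to a pure-Python path when the extension is absent:

```python
import numpy as np

try:
    # Hypothetical compiled extension (C++ core); the module name is a
    # placeholder for this sketch, not an existing modmesh module.
    from _parameter_core import gradient as _gradient_cxx
    HAVE_CXX = True
except ImportError:
    HAVE_CXX = False

def gradient(f, theta, h=1e-6):
    """Central-difference gradient; delegates to the C++ core if present."""
    if HAVE_CXX:
        return _gradient_cxx(f, theta, h)
    theta = np.asarray(theta, dtype=float)
    g = np.empty_like(theta)
    for i in range(theta.size):
        step = np.zeros_like(theta)
        step[i] = h
        g[i] = (f(theta + step) - f(theta - step)) / (2.0 * h)
    return g

print(gradient(lambda p: p[0] ** 2 + 3.0 * p[1] ** 2, [1.0, 2.0]))  # ~[2, 12]
```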

Some objects that need to be implemented are listed below (a sketch of a possible layout follows the list):

  • Defined functions:
    • Gaussian function
    • Exponential function
    • Polynomial
    • Chebyshev polynomial
    • Poisson function
    • Crystal Ball function: the Crystal Ball function is a Gaussian with a tail on the low side (or on both sides).
    • User-defined function: it should work the same way as a user-defined function in scipy.
    • Stack-up of the above functions (call it "ADD")
  • First derivative vector of defined functions
  • Hessian matrix of defined functions
  • Constraints (parameter bounds): for example, $0 < \mu < 20.0$ and $0.5 < \sigma < 3.0$ for the Gaussian function
  • Log-likelihood value
  • Minimization methods
    • Iterator
    • Stop
    • Linear programming for constrained minimization
  • Plots
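
Here is one possible layout for these objects as a Python sketch; all class names are illustrative, not a settled design:

```python
import numpy as np

class DefinedFunction:
    """Base class: a model function with analytic first derivatives."""

    def value(self, x, theta):
        raise NotImplementedError

    def gradient(self, x, theta):
        raise NotImplementedError

class Gaussian(DefinedFunction):
    # theta = (mu, sigma); bounds such as 0 < mu < 20.0 and
    # 0.5 < sigma < 3.0 would be stored alongside for the minimizer.
    def value(self, x, theta):
        mu, sigma = theta
        z = (x - mu) / sigma
        return np.exp(-0.5 * z * z) / (sigma * np.sqrt(2.0 * np.pi))

    def gradient(self, x, theta):
        mu, sigma = theta
        v = self.value(x, theta)
        z = (x - mu) / sigma
        return np.array([v * z / sigma,                # d/d(mu)
                         v * (z * z - 1.0) / sigma])   # d/d(sigma)

def nll(func, data, theta):
    """Log-likelihood value object: F(theta) = -sum_k ln f(x_k; theta)."""
    return -np.sum(np.log(func.value(data, theta)))

print(nll(Gaussian(), np.array([1.0, 1.5, 2.0]), (1.5, 0.8)))
```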

However, I found a good project that is based on MINUIT and provides a Python API: see the iminuit homepage.
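
For reference, a minimal sketch of fitting with iminuit (API as of iminuit 2.x; consult the iminuit documentation for the authoritative interface):

```python
import numpy as np
from iminuit import Minuit

rng = np.random.default_rng(1)
data = rng.normal(loc=1.0, scale=2.0, size=1000)

def nll(mu, sigma):
    """Negative log-likelihood of a Gaussian sample (constants dropped)."""
    z = (data - mu) / sigma
    return np.sum(0.5 * z * z + np.log(sigma))

m = Minuit(nll, mu=0.0, sigma=1.0)
m.errordef = Minuit.LIKELIHOOD    # 0.5 for a negative log-likelihood
m.limits["sigma"] = (0.01, None)  # parameter bound
m.migrad()                        # run the MIGRAD minimizer
print(m.values, m.errors)
```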

Extension

  • Chi-square methods (see the note below)
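
For reference, the chi-square objective typically takes the form

$$\chi^2(\boldsymbol{\theta}) = \sum_{k=1}^{K} \frac{\bigl(y_k - f(x_k;\, \boldsymbol{\theta})\bigr)^2}{\sigma_k^2},$$

where $y_k$ are the measured values with uncertainties $\sigma_k$; it can be minimized with the same machinery as the negative log-likelihood.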

Reference

  1. scipy.optimize.minimize: minimization of a scalar function of one or more variables.
  2. MINUIT Tutorial: MINUIT is conceived as a tool to find the minimum value of a multi-parameter function and analyze the shape of the function around the minimum.
  3. Numerical Optimization, Jorge Nocedal and Stephen J. Wright (book; PDF available online).
  4. Numerical Recipes, William H. Press, Saul A. Teukolsky, William T. Vetterling, and Brian P. Flannery (book; available online).