Parameter finding prospectus
(modmesh) Parameter finding: an application that estimates the parameters of defined functions from numerical data. We use the maximum likelihood method for estimation, and finding the maximum/minimum is an optimization problem.
The log-maximum-likelihood method can be represented as the minimization of the negative log-likelihood,

$-\ln L(\theta) = -\sum_{i=1}^{N} \ln f(x_i; \theta),$

where $\{x_1, \ldots, x_N\}$ is a data sample and $\theta$ is the parameter vector.
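As a concrete illustration (a minimal sketch, assuming an exponential-decay PDF; the helper name is hypothetical), evaluating the negative log-likelihood on a sample:

```python
import numpy as np

def neg_log_likelihood(lam, x):
    # -ln L(lam) = -sum_i ln f(x_i; lam), with f(x; lam) = lam * exp(-lam * x)
    return -np.sum(np.log(lam) - lam * x)

rng = np.random.default_rng(0)
sample = rng.exponential(scale=2.0, size=1000)  # true lambda = 0.5
print(neg_log_likelihood(0.5, sample))
```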
Code information
Parameter finding is part of modmesh. The gradient calculator and the iterator/stopping parts will be implemented in C++. The plotting part will be implemented with PySide6. The source code of modmesh is here: https://github.com/solvcon/modmesh
Drawing plots of results
Iterator and stopping for minimization
Gradient calculator
Problem to solve
Optimization problems have many solution methods, and there is no universal method that can handle all situations. From our perspective, gradient methods are adopted because they cover many of the defined functions and extend to higher-dimensional data.
Since we use predefined PDFs, their first-order derivatives are also analytically defined. For example, radioactive decay follows an exponential decay, that is $f(t; \lambda) = \lambda e^{-\lambda t}$. Its derivative with respect to the parameter is $\partial f / \partial \lambda = (1 - \lambda t)\, e^{-\lambda t}$.
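That analytic derivative carries over to the negative log-likelihood. A quick sketch (hypothetical helpers) checking the closed-form gradient against a central finite difference:

```python
import numpy as np

def nll(lam, x):
    return -np.sum(np.log(lam) - lam * x)

def nll_grad(lam, x):
    # d/dlam of -(N ln(lam) - lam * sum(x)) = -N/lam + sum(x)
    return -x.size / lam + np.sum(x)

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=1000)
eps = 1e-6
numeric = (nll(0.5 + eps, x) - nll(0.5 - eps, x)) / (2 * eps)
print(nll_grad(0.5, x), numeric)  # the two values should agree closely
```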
Some gradient methods, such as Newton's method and conjugate directions, use the matrix of second derivatives (the Hessian matrix) and its inverse; at the minimum of a negative log-likelihood, the inverse Hessian estimates the covariance matrix of the fitted parameters.
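A small sketch of that Hessian-to-covariance relation for the one-parameter exponential fit (finite-difference second derivative; helper names hypothetical):

```python
import numpy as np

def nll(lam, x):
    return -np.sum(np.log(lam) - lam * x)

rng = np.random.default_rng(2)
x = rng.exponential(scale=2.0, size=1000)

lam_hat = x.size / np.sum(x)  # analytic MLE of the exponential rate
eps = 1e-5
hess = (nll(lam_hat + eps, x) - 2 * nll(lam_hat, x)
        + nll(lam_hat - eps, x)) / eps**2
var = 1.0 / hess              # inverse Hessian estimates the variance
print(lam_hat, np.sqrt(var))  # estimate and its standard error
```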
Here are some gradient-based methods used in scipy.optimize.minimize:
Unconstrained minimization
BFGS: the quasi-Newton method of Broyden, Fletcher, Goldfarb, and Shanno (BFGS). It uses first derivatives only and is the default method for unconstrained problems. Ref. 3, p. 136.
CG: a nonlinear conjugate gradient algorithm. Ref. 3, pp. 120-122.
Newton-CG: the truncated Newton method, suitable for large-scale problems. Ref. 3, p. 168.
trust-ncg: the Newton conjugate gradient trust-region algorithm for unconstrained minimization. This algorithm requires the gradient and either the Hessian or a function that computes the product of the Hessian with a given vector.
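A minimal usage sketch for the unconstrained case (hypothetical helper, not modmesh code): fitting the exponential rate with BFGS, optimizing $\ln\lambda$ so the rate stays positive without needing bounds.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
x = rng.exponential(scale=2.0, size=1000)  # true rate lambda = 0.5

def nll(theta):
    lam = np.exp(theta[0])  # optimize log(lambda) to keep lambda positive
    return -np.sum(np.log(lam) - lam * x)

res = minimize(nll, x0=[0.0], method="BFGS")
print(np.exp(res.x[0]))  # fitted lambda, should be near 0.5
```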
Bound-constrained minimization
L-BFGS-B: the L-BFGS-B algorithm; the default method when bounds are supplied. Ref.: the original Fortran 77 code and its paper. I have seen the Fortran code wrapped in C++.
Powell: a modification of Powell's method, a conjugate-direction method. Ref. 4, Section 10.7, "Direction Set (Powell's) Methods in Multidimensions".
MIGRAD: its strategy for finding the local minimum is simply to vary the set of parameters by small (variable-sized) steps in a direction which causes the function value $F$ to decrease, until it finds the point from which $F$ increases in all allowed directions. Although not needed, if the numerical values of the derivative at any point are known, they can be provided to MIGRAD to help in the minimization. (From the Belle 2 documentation.)
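And a bound-constrained counterpart, keeping $\lambda > 0$ directly with L-BFGS-B (same hypothetical setup as above):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
x = rng.exponential(scale=2.0, size=1000)

def nll(theta):
    lam = theta[0]
    return -np.sum(np.log(lam) - lam * x)

# the lower bound keeps the logarithm defined during the search
res = minimize(nll, x0=[1.0], method="L-BFGS-B", bounds=[(1e-9, None)])
print(res.x[0])  # near 0.5
```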
Prospective Users
Statistical analysts
System Architecture
C++ at the bottom, with a Python API on top.
On the coding side, I have little experience with software architecture.
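As a rough sketch only of the intended layering (all names here are hypothetical; neither `_modmesh_fit` nor these classes exist in modmesh):

```python
# Hypothetical layering: a compiled extension module would provide the
# numerical kernels (gradients, iteration, stopping), and this thin
# Python layer wraps them for users; plotting stays in PySide6.

try:
    import _modmesh_fit as core   # hypothetical C++ core module
except ImportError:
    core = None                   # keep the sketch importable without it

class ParameterFinder:
    """Python-facing API over the (future) C++ minimization core."""

    def __init__(self, model, method="BFGS"):
        self.model = model
        self.method = method

    def fit(self, data, p0):
        if core is None:
            raise RuntimeError("C++ core not built yet")
        return core.minimize(self.model, data, p0, method=self.method)

# Plotting would sit on top in pure Python (PySide6 / Qt widgets), e.g.
# a window drawing the fitted curve over a histogram of the data.
```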
Some objects that need to be implemented are listed below; a combined sketch follows the list.
Defined functions:
Gaussian function
Exponential function
Polynomial
Chebyshev polynomial
Poisson function
Crystal Ball function: the Crystal Ball function is a Gaussian with a tail on the low side (or both sides).
User-defined function: it should work the same way as a user-defined function in scipy.
A stack of the above functions (call it "ADD"); see the combined sketch after this list.
First derivative vector of defined functions
Hessian matrix of defined functions
Constraints (parameter bounds): for example, $-\infty < \mu < \infty$ and $\sigma > 0$ for the Gaussian function.
Log-likelihood value
Minimization methods
Iterator
Stopping criterion
Linear programming for constrained minimization
Plots
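A combined sketch of several items above (defined functions, their first-derivative vectors, the "ADD" stack, and the iterator/stopping part). All names are hypothetical; nothing here is existing modmesh code:

```python
import numpy as np

class Gaussian:
    nparam = 3  # amplitude, mu, sigma

    def value(self, x, p):
        a, mu, sigma = p
        return a * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    def grad(self, x, p):
        a, mu, sigma = p
        g = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
        return np.array([g,
                         a * g * (x - mu) / sigma**2,
                         a * g * (x - mu) ** 2 / sigma**3])

class Exponential:
    nparam = 2  # amplitude, rate

    def value(self, x, p):
        a, lam = p
        return a * np.exp(-lam * x)

    def grad(self, x, p):
        a, lam = p
        e = np.exp(-lam * x)
        return np.array([e, -a * x * e])

class Add:
    """The "ADD" stack: value is the sum, gradient the concatenation."""

    def __init__(self, *parts):
        self.parts = parts

    def value(self, x, p):
        out, i = 0.0, 0
        for f in self.parts:
            out = out + f.value(x, p[i:i + f.nparam])
            i += f.nparam
        return out

    def grad(self, x, p):
        segs, i = [], 0
        for f in self.parts:
            segs.append(f.grad(x, p[i:i + f.nparam]))
            i += f.nparam
        return np.concatenate(segs)

def minimize_gd(grad, p0, lr=0.1, tol=1e-6, max_iter=10000):
    """Plain gradient-descent iterator with a gradient-norm stopping rule."""
    p = np.asarray(p0, dtype=float)
    for it in range(max_iter):
        g = grad(p)
        if np.linalg.norm(g) < tol:  # stopping criterion
            break
        p = p - lr * g
    return p, it

# Usage: evaluate a Gaussian + exponential stack, then run the iterator
# on a simple quadratic to exercise the stopping rule.
model = Add(Gaussian(), Exponential())
xs = np.linspace(0.0, 5.0, 6)
print(model.value(xs, [1.0, 2.0, 0.5, 0.3, 1.5]))
print(minimize_gd(lambda p: 2 * p, [3.0, -2.0]))
```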
However, I found a good project based on MINUIT that provides a Python API: iminuit (see its homepage).
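A minimal iminuit usage sketch (assuming iminuit 2.x; the nll helper is the same hypothetical one as above):

```python
import numpy as np
from iminuit import Minuit

rng = np.random.default_rng(5)
x = rng.exponential(scale=2.0, size=1000)

def nll(lam):
    return -np.sum(np.log(lam) - lam * x)

m = Minuit(nll, lam=1.0)
m.errordef = Minuit.LIKELIHOOD   # 0.5 for a negative log-likelihood
m.limits["lam"] = (1e-9, None)   # parameter bound, as with MIGRAD above
m.migrad()                       # run the MIGRAD minimizer
print(m.values["lam"], m.errors["lam"])
```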
References
MINUIT Tutorial: "MINUIT is conceived as a tool to find the minimum value of a multi-parameter function and analyze the shape of the function around the minimum."
Ref. 3: Numerical Optimization, Jorge Nocedal and Stephen J. Wright (book; PDF available online).
Ref. 4: Numerical Recipes, William H. Press, Saul A. Teukolsky, William T. Vetterling, and Brian P. Flannery (book; available online).