# Abstract base classes
Optimistix is fully extendable. It provides a number of abstract base classes ("ABCs") which define the interfaces for custom solvers, custom line searches, etc.
- Custom minimisers may be created by subclassing `optimistix.AbstractMinimiser`.
- Custom least squares solvers may be created by subclassing `optimistix.AbstractLeastSquaresSolver`.
- Custom root finders may be created by subclassing `optimistix.AbstractRootFinder`.
- Custom fixed-point solvers may be created by subclassing `optimistix.AbstractFixedPointSolver`.
In each case, these solvers offer a general way to combine a "search" with a "descent":

- Searches: line searches, trust regions, learning rates, etc. Custom searches may be created by subclassing `optimistix.AbstractSearch`.
- Descents: descent directions such as steepest descent, Newton steps, or Levenberg--Marquardt damped steps. Custom descents may be created by subclassing `optimistix.AbstractDescent`.
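The division of labour between a search and a descent can be illustrated with a minimal, optimistix-free sketch (the function names below are illustrative only, not part of the optimistix API): the descent proposes a direction from local information about the function, and the search decides how far to move along that direction.

```python
import jax
import jax.numpy as jnp


def steepest_descent(grad):
    # Descent: propose a direction from local information (here, minus the gradient).
    return -grad


def backtracking_armijo(f, y, direction, grad, step=1.0, shrink=0.5, c=1e-4):
    # Search: choose how far to move along the proposed direction, shrinking the
    # step until the Armijo sufficient-decrease condition holds.
    while f(y + step * direction) > f(y) + c * step * jnp.vdot(grad, direction):
        step = shrink * step
    return step


def minimise(f, y, n_steps=100):
    # Each iteration combines one descent proposal with one search.
    for _ in range(n_steps):
        grad = jax.grad(f)(y)
        direction = steepest_descent(grad)
        step = backtracking_armijo(f, y, direction, grad)
        y = y + step * direction
    return y
```

Optimistix's actual solver classes generalise this pattern: the search and descent are swappable components, so the same solver skeleton can express plain gradient descent, trust-region methods, and so on.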
Adjoints denote custom autodifferentiation strategies. These may be defined by subclassing `optimistix.AbstractAdjoint`.
Any function `PyTree -> non-negative real scalar` may be used as a norm. See also the norms page.
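For instance, a max-norm over an arbitrary pytree can be written as follows (the name `tree_max_norm` is illustrative; optimistix also ships ready-made norms such as `optimistix.max_norm`):

```python
import jax.numpy as jnp
import jax.tree_util as jtu


def tree_max_norm(tree):
    # Reduce each leaf to its largest absolute entry, then take the maximum
    # over all leaves: a PyTree -> non-negative real scalar function.
    leaf_maxes = [jnp.max(jnp.abs(leaf)) for leaf in jtu.tree_leaves(tree)]
    return jnp.max(jnp.stack(leaf_maxes))
```

A function like this can then be passed as the `norm` argument of a solver that accepts one.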
Nonlinear CG variants: any function `(Y, Y, Y) -> scalar` may be used to define a variant of nonlinear CG. See `optimistix.polak_ribiere` for an example.
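As a hedged sketch, a Fletcher--Reeves-style variant could look like the following. The argument convention assumed here, `(grad, grad_prev, y_diff_prev)`, is an assumption and should be checked against the signature of `optimistix.polak_ribiere`.

```python
import jax.numpy as jnp
import jax.tree_util as jtu


def _tree_dot(a, b):
    # Inner product between two pytrees with matching structure.
    return sum(
        jnp.vdot(x, y) for x, y in zip(jtu.tree_leaves(a), jtu.tree_leaves(b))
    )


def fletcher_reeves(grad, grad_prev, y_diff_prev):
    # beta = <g, g> / <g_prev, g_prev>.
    # The third argument is unused by this particular variant.
    return _tree_dot(grad, grad) / _tree_dot(grad_prev, grad_prev)
```

Such a function could then be supplied wherever optimistix expects a nonlinear CG method in place of `optimistix.polak_ribiere`.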