# Root finding

## optimistix.root_find

```python
optimistix.root_find(
    fn: Union[Callable[[~Y, Any], tuple[~Out, ~Aux]], Callable[[~Y, Any], ~Out]],
    solver: Union[AbstractRootFinder, AbstractLeastSquaresSolver, AbstractMinimiser],
    y0: ~Y,
    args: PyTree = None,
    options: Optional[dict[str, Any]] = None,
    *,
    has_aux: bool = False,
    max_steps: Optional[int] = 256,
    adjoint: AbstractAdjoint = ImplicitAdjoint(linear_solver=AutoLinearSolver(well_posed=None)),
    throw: bool = True,
    tags: frozenset[object] = frozenset()
) -> Solution[~Y, ~Aux]
```
Solve a root-finding problem. Given a nonlinear function `fn(y, args)` which returns a pytree of arrays, this returns the value `z` such that `fn(z, args) = 0`.
**Arguments:**

- `fn`: The function to find the roots of. This should take two arguments: `fn(y, args)`, and return a pytree of arrays, not necessarily of the same shape as the input `y`.
- `solver`: The root-finder to use. This should be an `optimistix.AbstractRootFinder`, `optimistix.AbstractLeastSquaresSolver`, or `optimistix.AbstractMinimiser`. If it is a least-squares solver or a minimiser, then the value `sum(fn(y, args)^2)` is minimised.
- `y0`: An initial guess for what `y` may be.
- `args`: Passed as the `args` of `fn(y, args)`.
- `options`: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- `has_aux`: If `True`, then `fn` may return a pair, where the first element is its function value, and the second is just auxiliary data. Keyword only argument.
- `max_steps`: The maximum number of steps the solver can take. Keyword only argument.
- `adjoint`: The adjoint method used to compute gradients through the fixed-point solve. Keyword only argument.
- `throw`: How to report any failures. (E.g. an iterative solver running out of steps, or encountering divergent iterates.) If `True` then a failure will raise an error. If `False` then the returned solution object will have a `result` field indicating whether any failures occurred. (See `optimistix.Solution`.) Keyword only argument.
- `tags`: Lineax tags describing any structure of the Jacobian of `fn` with respect to `y`. Used with some solvers (e.g. `optimistix.Newton`), and with some adjoint methods (e.g. `optimistix.ImplicitAdjoint`) to improve the efficiency of linear solves. Keyword only argument.
**Returns:**

An `optimistix.Solution` object.
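For example, a minimal usage sketch (the target function and tolerances here are illustrative, not part of the API):

```python
import jax.numpy as jnp
import optimistix as optx

# An illustrative scalar problem: find y such that tanh(y) - 0.5 = 0.
def fn(y, args):
    return jnp.tanh(y) - 0.5

solver = optx.Newton(rtol=1e-8, atol=1e-8)
sol = optx.root_find(fn, solver, y0=jnp.array(0.0))
print(sol.value)  # approximately arctanh(0.5) ≈ 0.5493
```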
`optimistix.root_find` supports any of the following root finders.
**Info**

In addition to the solvers listed here, any least squares solver or minimiser may also be used as the `solver`. This is because finding the root `x` for which `f(x) = 0` can also be accomplished by finding the value `x` for which `sum(f(x)^2)` is minimised. If you pass in a least squares solver or minimiser, then Optimistix will automatically rewrite your problem to treat it in this way.
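As a sketch of this, the same problem as above can be handed to a minimiser (here `optimistix.BFGS`; the tolerances are illustrative):

```python
import jax.numpy as jnp
import optimistix as optx

def fn(y, args):
    return jnp.tanh(y) - 0.5

# BFGS is a minimiser, so Optimistix rewrites the root-find as
# minimising sum(fn(y, args)^2).
solver = optx.BFGS(rtol=1e-8, atol=1e-8)
sol = optx.root_find(fn, solver, y0=jnp.array(0.0))
```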
## optimistix.AbstractRootFinder

Abstract base class for all root finders.
```python
init(
    self,
    fn: Callable[[~Y, Any], tuple[~Out, ~Aux]],
    y: ~Y,
    args: PyTree,
    options: dict[str, Any],
    f_struct: PyTree[jax.ShapeDtypeStruct],
    aux_struct: PyTree[jax.ShapeDtypeStruct],
    tags: frozenset[object]
) -> ~SolverState
```

*Abstract method.*
Perform all initial computation needed to initialise the solver state.

For example, the `optimistix.Chord` method computes the Jacobian `df/dy` at the initial guess, and then reuses it throughout the computation.
**Arguments:**

- `fn`: The function to iterate over. This is expected to take two arguments `fn(y, args)` and return a pytree of arrays in the first element, and any auxiliary data in the second element.
- `y`: The value of `y` at the current (first) iteration.
- `args`: Passed as the `args` of `fn(y, args)`.
- `options`: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- `f_struct`: A pytree of `jax.ShapeDtypeStruct`s of the same shape as the output of `fn`. This is used to initialise any information in the state which may rely on the pytree structure, array shapes, or dtype of the output of `fn`.
- `aux_struct`: A pytree of `jax.ShapeDtypeStruct`s of the same shape as the auxiliary data returned by `fn`.
- `tags`: Exact meaning depends on whether this is a fixed point, root find, least squares, or minimisation problem; see their relevant entry points.
**Returns:**

A PyTree representing the initial state of the solver.
```python
step(
    self,
    fn: Callable[[~Y, Any], tuple[~Out, ~Aux]],
    y: ~Y,
    args: PyTree,
    options: dict[str, Any],
    state: ~SolverState,
    tags: frozenset[object]
) -> tuple[~Y, ~SolverState, ~Aux]
```

*Abstract method.*
Perform one step of the iterative solve.
**Arguments:**

- `fn`: The function to iterate over. This is expected to take two arguments `fn(y, args)` and return a pytree of arrays in the first element, and any auxiliary data in the second element.
- `y`: The value of `y` at the current iteration.
- `args`: Passed as the `args` of `fn(y, args)`.
- `options`: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- `state`: A pytree representing the state of the solver. The shape of this pytree is solver-dependent.
- `tags`: Exact meaning depends on whether this is a fixed point, root find, least squares, or minimisation problem; see their relevant entry points.
**Returns:**

A 3-tuple containing the new `y` value in the first element, the next solver state in the second element, and the aux output of `fn(y, args)` in the third element.
```python
terminate(
    self,
    fn: Callable[[~Y, Any], tuple[~Out, ~Aux]],
    y: ~Y,
    args: PyTree,
    options: dict[str, Any],
    state: ~SolverState,
    tags: frozenset[object]
) -> tuple[Array, RESULTS]
```

*Abstract method.*
Determine whether or not to stop the iterative solve.
**Arguments:**

- `fn`: The function to iterate over. This is expected to take two arguments `fn(y, args)` and return a pytree of arrays in the first element, and any auxiliary data in the second element.
- `y`: The value of `y` at the current iteration.
- `args`: Passed as the `args` of `fn(y, args)`.
- `options`: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- `state`: A pytree representing the state of the solver. The shape of this pytree is solver-dependent.
- `tags`: Exact meaning depends on whether this is a fixed point, root find, least squares, or minimisation problem; see their relevant entry points.
**Returns:**

A 2-tuple containing a bool indicating whether or not to stop iterating in the first element, and an `optimistix.RESULTS` object in the second element.
```python
postprocess(
    self,
    fn: Callable[[~Y, Any], tuple[~Out, ~Aux]],
    y: ~Y,
    aux: ~Aux,
    args: PyTree,
    options: dict[str, Any],
    state: ~SolverState,
    tags: frozenset[object],
    result: RESULTS
) -> tuple[~Y, ~Aux, dict[str, Any]]
```

*Abstract method.*
Any final postprocessing to perform on the result of the solve.
**Arguments:**

- `fn`: The function to iterate over. This is expected to take two arguments `fn(y, args)` and return a pytree of arrays in the first element, and any auxiliary data in the second element.
- `y`: The value of `y` at the last iteration.
- `aux`: The auxiliary output at the last iteration.
- `args`: Passed as the `args` of `fn(y, args)`.
- `options`: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- `state`: A pytree representing the final state of the solver. The shape of this pytree is solver-dependent.
- `tags`: Exact meaning depends on whether this is a fixed point, root find, least squares, or minimisation problem; see their relevant entry points.
- `result`: As returned by the final call to `terminate`.
**Returns:**

A 3-tuple of:

- `final_y`: the final `y` to return as the solution of the solve.
- `final_aux`: the final `aux` to return as the auxiliary output of the solve.
- `stats`: any additional information to place in the `sol.stats` dictionary.
**Info**

Most solvers will not need to use this, so this method may be defined as:

```python
def postprocess(self, fn, y, aux, args, options, state, tags, result):
    return y, aux, {}
```
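To see how these four methods fit together, here is a minimal sketch of a custom root finder: naive damped iteration `y ↦ y - damping * f(y)`. The class name, the `damping` field, and the convergence check are invented for illustration; it assumes the four abstract methods above are all that must be implemented, and that `terminate` may be called before the first `step`.

```python
from collections.abc import Callable

import jax.numpy as jnp
import jax.tree_util as jtu
import optimistix as optx


class DampedIteration(optx.AbstractRootFinder):
    """Sketch: iterate `y <- y - damping * fn(y, args)` until converged."""

    rtol: float
    atol: float
    norm: Callable = optx.max_norm
    damping: float = 0.5

    def init(self, fn, y, args, options, f_struct, aux_struct, tags):
        # The state is the previous iterate. Offset it so that a
        # `terminate` check made before the first step never succeeds.
        return jtu.tree_map(lambda a: jnp.full_like(a, jnp.inf), y)

    def step(self, fn, y, args, options, state, tags):
        f, aux = fn(y, args)  # in this interface `fn` returns `(value, aux)`
        new_y = jtu.tree_map(lambda yi, fi: yi - self.damping * fi, y, f)
        return new_y, y, aux  # the new state is the iterate we just left

    def terminate(self, fn, y, args, options, state, tags):
        # Cauchy-style check: stop once successive iterates are close
        # in a combined rtol/atol sense.
        scaled = jtu.tree_map(
            lambda a, b: (a - b) / (self.atol + self.rtol * jnp.abs(a)),
            y,
            state,
        )
        return self.norm(scaled) < 1, optx.RESULTS.successful

    def postprocess(self, fn, y, aux, args, options, state, tags, result):
        return y, aux, {}
```

Such a solver would then be passed to `optimistix.root_find` like any built-in one, e.g. `optx.root_find(fn, DampedIteration(rtol=1e-6, atol=1e-6), y0)`.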
## optimistix.Newton (AbstractRootFinder)
Newton's method for root finding. Also sometimes known as Newton–Raphson.

Unlike the SciPy implementation of Newton's method, the Optimistix version also works for vector-valued (or PyTree-valued) `y`.
This solver optionally accepts the following `options`:

- `lower`: The lower bound on the hypercube which contains the root. Should be a PyTree of arrays, each broadcastable to the corresponding element of `y`. The iterates of `y` will be clipped to this hypercube.
- `upper`: The upper bound on the hypercube which contains the root. Should be a PyTree of arrays, each broadcastable to the corresponding element of `y`. The iterates of `y` will be clipped to this hypercube.
```python
__init__(
    self,
    rtol: float,
    atol: float,
    norm: Callable[[PyTree], Array] = optimistix.max_norm,
    kappa: float = 0.01,
    linear_solver: AbstractLinearSolver = AutoLinearSolver(well_posed=None),
    cauchy_termination: bool = True
)
```
**Arguments:**

- `rtol`: Relative tolerance for terminating the solve.
- `atol`: Absolute tolerance for terminating the solve.
- `norm`: The norm used to determine the difference between two iterates in the convergence criteria. Should be any function `PyTree -> Scalar`. Optimistix includes three built-in norms: `optimistix.max_norm`, `optimistix.rms_norm`, and `optimistix.two_norm`.
- `kappa`: A tolerance for the early convergence check when `cauchy_termination=False`.
- `linear_solver`: The linear solver used to compute the Newton step.
- `cauchy_termination`: When `True`, use the Cauchy termination condition: that two adjacent iterates should have a small difference between them. This is usually the standard choice when solving general root finding problems. When `False`, use a procedure which attempts to detect slow convergence, and quickly fail the solve if so. This is useful when performing the root-find iteratively, and refining the target problem for those runs which fail. This comes up when solving differential equations with adaptive step sizing and implicit solvers. The exact procedure is as described in Section IV.8 of Hairer & Wanner, "Solving Ordinary Differential Equations II".
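For instance, a sketch of using Newton together with the `lower`/`upper` clipping options described above (the function, tolerances, and bounds are illustrative):

```python
import jax.numpy as jnp
import optimistix as optx

def fn(y, args):
    return y**3 - 1.0  # real root at y = 1

solver = optx.Newton(rtol=1e-8, atol=1e-8)
# Clip all iterates to the hypercube [0, 10], e.g. to keep Newton away
# from regions where the model is invalid.
sol = optx.root_find(
    fn, solver, y0=jnp.array(2.0), options=dict(lower=0.0, upper=10.0)
)
```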
## optimistix.Chord (AbstractRootFinder)
The Chord method of root finding.

This is equivalent to the Newton method, except that the Jacobian is computed only once, at the initial point `y0`, and then reused throughout the computation. This is a useful way to cheapen the solve, if `y0` is expected to be a good initial guess and the target function does not change too rapidly. (For example, this is the standard technique used in implicit Runge–Kutta methods when solving differential equations.)
This solver optionally accepts the following `options`:

- `lower`: The lower bound on the hypercube which contains the root. Should be a PyTree of arrays, each broadcastable to the corresponding element of `y`. The iterates of `y` will be clipped to this hypercube.
- `upper`: The upper bound on the hypercube which contains the root. Should be a PyTree of arrays, each broadcastable to the corresponding element of `y`. The iterates of `y` will be clipped to this hypercube.
```python
__init__(
    self,
    rtol: float,
    atol: float,
    norm: Callable[[PyTree], Array] = optimistix.max_norm,
    kappa: float = 0.01,
    linear_solver: AbstractLinearSolver = AutoLinearSolver(well_posed=None),
    cauchy_termination: bool = True
)
```
**Arguments:**

- `rtol`: Relative tolerance for terminating the solve.
- `atol`: Absolute tolerance for terminating the solve.
- `norm`: The norm used to determine the difference between two iterates in the convergence criteria. Should be any function `PyTree -> Scalar`. Optimistix includes three built-in norms: `optimistix.max_norm`, `optimistix.rms_norm`, and `optimistix.two_norm`.
- `kappa`: A tolerance for the early convergence check when `cauchy_termination=False`.
- `linear_solver`: The linear solver used to compute the Newton step.
- `cauchy_termination`: When `True`, use the Cauchy termination condition: that two adjacent iterates should have a small difference between them. This is usually the standard choice when solving general root finding problems. When `False`, use a procedure which attempts to detect slow convergence, and quickly fail the solve if so. This is useful when performing the root-find iteratively, and refining the target problem for those runs which fail. This comes up when solving differential equations with adaptive step sizing and implicit solvers. The exact procedure is as described in Section IV.8 of Hairer & Wanner, "Solving Ordinary Differential Equations II".
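A brief sketch; `Chord` is constructed and used exactly like `Newton` (the function and tolerances are illustrative):

```python
import jax.numpy as jnp
import optimistix as optx

def fn(y, args):
    return jnp.tanh(y) - 0.5

# The Jacobian is evaluated once at y0 = 0.0 and reused on every step.
solver = optx.Chord(rtol=1e-6, atol=1e-6)
sol = optx.root_find(fn, solver, y0=jnp.array(0.0))
```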
## optimistix.Bisection (AbstractRootFinder)
The bisection method of root finding. This may only be used with functions `R -> R`, i.e. functions with scalar input and scalar output.
This requires the following `options`:

- `lower`: The lower bound on the interval which contains the root.
- `upper`: The upper bound on the interval which contains the root.

These are passed as, for example:

```python
optimistix.root_find(..., options=dict(lower=0, upper=1))
```
This algorithm works by considering the interval `[lower, upper]`, checking the sign of the evaluated function at the midpoint of the interval, and then keeping whichever half contains the root. This is then repeated. The iteration stops once the interval is sufficiently small.
```python
__init__(self, rtol: float, atol: float, flip: Union[bool, Literal['detect']] = 'detect')
```
**Arguments:**

- `rtol`: Relative tolerance for terminating the solve.
- `atol`: Absolute tolerance for terminating the solve.
- `flip`: Can be set to any of:
    - `False`: specify that `fn(lower, args) < 0 < fn(upper, args)`.
    - `True`: specify that `fn(lower, args) > 0 > fn(upper, args)`.
    - `"detect"`: automatically check `fn(lower, args)` and `fn(upper, args)`. Note that this option may increase both runtime and compilation time.
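A sketch of a complete bisection solve (the function and tolerances are illustrative):

```python
import jax.numpy as jnp
import optimistix as optx

def fn(y, args):
    return jnp.cos(y) - y  # single root near y ≈ 0.739

# flip="detect" (the default) checks the signs at the endpoints itself.
solver = optx.Bisection(rtol=1e-6, atol=1e-6)
sol = optx.root_find(
    fn, solver, y0=jnp.array(0.5), options=dict(lower=0.0, upper=1.0)
)
```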
## optimistix.BestSoFarRootFinder (AbstractRootFinder)

Wraps another root-finder, to return the best-so-far value. That is, it makes a copy of the best `y` seen, and returns that.
```python
__init__(self, solver: AbstractRootFinder[~Y, ~Out, tuple[~Out, ~Aux], Any])
```

**Arguments:**

- `solver`: The root-finder solver to wrap.
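For example (the wrapped solver and tolerances are illustrative):

```python
import jax.numpy as jnp
import optimistix as optx

def fn(y, args):
    return jnp.tanh(y) - 0.5

# Even if the wrapped Newton solve ends on a bad step, the best iterate
# seen so far is returned.
solver = optx.BestSoFarRootFinder(optx.Newton(rtol=1e-8, atol=1e-8))
sol = optx.root_find(fn, solver, y0=jnp.array(0.0))
```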