Fixed points
optimistix.fixed_point(fn: Callable[[~Y, Any], tuple[~Y, ~Aux]] | Callable[[~Y, Any], ~Y], solver: optimistix.AbstractFixedPointSolver | optimistix.AbstractRootFinder | optimistix.AbstractLeastSquaresSolver | optimistix.AbstractMinimiser, y0: ~Y, args: PyTree[Any] = None, options: dict[str, Any] | None = None, *, has_aux: bool = False, max_steps: int | None = 256, adjoint: optimistix.AbstractAdjoint = optimistix.ImplicitAdjoint(), throw: bool = True, tags: frozenset[object] = frozenset()) -> optimistix.Solution[~Y, ~Aux]
Find a fixed-point of a function.
Given a nonlinear function fn(y, args) which returns a pytree of arrays of the
same shape as y, this returns the value z such that fn(z, args) = z.
Arguments:
- fn: The function to find the fixed point of. This should take two arguments fn(y, args), and return a pytree of arrays of the same shape as the input y.
- solver: The solver to use. This can be an optimistix.AbstractFixedPointSolver, an optimistix.AbstractRootFinder, an optimistix.AbstractLeastSquaresSolver, or an optimistix.AbstractMinimiser. If solver is a root finder then it will attempt to find the root of fn(y, args) - y. If solver is a least-squares or minimisation algorithm, then it will attempt to minimise sum((fn(y, args) - y)^2).
- y0: An initial guess for what y may be. Used to start the iterative process of finding the fixed point; using good initial guesses is often important.
- args: Passed as the args of fn(y, args).
- options: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- has_aux: If True, then fn may return a pair, where the first element is its function value, and the second is just auxiliary data. Keyword only argument.
- max_steps: The maximum number of steps the solver can take. Keyword only argument.
- adjoint: The adjoint method used to compute gradients through the fixed-point solve. Keyword only argument.
- throw: How to report any failures. (E.g. an iterative solver running out of steps, or encountering divergent iterates.) If True then a failure will raise an error. If False then the returned solution object will have a result field indicating whether any failures occurred. (See optimistix.Solution.) Keyword only argument.
- tags: Lineax tags describing any structure of the Jacobian of y -> fn(y, args) - y with respect to y. (That is, the structure of the matrix dfn/dy - I.) Used with optimistix.ImplicitAdjoint to implement the implicit function theorem as efficiently as possible. Keyword only argument.
Returns:
An optimistix.Solution object.
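For example, a minimal sketch of calling optimistix.fixed_point (the map, tolerances, and initial guess below are purely illustrative), using plain fixed-point iteration to solve y = cos(y):
import jax.numpy as jnp
import optimistix as optx

def fn(y, args):
    # The map whose fixed point we want: y = cos(y).
    return jnp.cos(y)

solver = optx.FixedPointIteration(rtol=1e-8, atol=1e-8)
sol = optx.fixed_point(fn, solver, y0=jnp.array(1.0))
# sol.value is approximately 0.739085, the solution of y = cos(y).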
optimistix.fixed_point supports any of the following fixed-point solvers.
Info
In addition to the solvers listed here, any root finder may also be used as the solver. This is because finding the fixed point x for which f(x) = x can also be accomplished by finding the root x for which f(x) - x = 0. If you pass in a root finder, then Optimistix will automatically rewrite your problem to treat it in this way.
Likewise, any least squares solver or minimiser may also be used as the solver. This is because finding the fixed point x for which f(x) = x can also be accomplished by finding the value x for which sum((f(x) - x)^2) is minimised. If you pass in a least squares solver or minimiser, then Optimistix will automatically rewrite your problem to treat it in this way.
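As a sketch of this automatic rewriting (the solver choice and tolerances are illustrative), the same fixed-point problem can be handed to a root finder such as optimistix.Newton:
import jax.numpy as jnp
import optimistix as optx

def fn(y, args):
    return jnp.cos(y)

# Optimistix rewrites this into the root-finding problem fn(y, args) - y = 0.
solver = optx.Newton(rtol=1e-8, atol=1e-8)
sol = optx.fixed_point(fn, solver, y0=jnp.array(1.0))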
optimistix.AbstractFixedPointSolver
Abstract base class for all fixed point solvers.
init(fn: Callable[[~Y, Any], tuple[~Out, ~Aux]], y: ~Y, args: PyTree, options: dict[str, Any], f_struct: PyTree[jax.ShapeDtypeStruct], aux_struct: PyTree[jax.ShapeDtypeStruct], tags: frozenset[object]) -> ~SolverState
Perform all initial computation needed to initialise the solver state.
For example, the optimistix.Chord method computes the Jacobian df/dy at the initial guess y, and then reuses it throughout the computation.
Arguments:
- fn: The function to iterate over. This is expected to take two arguments fn(y, args), and return a pytree of arrays in the first element and any auxiliary data in the second element.
- y: The value of y at the current (first) iteration.
- args: Passed as the args of fn(y, args).
- options: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- f_struct: A pytree of jax.ShapeDtypeStructs of the same shape as the output of fn. This is used to initialise any information in the state which may rely on the pytree structure, array shapes, or dtype of the output of fn.
- aux_struct: A pytree of jax.ShapeDtypeStructs of the same shape as the auxiliary data returned by fn.
- tags: exact meaning depends on whether this is a fixed point, root find, least squares, or minimisation problem; see their relevant entry points.
Returns:
A PyTree representing the initial state of the solver.
step(fn: Callable[[~Y, Any], tuple[~Out, ~Aux]], y: ~Y, args: PyTree, options: dict[str, Any], state: ~SolverState, tags: frozenset[object]) -> tuple[~Y, ~SolverState, ~Aux]
Perform one step of the iterative solve.
Arguments:
- fn: The function to iterate over. This is expected to take two arguments fn(y, args), and return a pytree of arrays in the first element and any auxiliary data in the second element.
- y: The value of y at the current iteration.
- args: Passed as the args of fn(y, args).
- options: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- state: A pytree representing the state of a solver. The shape of this pytree is solver-dependent.
- tags: exact meaning depends on whether this is a fixed point, root find, least squares, or minimisation problem; see their relevant entry points.
Returns:
A 3-tuple containing the new y value in the first element, the next solver
state in the second element, and the aux output of fn(y, args) in the third
element.
terminate(fn: Callable[[~Y, Any], tuple[~Out, ~Aux]], y: ~Y, args: PyTree, options: dict[str, Any], state: ~SolverState, tags: frozenset[object]) -> tuple[Bool[Array, ''], optimistix.RESULTS]
Determine whether or not to stop the iterative solve.
Arguments:
- fn: The function to iterate over. This is expected to take two arguments fn(y, args), and return a pytree of arrays in the first element and any auxiliary data in the second element.
- y: The value of y at the current iteration.
- args: Passed as the args of fn(y, args).
- options: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- state: A pytree representing the state of a solver. The shape of this pytree is solver-dependent.
- tags: exact meaning depends on whether this is a fixed point, root find, least squares, or minimisation problem; see their relevant entry points.
Returns:
A 2-tuple containing a bool indicating whether or not to stop iterating in the
first element, and an optimistix.RESULTS object in the second element.
postprocess(fn: Callable[[~Y, Any], tuple[~Out, ~Aux]], y: ~Y, aux: ~Aux, args: PyTree, options: dict[str, Any], state: ~SolverState, tags: frozenset[object], result: optimistix.RESULTS) -> tuple[~Y, ~Aux, dict[str, Any]]
Any final postprocessing to perform on the result of the solve.
Arguments:
- fn: The function to iterate over. This is expected to take two arguments fn(y, args), and return a pytree of arrays in the first element and any auxiliary data in the second element.
- y: The value of y at the last iteration.
- aux: The auxiliary output at the last iteration.
- args: Passed as the args of fn(y, args).
- options: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- state: A pytree representing the final state of a solver. The shape of this pytree is solver-dependent.
- tags: exact meaning depends on whether this is a fixed point, root find, least squares, or minimisation problem; see their relevant entry points.
- result: as returned by the final call to terminate.
Returns:
A 3-tuple of:
- final_y: the final y to return as the solution of the solve.
- final_aux: the final aux to return as the auxiliary output of the solve.
- stats: any additional information to place in the sol.stats dictionary.
Info
Most solvers will not need to use this, in which case this method may simply be defined as:
def postprocess(self, fn, y, aux, args, options, state, tags, result):
return y, aux, {}
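For illustration, here is a schematic, hand-rolled driver loop showing how init, step, terminate and postprocess fit together, using optimistix.FixedPointIteration as a concrete solver (the map and tolerances are made up). In practice, call optimistix.fixed_point instead, which additionally handles max_steps, adjoints and error reporting.
import jax
import jax.numpy as jnp
import optimistix as optx

def fn(y, args):
    # Solver methods expect fn to return (value, aux); here there is no aux.
    return jnp.cos(y), None

y = jnp.array(1.0)
args = None
options = {}
tags = frozenset()
solver = optx.FixedPointIteration(rtol=1e-6, atol=1e-6)

# Shapes/dtypes of fn's two outputs, as required by init.
f_struct, aux_struct = jax.eval_shape(fn, y, args)
state = solver.init(fn, y, args, options, f_struct, aux_struct, tags)

done = False
while not done:
    y, state, aux = solver.step(fn, y, args, options, state, tags)
    done, result = solver.terminate(fn, y, args, options, state, tags)

y, aux, stats = solver.postprocess(fn, y, aux, args, options, state, tags, result)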
optimistix.FixedPointIteration(optimistix.AbstractFixedPointSolver)
Repeatedly calls a function in search of a fixed point.
This is one of the simplest ways to find a fixed point y of f: simply iterate y_{n+1} = f(y_n) until y_n stops changing.
Note that this is often not a very effective method, and root-finding algorithms are frequently preferred in practice.
__init__(rtol: float, atol: float, norm: Callable[[PyTree], Shaped[Array, '']] = <function max_norm>)
Arguments:
- rtol: Relative tolerance for terminating the solve.
- atol: Absolute tolerance for terminating the solve.
- norm: The norm used to determine the difference between two iterates in the convergence criteria. Should be any function PyTree -> Scalar. Optimistix includes three built-in norms: optimistix.max_norm, optimistix.rms_norm, and optimistix.two_norm.
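For example (the tolerances here are illustrative), swapping in the RMS norm in place of the default max norm:
import optimistix as optx

solver = optx.FixedPointIteration(rtol=1e-4, atol=1e-6, norm=optx.rms_norm)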
optimistix.BestSoFarFixedPoint(optimistix.AbstractFixedPointSolver)
Wraps another fixed-point solver, to return the best-so-far value. That is, it
makes a copy of the best y seen, and returns that.
__init__(solver: optimistix.AbstractFixedPointSolver[~Y, tuple[~Y, ~Aux], Any])
Arguments:
solver: the fixed-point solver to wrap.
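For example (tolerances are illustrative), wrapping plain fixed-point iteration so that the best iterate seen is returned:
import optimistix as optx

inner = optx.FixedPointIteration(rtol=1e-6, atol=1e-6)
solver = optx.BestSoFarFixedPoint(inner)
# solver can now be passed to optimistix.fixed_point as usual.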