# Least squares

#### `optimistix.least_squares(fn: Union[Callable[[~Y, Any], tuple[~Out, ~Aux]], Callable[[~Y, Any], ~Out]], solver: Union[AbstractLeastSquaresSolver, AbstractMinimiser], y0: ~Y, args: PyTree[Any] = None, options: Optional[dict[str, Any]] = None, *, has_aux: bool = False, max_steps: Optional[int] = 256, adjoint: AbstractAdjoint = ImplicitAdjoint(linear_solver=AutoLinearSolver(well_posed=None)), throw: bool = True, tags: frozenset[object] = frozenset()) -> Solution[~Y, ~Aux]`

Solve a nonlinear least-squares problem.

Given a nonlinear function `fn(y, args)` which returns a pytree of residuals, this returns the solution to \(\min_y \sum_i \textrm{fn}(y, \textrm{args})_i^2\).

**Arguments:**

- `fn`: The residual function. This should take two arguments, `fn(y, args)`, and return a pytree of arrays, not necessarily of the same shape as the input `y`.
- `solver`: The least-squares solver to use. This can be either an `optimistix.AbstractLeastSquaresSolver` solver, or an `optimistix.AbstractMinimiser`. If `solver` is an `optimistix.AbstractMinimiser`, then it will attempt to minimise the scalar loss \(\min_y \sum_i \textrm{fn}(y, \textrm{args})_i^2\) directly.
- `y0`: An initial guess for what `y` may be.
- `args`: Passed as the `args` of `fn(y, args)`.
- `options`: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- `has_aux`: If `True`, then `fn` may return a pair, where the first element is its function value, and the second is auxiliary data. Keyword-only argument.
- `max_steps`: The maximum number of steps the solver can take. Keyword-only argument.
- `adjoint`: The adjoint method used to compute gradients through the fixed-point solve. Keyword-only argument.
- `throw`: How to report any failures (e.g. an iterative solver running out of steps, or encountering divergent iterates). If `True`, then a failure will raise an error. If `False`, then the returned solution object will have a `result` field indicating whether any failures occurred. (See `optimistix.Solution`.) Keyword-only argument.
- `tags`: Lineax tags describing any structure of the Hessian of `y -> sum(fn(y, args)**2)` with respect to `y`. Used with `optimistix.ImplicitAdjoint` to implement the implicit function theorem as efficiently as possible. Keyword-only argument.

**Returns:**

An `optimistix.Solution` object.

`optimistix.least_squares` supports any of the following least-squares solvers.

Info

In addition to the solvers listed here, any minimiser may also be used as the `solver`. This is because a least-squares problem \(\arg\min_\theta \sum_{i=1}^N r_i(\theta)^2\) is a special case of general minimisation. If you pass in a minimiser, then Optimistix will automatically treat your problem in this way.

#### `optimistix.AbstractLeastSquaresSolver`

Abstract base class for all least squares solvers.

##### `init(self, fn: Callable[[~Y, Any], tuple[~Out, ~Aux]], y: ~Y, args: PyTree, options: dict[str, Any], f_struct: PyTree[jax.ShapeDtypeStruct], aux_struct: PyTree[jax.ShapeDtypeStruct], tags: frozenset[object]) -> ~SolverState`

`abstractmethod`

Perform all initial computation needed to initialise the solver state.

For example, the `optimistix.Chord` method computes the Jacobian `df/dy` with respect to the initial guess `y`, and then uses it throughout the computation.

**Arguments:**

- `fn`: The function to iterate over. This is expected to take two arguments, `fn(y, args)`, and return a pytree of arrays in the first element, and any auxiliary data in the second element.
- `y`: The value of `y` at the current (first) iteration.
- `args`: Passed as the `args` of `fn(y, args)`.
- `options`: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- `f_struct`: A pytree of `jax.ShapeDtypeStruct`s of the same shape as the output of `fn`. This is used to initialise any information in the state which may rely on the pytree structure, array shapes, or dtype of the output of `fn`.
- `aux_struct`: A pytree of `jax.ShapeDtypeStruct`s of the same shape as the auxiliary data returned by `fn`.
- `tags`: The exact meaning depends on whether this is a fixed point, root find, least squares, or minimisation problem; see their relevant entry points.

**Returns:**

A PyTree representing the initial state of the solver.

##### `step(self, fn: Callable[[~Y, Any], tuple[~Out, ~Aux]], y: ~Y, args: PyTree, options: dict[str, Any], state: ~SolverState, tags: frozenset[object]) -> tuple[~Y, ~SolverState, ~Aux]`

`abstractmethod`

Perform one step of the iterative solve.

**Arguments:**

- `fn`: The function to iterate over. This is expected to take two arguments, `fn(y, args)`, and return a pytree of arrays in the first element, and any auxiliary data in the second element.
- `y`: The value of `y` at the current iteration.
- `args`: Passed as the `args` of `fn(y, args)`.
- `options`: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- `state`: A pytree representing the state of the solver. The shape of this pytree is solver-dependent.
- `tags`: The exact meaning depends on whether this is a fixed point, root find, least squares, or minimisation problem; see their relevant entry points.

**Returns:**

A 3-tuple containing the new `y` value in the first element, the next solver state in the second element, and the aux output of `fn(y, args)` in the third element.

##### `terminate(self, fn: Callable[[~Y, Any], tuple[~Out, ~Aux]], y: ~Y, args: PyTree, options: dict[str, Any], state: ~SolverState, tags: frozenset[object]) -> tuple[Array, RESULTS]`

`abstractmethod`

Determine whether or not to stop the iterative solve.

**Arguments:**

- `fn`: The function to iterate over. This is expected to take two arguments, `fn(y, args)`, and return a pytree of arrays in the first element, and any auxiliary data in the second element.
- `y`: The value of `y` at the current iteration.
- `args`: Passed as the `args` of `fn(y, args)`.
- `options`: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- `state`: A pytree representing the state of the solver. The shape of this pytree is solver-dependent.
- `tags`: The exact meaning depends on whether this is a fixed point, root find, least squares, or minimisation problem; see their relevant entry points.

**Returns:**

A 2-tuple containing a bool indicating whether or not to stop iterating in the first element, and an `optimistix.RESULTS` object in the second element.

##### `postprocess(self, fn: Callable[[~Y, Any], tuple[~Out, ~Aux]], y: ~Y, aux: ~Aux, args: PyTree, options: dict[str, Any], state: ~SolverState, tags: frozenset[object], result: RESULTS) -> tuple[~Y, ~Aux, dict[str, Any]]`

`abstractmethod`

Any final postprocessing to perform on the result of the solve.

**Arguments:**

- `fn`: The function to iterate over. This is expected to take two arguments, `fn(y, args)`, and return a pytree of arrays in the first element, and any auxiliary data in the second element.
- `y`: The value of `y` at the last iteration.
- `aux`: The auxiliary output at the last iteration.
- `args`: Passed as the `args` of `fn(y, args)`.
- `options`: Individual solvers may accept additional runtime arguments. See each individual solver's documentation for more details.
- `state`: A pytree representing the final state of the solver. The shape of this pytree is solver-dependent.
- `tags`: The exact meaning depends on whether this is a fixed point, root find, least squares, or minimisation problem; see their relevant entry points.
- `result`: As returned by the final call to `terminate`.

**Returns:**

A 3-tuple of:

- `final_y`: the final `y` to return as the solution of the solve.
- `final_aux`: the final `aux` to return as the auxiliary output of the solve.
- `stats`: any additional information to place in the `sol.stats` dictionary.

Info

Most solvers will not need to use this, in which case this method may simply be defined as:

```
def postprocess(self, fn, y, aux, args, options, state, tags, result):
return y, aux, {}
```


#### `optimistix.AbstractGaussNewton (AbstractLeastSquaresSolver)`

Abstract base class for all Gauss-Newton type methods.

This includes methods such as `optimistix.GaussNewton`, `optimistix.LevenbergMarquardt`, and `optimistix.Dogleg`.

Subclasses must provide the following attributes, with the following types:

- `rtol`: `float`
- `atol`: `float`
- `norm`: `Callable[[PyTree], Scalar]`
- `descent`: `AbstractDescent`
- `search`: `AbstractSearch`
- `verbose`: `frozenset[str]`

Supports the following `options`:

- `jac`: whether to use forward- or reverse-mode autodifferentiation to compute the Jacobian. Can be either `"fwd"` or `"bwd"`. Defaults to `"fwd"`, which is usually more efficient. Changing this can be useful when the target function has a `jax.custom_vjp`, and so does not support forward-mode autodifferentiation.


#### `optimistix.GaussNewton (AbstractGaussNewton)`

Gauss-Newton algorithm, for solving nonlinear least-squares problems.

Note that regularised approaches like `optimistix.LevenbergMarquardt` are usually preferred instead.

Supports the following `options`:

- `jac`: whether to use forward- or reverse-mode autodifferentiation to compute the Jacobian. Can be either `"fwd"` or `"bwd"`. Defaults to `"fwd"`, which is usually more efficient. Changing this can be useful when the target function has a `jax.custom_vjp`, and so does not support forward-mode autodifferentiation.

##### `__init__(self, rtol: float, atol: float, norm: Callable[[PyTree], Array] = <function max_norm>, linear_solver: AbstractLinearSolver = AutoLinearSolver(well_posed=None), verbose: frozenset[str] = frozenset())`

**Arguments:**

- `rtol`: Relative tolerance for terminating the solve.
- `atol`: Absolute tolerance for terminating the solve.
- `norm`: The norm used to determine the difference between two iterates in the convergence criteria. Should be any function `PyTree -> Scalar`. Optimistix includes three built-in norms: `optimistix.max_norm`, `optimistix.rms_norm`, and `optimistix.two_norm`.
- `linear_solver`: The linear solver used to compute the Newton step.
- `verbose`: Whether to print out extra information about how the solve is proceeding. Should be a frozenset of strings, specifying what information to print out. Valid entries are `step`, `loss`, `accepted`, `step_size`, `y`. For example `verbose=frozenset({"loss", "step_size"})`.
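The core of Gauss-Newton can be sketched in a few lines: each iteration linearises the residuals at the current iterate and solves the resulting linear least-squares problem for the update. This is an illustrative NumPy sketch only, not Optimistix's implementation (which additionally handles pytrees, searches, and convergence checks):

```python
import numpy as np

def gauss_newton_step(residual, jacobian, y):
    # Linearise around y, then solve the linear least-squares
    # problem J @ dy ≈ -r for the update dy.
    r = residual(y)
    J = jacobian(y)
    dy, *_ = np.linalg.lstsq(J, -r, rcond=None)
    return y + dy

# For residuals that are already linear in y, a single step solves the problem.
residual = lambda y: np.array([y[0] - 1.0, 2.0 * (y[1] + 0.5)])
jacobian = lambda y: np.array([[1.0, 0.0], [0.0, 2.0]])
y1 = gauss_newton_step(residual, jacobian, np.zeros(2))
```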

#### `optimistix.LevenbergMarquardt (AbstractGaussNewton)`

The Levenberg--Marquardt method.

This is a classical solver for nonlinear least squares, which works by regularising `optimistix.GaussNewton` with a damping factor. This serves to (a) interpolate between Gauss--Newton and steepest descent, and (b) limit the step size to a local region around the current point.

This is a good algorithm for many least squares problems.

Supports the following `options`:

- `jac`: whether to use forward- or reverse-mode autodifferentiation to compute the Jacobian. Can be either `"fwd"` or `"bwd"`. Defaults to `"fwd"`, which is usually more efficient. Changing this can be useful when the target function has a `jax.custom_vjp`, and so does not support forward-mode autodifferentiation.

##### `__init__(self, rtol: float, atol: float, norm: Callable[[PyTree], Array] = <function max_norm>, linear_solver: AbstractLinearSolver = QR(), verbose: frozenset[str] = frozenset())`

**Arguments:**

- `rtol`: Relative tolerance for terminating the solve.
- `atol`: Absolute tolerance for terminating the solve.
- `norm`: The norm used to determine the difference between two iterates in the convergence criteria. Should be any function `PyTree -> Scalar`. Optimistix includes three built-in norms: `optimistix.max_norm`, `optimistix.rms_norm`, and `optimistix.two_norm`.
- `linear_solver`: The linear solver used to solve the damped Newton step. Defaults to `lineax.QR`.
- `verbose`: Whether to print out extra information about how the solve is proceeding. Should be a frozenset of strings, specifying what information to print out. Valid entries are `step`, `loss`, `accepted`, `step_size`, `y`. For example `verbose=frozenset({"loss", "step_size"})`.
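The effect of the damping factor can be sketched as follows: the Gauss-Newton normal equations `J^T J dy = -J^T r` gain a `λ I` term, so `λ → 0` recovers Gauss-Newton while large `λ` gives a short, steepest-descent-like step. (An illustrative NumPy sketch only; the actual solver adapts `λ` per step and, per the signature above, solves via `lineax.QR` rather than forming `J^T J` explicitly.)

```python
import numpy as np

def lm_step(J, r, lam):
    # Solve the damped normal equations (J^T J + lam * I) dy = -J^T r.
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + lam * np.eye(n), -J.T @ r)

J = np.array([[1.0, 0.0], [0.0, 2.0]])
r = np.array([1.0, 1.0])
gn = lm_step(J, r, 0.0)       # undamped: the plain Gauss-Newton step
damped = lm_step(J, r, 10.0)  # damped: a strictly shorter step
```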

#### `optimistix.IndirectLevenbergMarquardt (AbstractGaussNewton)`

The Levenberg--Marquardt method as a true trust-region method.

This is a variant of `optimistix.LevenbergMarquardt`. The other algorithm works by updating the damping factor directly -- this version instead updates a trust region, and then fits the damping factor to the size of the trust region.

Generally speaking `optimistix.LevenbergMarquardt` is preferred, as it performs nearly the same algorithm, without the computational overhead of an extra (scalar) nonlinear solve.
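That extra scalar solve can be sketched as: given a trust-region radius, find the damping `λ` at which the damped step exactly hits the radius. This is a hypothetical bisection sketch in NumPy, for intuition only; the actual solver uses the configurable `root_finder` below (`optimistix.Newton` by default):

```python
import numpy as np

def step_norm(J, r, lam):
    # Length of the damped Levenberg--Marquardt step for a given lam.
    n = J.shape[1]
    dy = np.linalg.solve(J.T @ J + lam * np.eye(n), -J.T @ r)
    return np.linalg.norm(dy)

def fit_damping(J, r, radius, lo=0.0, hi=1e6):
    # The step norm decreases monotonically in lam, so bisect for the
    # lam at which the step lands on the trust-region boundary.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if step_norm(J, r, mid) > radius:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

J = np.eye(2)
r = np.array([3.0, 4.0])             # undamped step has norm 5
lam = fit_damping(J, r, radius=1.0)  # analytically, lam = 4 here
```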

Supports the following `options`:

- `jac`: whether to use forward- or reverse-mode autodifferentiation to compute the Jacobian. Can be either `"fwd"` or `"bwd"`. Defaults to `"fwd"`, which is usually more efficient. Changing this can be useful when the target function has a `jax.custom_vjp`, and so does not support forward-mode autodifferentiation.

##### `__init__(self, rtol: float, atol: float, norm: Callable[[PyTree], Array] = <function max_norm>, lambda_0: Union[Array, ndarray, numpy.bool_, numpy.number, bool, int, float, complex] = 1.0, linear_solver: AbstractLinearSolver = AutoLinearSolver(well_posed=False), root_finder: AbstractRootFinder = Newton(rtol=0.01,atol=0.01,norm=<function max_norm>,kappa=0.01,linear_solver=AutoLinearSolver(well_posed=None),cauchy_termination=True), verbose: frozenset[str] = frozenset())`

**Arguments:**

- `rtol`: Relative tolerance for terminating the solve.
- `atol`: Absolute tolerance for terminating the solve.
- `norm`: The norm used to determine the difference between two iterates in the convergence criteria. Should be any function `PyTree -> Scalar`. Optimistix includes three built-in norms: `optimistix.max_norm`, `optimistix.rms_norm`, and `optimistix.two_norm`.
- `lambda_0`: The initial value of the Levenberg--Marquardt parameter used in the root-find to hit the trust-region radius. If `IndirectLevenbergMarquardt` is failing, this value may need to be increased.
- `linear_solver`: The linear solver used to compute the Newton step.
- `root_finder`: The root finder used to find the Levenberg--Marquardt parameter which hits the trust-region radius.
- `verbose`: Whether to print out extra information about how the solve is proceeding. Should be a frozenset of strings, specifying what information to print out. Valid entries are `step`, `loss`, `accepted`, `step_size`, `y`. For example `verbose=frozenset({"loss", "step_size"})`.

#### `optimistix.Dogleg (AbstractGaussNewton)`

Dogleg algorithm. Used for nonlinear least squares problems.

Given a quadratic bowl that locally approximates the function to be minimised, there are two different directions in which we might try to move downhill: the steepest descent direction (as in gradient descent; this is also sometimes called the Cauchy direction), and the direction of the minimum of the quadratic bowl (as in Newton's method; correspondingly this is called the Newton direction).

The distinguishing feature of this algorithm is the "dog leg" shape of its descent path, in which it begins by moving in the steepest descent direction, and then switches to moving in the Newton direction.
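A single dogleg step on a quadratic model `g·p + ½ p·B·p` with trust-region radius `Δ` can be sketched as below. (An illustrative NumPy sketch, not Optimistix's implementation, which works with Jacobians and pytrees rather than an explicit Hessian matrix.)

```python
import numpy as np

def dogleg_step(g, B, radius):
    # Newton point: the unconstrained minimiser of the quadratic model.
    p_newton = -np.linalg.solve(B, g)
    if np.linalg.norm(p_newton) <= radius:
        return p_newton
    # Cauchy point: the model minimiser along the steepest-descent direction.
    p_cauchy = -((g @ g) / (g @ B @ g)) * g
    if np.linalg.norm(p_cauchy) >= radius:
        # Even the Cauchy point is outside: truncate steepest descent.
        return -(radius / np.linalg.norm(g)) * g
    # Otherwise follow the "dog leg": from the Cauchy point towards the
    # Newton point, stopping where the path crosses the trust-region boundary.
    d = p_newton - p_cauchy
    a, b = d @ d, 2.0 * (p_cauchy @ d)
    c = p_cauchy @ p_cauchy - radius**2
    t = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return p_cauchy + t * d
```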

Supports the following `options`:

- `jac`: whether to use forward- or reverse-mode autodifferentiation to compute the Jacobian. Can be either `"fwd"` or `"bwd"`. Defaults to `"fwd"`, which is usually more efficient. Changing this can be useful when the target function has a `jax.custom_vjp`, and so does not support forward-mode autodifferentiation.

##### `__init__(self, rtol: float, atol: float, norm: Callable[[PyTree], Array] = <function max_norm>, linear_solver: AbstractLinearSolver = AutoLinearSolver(well_posed=None), verbose: frozenset[str] = frozenset())`

**Arguments:**

- `rtol`: Relative tolerance for terminating the solve.
- `atol`: Absolute tolerance for terminating the solve.
- `norm`: The norm used to determine the difference between two iterates in the convergence criteria. Should be any function `PyTree -> Scalar`. Optimistix includes three built-in norms: `optimistix.max_norm`, `optimistix.rms_norm`, and `optimistix.two_norm`.
- `linear_solver`: The linear solver used to compute the Newton part of the dogleg step.
- `verbose`: Whether to print out extra information about how the solve is proceeding. Should be a frozenset of strings, specifying what information to print out. Valid entries are `step`, `loss`, `accepted`, `step_size`, `y`. For example `verbose=frozenset({"loss", "step_size"})`.

#### `optimistix.BestSoFarRootFinder (AbstractRootFinder)`

Wraps another root-finder, to return the best-so-far value. That is, it makes a copy of the best `y` seen, and returns that.

##### `__init__(self, solver: AbstractRootFinder[~Y, ~Out, tuple[~Out, ~Aux], Any])`

**Arguments:**

- `solver`: The root-finder solver to wrap.