Compatibility with jax.scipy.optimize.minimize
The JAX API available at `jax.scipy.optimize.minimize` is being deprecated, in favour of domain-specific packages like Optimistix. As such, Optimistix provides `optimistix.compat.minimize` as a drop-in replacement.
optimistix.compat.minimize(fun: Callable, x0: jax.Array, args: tuple = (), *, method: str, tol: float | None = None, options: Mapping[str, Any] | None = None) -> optimistix.compat.OptimizeResults
Minimization of a scalar function of one or more variables.
Info

This API is intended as a backward-compatibility drop-in for the now-deprecated `jax.scipy.optimize.minimize`. In line with that API, only `method="bfgs"` is supported.

Whilst it's the same basic algorithm, the Optimistix implementation may do slightly different things under the hood, so you may obtain slightly different (but still correct) results.
Arguments:
- `fun`: the objective function to be minimized, `fun(x, *args) -> float`, where `x` is a 1-D array with shape `(n,)` and `args` is a tuple of the fixed parameters needed to completely specify the function. `fun` must support differentiation.
- `x0`: initial guess. Array of real elements of size `(n,)`, where `n` is the number of independent variables.
- `args`: extra arguments passed to the objective function.
- `method`: solver type. Currently only `"bfgs"` is supported.
- `tol`: tolerance for termination.
- `options`: a dictionary of solver options. The following options are supported:
    - `maxiter` (int): maximum number of iterations to perform. Each iteration performs one function evaluation. Defaults to unlimited iterations.
    - `norm` (callable `x -> float`): the norm to use when calculating errors. Defaults to a max norm.
Returns:
An `optimistix.compat.OptimizeResults` object.
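As a minimal usage sketch (the quadratic objective, the starting point, and the option values below are illustrative, not part of the API):

```python
import jax.numpy as jnp
import optimistix.compat

# Illustrative objective: a scaled quadratic with its minimum at (1, 2).
def objective(x, scale):
    return scale * jnp.sum((x - jnp.array([1.0, 2.0])) ** 2)

x0 = jnp.zeros(2)
results = optimistix.compat.minimize(
    objective,
    x0,
    args=(3.0,),            # fixed parameters, forwarded as fun(x, *args)
    method="bfgs",          # the only supported method
    tol=1e-6,
    options={"maxiter": 100},
)
print(results.x)  # approximately [1. 2.]
```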
optimistix.compat.OptimizeResults(tuple)
NamedTuple holding optimization results.
Attributes:
- `x`: final solution.
- `success`: `True` if optimization succeeded.
- `status`: integer solver-specific return code. 0 means converged (nominal); 1 means the maximum number of BFGS iterations was reached; 3 means some other failure.
- `fun`: final function value.
- `jac`: final Jacobian array.
- `hess_inv`: final inverse Hessian estimate.
- `nfev`: integer number of function calls used.
- `njev`: integer number of gradient evaluations.
- `nit`: integer number of iterations of the optimization algorithm.
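A short sketch of inspecting these fields, continuing from the illustrative `results` in the example above:

```python
# Continuing from the illustrative `results` above.
if results.success:
    print(f"converged in {results.nit} iterations, f(x*) = {results.fun}")
else:
    # status 1: maximum BFGS iterations reached; status 3: other failure.
    print(f"failed with status {results.status} "
          f"({results.nfev} function calls, {results.njev} gradient evaluations)")
```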