Compatibility with jax.scipy.optimize.minimize

The JAX API available at jax.scipy.optimize.minimize is being deprecated in favour of domain-specific packages like Optimistix. As such, Optimistix provides optimistix.compat.minimize as a drop-in replacement.

optimistix.compat.minimize(fun: Callable, x0: Array, args: tuple = (), *, method: str, tol: Optional[float] = None, options: Optional[Mapping[str, Any]] = None) -> OptimizeResults

Minimization of scalar function of one or more variables.

Info

This API is intended as a backward-compatibility drop-in for the now-deprecated jax.scipy.optimize.minimize. In line with that API, only method="bfgs" is supported.

Whilst it's the same basic algorithm, the Optimistix implementation may do slightly different things under the hood. You may obtain slightly different (but still correct) results.

Arguments:

  • fun: the objective function to be minimized, fun(x, *args) -> float, where x is a 1-D array with shape (n,) and args is a tuple of the fixed parameters needed to completely specify the function. fun must support differentiation.
  • x0: initial guess. Array of real elements of size (n,), where n is the number of independent variables.
  • args: extra arguments passed to the objective function.
  • method: solver type. Currently only "bfgs" is supported.
  • tol: tolerance for termination.
  • options: a dictionary of solver options. The following options are supported:
    • maxiter (int): Maximum number of iterations to perform. Each iteration performs one function evaluation. Defaults to unlimited iterations.
  • norm (callable x -> float): the norm to use when calculating errors. Defaults to a max norm.

Returns:

An optimistix.compat.OptimizeResults object.


optimistix.compat.OptimizeResults

Object holding optimization results.

Attributes:

  • x: final solution.
  • success: True if optimization succeeded.
  • status: integer solver-specific return code. 0 means converged (nominal), 1 means the maximum number of BFGS iterations was reached, 3 means some other failure.
  • fun: final function value.
  • jac: final Jacobian array.
  • hess_inv: final inverse Hessian estimate.
  • nfev: integer number of function calls used.
  • njev: integer number of gradient evaluations.
  • nit: integer number of iterations of the optimization algorithm.