copt.minimize_APGD

copt.minimize_APGD(f_grad, x0, g_prox=None, tol=1e-06, max_iter=500, verbose=0, callback=None, line_search=True, step_size=None, max_iter_backtracking=100, backtracking_factor=0.6)

Accelerated proximal gradient descent, following the scheme of Beck and Teboulle (see References).

Solves problems of the form

minimize_x f(x) + alpha g(x)

where we have access to the gradient of f and to the proximal operator of g.
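
For concreteness, here is a minimal usage sketch on a lasso-type problem, minimize_x 0.5 ||Ax - b||^2 + ||x||_1. It assumes copt's convention that f_grad(x) returns the pair (value, gradient), and follows the g_prox contract described below; soft_threshold is a helper defined here for illustration, not part of copt.

    import numpy as np
    import copt

    # Synthetic least-squares data for f(x) = 0.5 * ||A x - b||^2.
    rng = np.random.RandomState(0)
    A = rng.randn(50, 20)
    b = rng.randn(50)

    def f_grad(x):
        # Assumed convention: return both the loss value and its gradient.
        residual = A.dot(x) - b
        return 0.5 * residual.dot(residual), A.T.dot(residual)

    def soft_threshold(x, alpha):
        # Proximal operator of the L1 norm at x with parameter alpha.
        return np.sign(x) * np.maximum(np.abs(x) - alpha, 0.0)

    x0 = np.zeros(A.shape[1])
    res = copt.minimize_APGD(f_grad, x0, g_prox=soft_threshold)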

Parameters:
  • f_grad – callable. Differentiable loss function; f_grad(x) returns the function value together with its gradient at x.
  • x0 – array-like. Initial guess.
  • g_prox – callable, optional. Proximable penalty; g_prox(x, alpha) returns the proximal operator of g at x with parameter alpha (illustrated in the example above).
  • tol – float. Tolerance of the stopping criterion.
  • max_iter – int. Maximum number of iterations.
  • verbose – int. Verbosity level, from 0 (no output) to 2 (output on each iteration).
  • callback – callable, optional. Called once per iteration; see the sketch after this list.
  • line_search – boolean. Whether to perform backtracking line search or not.
  • step_size – float. Starting value for the line-search procedure.
  • max_iter_backtracking – int. Maximum number of backtracking steps per line search.
  • backtracking_factor – float. Factor in (0, 1) by which the step size is multiplied when backtracking.
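
The docstring leaves the callback's argument unspecified; the sketch below assumes it is invoked with the current iterate, an assumption worth verifying against the installed copt version. It reuses f_grad, soft_threshold, and x0 from the example above to record the objective value at each iteration:

    import numpy as np

    objective_trace = []

    def callback(x):
        # Assumed signature: called once per iteration with the current iterate x.
        fval, _ = f_grad(x)
        objective_trace.append(fval + np.abs(x).sum())

    res = copt.minimize_APGD(f_grad, x0, g_prox=soft_threshold, callback=callback)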
Returns:

res – The optimization result, represented as a scipy.optimize.OptimizeResult object. Important attributes are: x, the solution array; success, a Boolean flag indicating if the optimizer exited successfully; and message, which describes the cause of the termination. See scipy.optimize.OptimizeResult for a description of other attributes.

Return type:

scipy.optimize.OptimizeResult
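
Continuing the sketch above, the attributes named here can be read directly off the result:

    print(res.success)   # Boolean: did the optimizer exit successfully?
    print(res.message)   # cause of termination
    print(res.x)         # solution array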

References

Amir Beck and Marc Teboulle. "Gradient-based algorithms with applications to signal recovery." Convex Optimization in Signal Processing and Communications (2009).