copt.minimize_PGD

copt.minimize_PGD(f_grad, x0, g_prox=None, tol=1e-06, max_iter=500, verbose=0, callback=None, line_search=True, step_size=None, max_iter_backtracking=1000, backtracking_factor=0.6)

Proximal gradient descent.
Solves problems of the form

    minimize_x f(x) + g(x)

where we have access to the gradient of f and to the proximal operator of g.
Parameters: - f_grad – callable. Returns the function value and gradient of the objective function (see the sketch after this list).
- g_prox – callable, optional. Proximal operator of the penalty term g.
- x0 – array-like. Initial guess.
- tol – float. Tolerance of the stopping criterion.
- max_iter – int. Maximum number of iterations.
- line_search – boolean. Whether to perform a backtracking line search.
- max_iter_backtracking – int. Maximum number of step-size trials within a single backtracking line search.
- backtracking_factor – float. Factor by which the step size is shrunk at each backtracking trial.
- step_size – float. Starting value for the line-search procedure.
- verbose – int. Verbosity level, from 0 (no output) to 2 (output on each iteration).
- callback – callable, optional. Function called on each iteration.
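For concreteness, here is a minimal sketch of these two callables for a lasso objective, f(x) = 0.5 * ||Ax - b||^2 with g(x) = alpha * ||x||_1. The data A, b and the strength alpha are illustrative assumptions, and the (x, step_size) prox signature is a common convention that should be checked against the copt source:

    import numpy as np

    # Illustrative problem data (assumptions for this sketch, not part of copt).
    rng = np.random.RandomState(0)
    A = rng.randn(20, 10)
    b = rng.randn(20)
    alpha = 0.1

    def f_grad(x):
        # f(x) = 0.5 * ||Ax - b||^2; returns (function value, gradient).
        residual = A.dot(x) - b
        return 0.5 * residual.dot(residual), A.T.dot(residual)

    def g_prox(x, step_size):
        # Prox of g(x) = alpha * ||x||_1, i.e. soft thresholding;
        # assumed to be called with the current point and the step size.
        return np.sign(x) * np.maximum(np.abs(x) - alpha * step_size, 0.0)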
Returns: - res – The optimization result represented as a scipy.optimize.OptimizeResult object. Important attributes are: x, the solution array; success, a Boolean flag indicating if the optimizer exited successfully; and message, which describes the cause of the termination. See scipy.optimize.OptimizeResult for a description of other attributes.
Return type: scipy.optimize.OptimizeResult
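A minimal usage sketch, reusing the f_grad, g_prox and problem data defined above (so it is not standalone); attribute names follow scipy.optimize.OptimizeResult:

    import numpy as np
    import copt

    x0 = np.zeros(A.shape[1])  # start from the origin
    res = copt.minimize_PGD(f_grad, x0, g_prox=g_prox, tol=1e-6)
    print(res.success)  # True if the stopping criterion was met
    print(res.x)        # the estimated minimizer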
References
Beck, Amir, and Marc Teboulle. "Gradient-based algorithms with applications to signal recovery." Convex Optimization in Signal Processing and Communications (2009).