copt.fmin_PGD

copt.fmin_PGD(f, g=None, x0=None, tol=1e-12, max_iter=100, verbose=0, callback=None, backtracking=True, step_size=None, max_iter_backtracking=100, backtracking_factor=0.6, trace=False)

Proximal gradient descent.
Solves problems of the form
    minimize_x f(x) + g(x)

where we have access to the gradient of f and to the proximal operator of g.
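Each iteration takes a gradient step on f and then applies the proximal operator of g; this is the standard proximal gradient update, stated here for context rather than quoted from the docstring:

    x_{k+1} = prox_{s_k * g}(x_k - s_k * grad f(x_k))

where s_k > 0 is the step size, either fixed or found by backtracking line search depending on the backtracking option.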
Parameters:
- f – callable. Loss function (smooth).
- g – callable, optional. Penalty term (proximable).
- x0 – array-like, optional. Initial guess.
- backtracking – bool. Whether to perform backtracking (i.e. line search) or not; a sketch of the test involved follows this list.
- max_iter – int. Maximum number of iterations.
- verbose – int. Verbosity level, from 0 (no output) to 2 (output on each iteration).
- step_size – float. Starting value for the line-search procedure.
- callback – callable, optional. Callback function.
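To illustrate how backtracking, step_size, max_iter_backtracking and backtracking_factor fit together, here is the usual sufficient-decrease test for proximal gradient methods (Beck and Teboulle): shrink the step until a quadratic upper bound on f holds. This is a sketch of the standard test, not necessarily copt's exact internal criterion:

    def backtracking_step(f, grad_f, prox_g, x, step_size,
                          max_iter_backtracking=100, backtracking_factor=0.6):
        # Sketch of the standard sufficient-decrease test for proximal
        # gradient methods; copt's internals are assumed, not verified,
        # to follow this pattern.
        fx = f(x)
        gx = grad_f(x)
        for _ in range(max_iter_backtracking):
            x_next = prox_g(x - step_size * gx, step_size)
            diff = x_next - x
            # Accept step_size once the quadratic upper bound on f holds.
            if f(x_next) <= fx + gx.dot(diff) + diff.dot(diff) / (2.0 * step_size):
                break
            step_size *= backtracking_factor  # shrink the step and retry
        return x_next, step_size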
Returns: res – The optimization result, represented as a scipy.optimize.OptimizeResult object. Important attributes are: x, the solution array; success, a Boolean flag indicating whether the optimizer exited successfully; and message, which describes the cause of the termination. See scipy.optimize.OptimizeResult for a description of other attributes.

Return type: OptimizeResult
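A minimal end-to-end sketch. It assumes f exposes __call__ and .gradient and g exposes .prox(x, step_size), matching the "smooth" and "proximable" descriptions above; the SquaredLoss and L1Norm classes below are written for illustration, and the exact object interface copt expects may differ between versions:

    import numpy as np
    import copt

    class SquaredLoss:
        # f(x) = 0.5 * ||A x - b||^2, a smooth loss (hypothetical helper,
        # not part of the documented API).
        def __init__(self, A, b):
            self.A, self.b = A, b
        def __call__(self, x):
            r = self.A.dot(x) - self.b
            return 0.5 * r.dot(r)
        def gradient(self, x):
            return self.A.T.dot(self.A.dot(x) - self.b)

    class L1Norm:
        # g(x) = alpha * ||x||_1; its proximal operator is soft-thresholding.
        def __init__(self, alpha):
            self.alpha = alpha
        def __call__(self, x):
            return self.alpha * np.abs(x).sum()
        def prox(self, x, step_size):
            return np.sign(x) * np.maximum(np.abs(x) - self.alpha * step_size, 0.0)

    rng = np.random.RandomState(0)
    A, b = rng.randn(40, 10), rng.randn(40)

    res = copt.fmin_PGD(SquaredLoss(A, b), L1Norm(0.1), x0=np.zeros(10))
    print(res.success, res.message)
    print(res.x)  # the solution array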
References
Beck, Amir, and Marc Teboulle. “Gradient-based algorithms with applications to signal recovery.” Convex Optimization in Signal Processing and Communications (2009).