copt.fmin_CondatVu

copt.fmin_CondatVu(fun, fun_deriv, g_prox, h_prox, L, x0, alpha=1.0, beta=1.0, tol=1e-12, max_iter=10000, verbose=0, callback=None, step_size_x=0.001, step_size_y=1000.0, max_iter_ls=20, g_prox_args=(), h_prox_args=())

Condat-Vu primal-dual splitting method.
This method solves optimization problems of the form

    minimize_x f(x) + alpha * g(x) + beta * h(L x)

where f is a smooth function and g and h are (possibly non-smooth) functions for which the proximal operator is known; h is composed with a linear operator L.
Parameters: - fun (callable) – fun(x) returns the value of f at x.
- fun_deriv (callable) – fun_deriv(x) returns the gradient of f at x.
- g_prox (callable of the form g_prox(x, alpha)) – Returns the proximal operator of g at x with parameter alpha.
- h_prox (callable of the form h_prox(x, alpha)) – Returns the proximal operator of h at x with parameter alpha.
- L (ndarray or sparse matrix) – Linear operator inside the h term.
- x0 (array-like) – Initial guess.
- max_iter (int) – Maximum number of iterations.
- verbose (int) – Verbosity level, from 0 (no output) to 2 (output on each iteration).
- callback (callable) – Optional callback function, called on each iteration.
Returns: res – The optimization result represented as a scipy.optimize.OptimizeResult object. Important attributes are: x, the solution array; success, a Boolean flag indicating if the optimizer exited successfully; and message, which describes the cause of the termination. See scipy.optimize.OptimizeResult for a description of other attributes.

Return type: OptimizeResult
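To illustrate the kind of update this solver performs, here is a minimal NumPy sketch of the Condat-Vu iteration: a forward-backward primal step using the prox of g, and a dual step that applies the prox of the conjugate h* via the Moreau decomposition. The helper names (condat_vu, prox_l1), the fixed step sizes, and the toy problem are illustrative assumptions, not part of copt's API.

```python
import numpy as np

def prox_l1(x, t):
    # Soft-thresholding: proximal operator of t * ||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def condat_vu(grad_f, prox_g, prox_h, L, x0, tau, sigma, max_iter=2000):
    """Sketch of the Condat-Vu iteration for minimize_x f(x) + g(x) + h(L x)."""
    x = x0.copy()
    y = np.zeros(L.shape[0])  # dual variable associated with L x
    for _ in range(max_iter):
        # Primal step: forward (gradient) + backward (prox of g).
        x_new = prox_g(x - tau * (grad_f(x) + L.T @ y), tau)
        # Dual step: prox of sigma * h^* via the Moreau decomposition,
        # prox_{sigma h*}(u) = u - sigma * prox_{h / sigma}(u / sigma).
        u = y + sigma * (L @ (2 * x_new - x))
        y = u - sigma * prox_h(u / sigma, 1.0 / sigma)
        x = x_new
    return x

# Toy problem: f(x) = 0.5 ||x - b||^2, g = 0, h = lam * ||.||_1, L = identity.
b = np.array([3.0, -0.5, 1.2, -2.0])
lam = 1.0
x = condat_vu(
    grad_f=lambda x: x - b,
    prox_g=lambda x, t: x,  # g = 0, so its prox is the identity
    prox_h=lambda x, t: prox_l1(x, lam * t),
    L=np.eye(4), x0=np.zeros(4), tau=0.5, sigma=0.5,
)
# x approaches the closed-form solution, soft-thresholding of b at lam.
```

The step sizes are chosen to satisfy the standard Condat-Vu condition 1/tau - sigma * ||L||^2 >= L_f / 2 (here L_f = ||L|| = 1), which guarantees convergence.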
References
Condat, Laurent. “A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms.” Journal of Optimization Theory and Applications (2013).
Chambolle, Antonin, and Thomas Pock. “On the ergodic convergence rates of a first-order primal-dual algorithm.” Mathematical Programming (2015).