class copt.utils.LogLoss(A, b, alpha=0.0)[source]

Logistic loss

\[-\frac{1}{n}\sum_{i=1}^n \left[ b_i \log(\sigma(a_i^T x)) + (1 - b_i) \log(1 - \sigma(a_i^T x)) \right]\]

where \(\sigma\) is the sigmoid function \(\sigma(t) = 1/(1 + e^{-t})\).

For a numerically stable computation of the logistic loss, we use the identities

\[\log(\sigma(t)) = \begin{cases} -\log(1 + e^{-t}) & \text{if } t \geq 0\\ t - \log(1 + e^{t}) & \text{otherwise}\end{cases}\]

\[\log(1 - \sigma(t)) = \begin{cases} -t - \log(1 + e^{-t}) & \text{if } t \geq 0\\ -\log(1 + e^{t}) & \text{otherwise}\end{cases}\]
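The case splits above can be sketched in plain Python (the function names `log_sigmoid` and `log_one_minus_sigmoid` are illustrative, not part of copt's API); the point is that `exp` is only ever called on a non-positive argument, so it never overflows:

```python
import math

def log_sigmoid(t):
    # log(sigma(t)): -log(1 + e^{-t}) for t >= 0, t - log(1 + e^{t}) otherwise.
    # math.log1p computes log(1 + x) accurately for small x.
    if t >= 0:
        return -math.log1p(math.exp(-t))
    return t - math.log1p(math.exp(t))

def log_one_minus_sigmoid(t):
    # log(1 - sigma(t)): -t - log(1 + e^{-t}) for t >= 0, -log(1 + e^{t}) otherwise.
    if t >= 0:
        return -t - math.log1p(math.exp(-t))
    return -math.log1p(math.exp(t))
```

A naive `math.log(1.0 / (1.0 + math.exp(-t)))` overflows for large negative `t` and returns exactly 0.0 for large positive `t`, whereas these branches stay finite and accurate at both extremes.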
__init__(A, b, alpha=0.0)[source]

Initialize self with the data matrix A (rows \(a_i^T\)), the label vector b (entries \(b_i \in \{0, 1\}\)), and alpha, the amount of ℓ2 regularization (default 0.0, i.e. no regularization).


__init__(A, b[, alpha]) Initialize self.
func_grad(x[, return_gradient]) Return the loss value at x, and the gradient when return_gradient is true.
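As a rough guide to what a value-and-gradient evaluation computes, here is a minimal pure-Python sketch (the helper `log_loss_func_grad` is hypothetical, not copt's implementation, and it assumes the regularizer takes the form \(\tfrac{\alpha}{2}\|x\|^2\)):

```python
import math

def log_loss_func_grad(A, b, x, alpha=0.0):
    # Hypothetical sketch: loss value and gradient of the logistic loss,
    # assuming an (alpha / 2) * ||x||^2 penalty.
    n = len(A)
    # z = A @ x, computed row by row
    z = [sum(aij * xj for aij, xj in zip(ai, x)) for ai in A]

    def log_sig(t):
        # numerically stable log(sigma(t)), as in the case split above
        return -math.log1p(math.exp(-t)) if t >= 0 else t - math.log1p(math.exp(t))

    # log(1 - sigma(t)) == log_sig(t) - t, so both terms reuse log_sig
    loss = -sum(bi * log_sig(zi) + (1 - bi) * (log_sig(zi) - zi)
                for bi, zi in zip(b, z)) / n
    loss += 0.5 * alpha * sum(xj * xj for xj in x)

    # gradient: (1/n) * A^T (sigma(z) - b) + alpha * x
    resid = [1.0 / (1.0 + math.exp(-zi)) - bi for bi, zi in zip(b, z)]
    grad = [sum(ri * ai[j] for ri, ai in zip(resid, A)) / n + alpha * xj
            for j, xj in enumerate(x)]
    return loss, grad
```

At `x = 0` every \(\sigma(a_i^T x)\) equals 1/2, so the loss is \(\log 2\) regardless of the labels, which makes a convenient sanity check.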