# Numerical optimizers for Logistic Regression

In this post I compare several implementations of Logistic Regression. The task was to implement a Logistic Regression model using standard optimization …

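The excerpt is truncated here, so as a minimal sketch of the kind of comparison it describes: fitting an L2-regularized logistic loss with two standard solvers from `scipy.optimize.minimize`. The data, the regularization strength, and all variable names below are illustrative assumptions, not taken from the original post.

```python
import numpy as np
from scipy import optimize

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
# Labels in {-1, +1} from a noisy linear model
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))

alpha = 1.0  # L2 penalty keeps the problem strongly convex

def loss(w):
    # sum_i log(1 + exp(-y_i * x_i @ w)) + alpha/2 * ||w||^2
    return np.logaddexp(0, -y * (X @ w)).sum() + 0.5 * alpha * w @ w

def grad(w):
    s = -y / (1.0 + np.exp(y * (X @ w)))
    return X.T @ s + alpha * w

res_lbfgs = optimize.minimize(loss, np.zeros(5), jac=grad, method="L-BFGS-B")
res_ncg = optimize.minimize(loss, np.zeros(5), jac=grad, method="Newton-CG")
```

Since the objective is strongly convex, both solvers should reach the same minimizer, which makes iteration counts and wall-clock time the interesting quantities to compare.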
SciPy contains two methods to compute the singular value decomposition (SVD) of a matrix: `scipy.linalg.svd` and `scipy.sparse.linalg.svds`. In this post I'll compare both methods for the task of computing the full SVD of a large dense matrix.

The first method, `scipy.linalg.svd`, is perhaps …
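To fix notation before the comparison, a quick sketch of how the two calls differ: `scipy.linalg.svd` (LAPACK) returns all singular values, while `scipy.sparse.linalg.svds` (ARPACK) returns only the `k` requested ones. The matrix sizes below are arbitrary choices for illustration.

```python
import numpy as np
from scipy import linalg
from scipy.sparse.linalg import svds

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))

# Full (thin) SVD via LAPACK: all 50 singular values, descending order
U, s, Vt = linalg.svd(A, full_matrices=False)

# Truncated SVD via ARPACK: only the k largest singular values
k = 5
Uk, sk, Vtk = svds(A, k=k)

# svds returns its singular values in ascending order
assert np.allclose(np.sort(sk), s[:k][::-1])
```

Note that `svds` cannot compute a *full* SVD (`k` must be strictly smaller than the smallest matrix dimension), which already constrains how the two methods can be benchmarked against each other.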

In SciPy's development version there's a new function closely related to the QR decomposition of a matrix and to the least-squares solution of a linear system. This function computes the QR decomposition of a matrix and then multiplies the resulting orthogonal factor by another arbitrary matrix. In pseudocode …
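The post doesn't name the function, but the description matches what later shipped as `scipy.linalg.qr_multiply`. A sketch, assuming that API, contrasted with the explicit two-step version:

```python
import numpy as np
from scipy import linalg

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
c = rng.standard_normal(6)

# Two-step version: materialize the orthogonal factor Q, then multiply
Q, R = linalg.qr(A, mode="economic")
qt_c = Q.T @ c

# Fused version: applies the Householder reflectors directly to c,
# never forming Q explicitly (mode="right" computes c @ Q)
cq, R2 = linalg.qr_multiply(A, c, mode="right")

# For a 1-D c, c @ Q coincides with Q.T @ c
assert np.allclose(cq, qt_c)
```

Avoiding the explicit `Q` is exactly what makes this useful for least squares, where only `Q.T @ b` and `R` are needed.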

Ridge coefficients for multiple values of the regularization parameter
can be elegantly computed by updating the *thin* SVD decomposition of
the design matrix:

```
import numpy as np
from scipy import linalg

def ridge(A, b, alphas):
    """Return coefficients for regularized least squares

        min ||A x - b||^2 + alpha ||x||^2

    for each value of alpha in alphas, reusing one thin SVD of A.
    """
    U, s, Vt = linalg.svd(A, full_matrices=False)
    d = U.T @ b
    return np.array([Vt.T @ (d * s / (s ** 2 + alpha)) for alpha in alphas])
```
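As a sanity check on the SVD route, the coefficients should match solving the regularized normal equations directly. A self-contained sketch (the SVD expression is restated inline; the test matrix and the value of alpha are arbitrary choices):

```python
import numpy as np
from scipy import linalg

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 5))
b = rng.standard_normal(30)
alpha = 0.5

# SVD route: x = V diag(s / (s^2 + alpha)) U^T b
U, s, Vt = linalg.svd(A, full_matrices=False)
x_svd = Vt.T @ ((U.T @ b) * s / (s ** 2 + alpha))

# Direct route: solve (A^T A + alpha I) x = A^T b
x_direct = linalg.solve(A.T @ A + alpha * np.eye(5), A.T @ b)

assert np.allclose(x_svd, x_direct)
```

The payoff of the SVD route is that the factorization is computed once, after which each additional alpha costs only matrix-vector products.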

**Update: a fast and stable norm was added to scipy.linalg in August
2011 and will be available in scipy 0.10.** Last week I discussed with
Gael how we should compute the Euclidean norm of a vector using
SciPy. Two approaches suggest themselves: either calling
scipy.linalg.norm …
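The excerpt cuts off before naming the second approach; assuming it is the naive `sqrt(dot(a, a))`, a sketch of why stability matters here. The BLAS `nrm2` routine behind the updated `scipy.linalg.norm` accumulates with scaling, so it survives entries whose squares overflow a double:

```python
import numpy as np
from scipy import linalg

a = np.array([3.0, 4.0])
assert np.isclose(linalg.norm(a), 5.0)

# Squaring 1e200 overflows double precision (max ~1.8e308)
big = np.array([1e200, 1e200])
naive = np.sqrt(np.dot(big, big))  # dot product overflows to inf
stable = linalg.norm(big)          # scaled accumulation stays finite
assert np.isinf(naive)
assert np.isfinite(stable)
```

The trade-off the post alludes to is that the naive version is often faster on well-scaled data, while the BLAS-backed norm is robust at the extremes.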