We propose a new randomized algorithm for solving L2-regularized least-squares
problems based on sketching. We consider two of the most popular random embeddings,
namely, Gaussian embeddings and the Subsampled Randomized Hadamard
Transform (SRHT). While current randomized solvers for least-squares optimization
prescribe an embedding dimension at least as large as the data dimension, we
show that the embedding dimension can be reduced to the effective dimension of
the optimization problem while still preserving high-probability convergence guarantees.
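For context, a standard convention (assumed here; the paper's exact definition is not reproduced in this abstract) defines the effective dimension of the ridge problem as follows:

```latex
% Effective dimension of  min_x ||Ax - b||^2 + \lambda ||x||^2,
% where \sigma_1 \ge \cdots \ge \sigma_d are the singular values of A:
d_{\mathrm{eff}}(\lambda)
  = \operatorname{tr}\!\left( A^\top A \left( A^\top A + \lambda I \right)^{-1} \right)
  = \sum_{i=1}^{d} \frac{\sigma_i^2}{\sigma_i^2 + \lambda} \;\le\; d .
```

Under this convention, d_eff(lambda) can be far smaller than the data dimension d when the spectrum of A decays quickly or the regularization lambda is large.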
In this regard, we derive sharp matrix deviation inequalities over ellipsoids for
both Gaussian and SRHT embeddings. Specifically, we improve on the constant
of a classical Gaussian concentration bound, whereas for SRHT embeddings our
deviation inequality involves a novel technical approach. Leveraging these bounds,
we design a practical and adaptive algorithm that does not require knowing
the effective dimension beforehand. Our method starts with an initial
embedding dimension equal to 1 and, over the iterations, increases the embedding
dimension up to at most the effective dimension. Hence, our algorithm improves on the
state-of-the-art computational complexity for solving regularized least-squares
problems. Further, we show numerically that it outperforms standard iterative
solvers such as the conjugate gradient method and its preconditioned variant on
several standard machine learning datasets.
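As an illustration of the adaptive scheme described above, here is a minimal, hypothetical Python sketch of a sketch-and-precondition loop for ridge regression: a Gaussian embedding whose dimension starts at 1 and doubles whenever progress stalls. The function name, the stall test (gradient-norm decrease), and the doubling rule are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def adaptive_sketched_ridge(A, b, lam, tol=1e-8, max_iter=500, seed=0):
    """Illustrative sketch (not the paper's implementation) of an adaptive
    sketch-and-precondition loop for min_x ||Ax - b||^2 + lam * ||x||^2.
    The Gaussian embedding dimension starts at 1 and doubles whenever
    progress stalls; the stall test and doubling rule are assumptions."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    m = 1  # initial embedding dimension

    def sketch_factor(m):
        # Gaussian embedding S in R^{m x n}; the sketched regularized
        # Hessian (SA)^T (SA) + lam * I serves as a preconditioner.
        S = rng.standard_normal((m, n)) / np.sqrt(m)
        SA = S @ A
        return np.linalg.cholesky(SA.T @ SA + lam * np.eye(d))

    L = sketch_factor(m)
    prev_gnorm = np.inf
    for _ in range(max_iter):
        g = A.T @ (A @ x - b) + lam * x  # exact gradient of the objective
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        if gnorm > 0.5 * prev_gnorm:  # slow progress: grow the embedding
            m = min(2 * m, n)
            L = sketch_factor(m)
        # Preconditioned step: solve (sketched Hessian) p = g via Cholesky.
        p = np.linalg.solve(L.T, np.linalg.solve(L, g))
        x -= p
        prev_gnorm = gnorm
    return x, m

# Usage sketch on synthetic data (shapes and lam chosen arbitrarily):
rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 100))
b = rng.standard_normal(2000)
x_hat, m_final = adaptive_sketched_ridge(A, b, lam=10.0)
```

Using a Cholesky factor of the sketched regularized Hessian as a preconditioner mirrors the sketch-and-precondition paradigm; the intended behavior is that once m grows large enough relative to the effective dimension, the preconditioned iteration makes rapid progress, which is the regime the abstract's guarantees concern.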