Thursday 07 Jun 2012

Sparse prediction, matrix completion and first-order optimization

Dr. Andreas Argyriou - Toyota Technological Institute at Chicago

Harrison 170, 15:00-16:00

We derive a novel norm, the k-support norm, which corresponds
to the tightest convex relaxation of sparsity combined with
an L2 penalty. We show that this new norm provides a tighter
relaxation for support recovery than the elastic net and
suggest using it as a replacement for the Lasso or the elastic
net in sparse prediction problems. We also present a first-order
algorithm for learning with this norm.
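
For reference, a brief sketch of the standard formulation of the k-support norm (our notation, not text from the announcement, and assuming the definition used in the related work of Argyriou, Foygel and Srebro):

\[
  \|w\|_k^{sp} \;=\; \min\Big\{ \sum_{I \in \mathcal{G}_k} \|v_I\|_2 \;:\; \mathrm{supp}(v_I) \subseteq I,\ \sum_{I \in \mathcal{G}_k} v_I = w \Big\},
\]

where \(\mathcal{G}_k\) is the collection of subsets of \(\{1, \dots, d\}\) of cardinality at most \(k\). Equivalently, its unit ball is \(\mathrm{conv}\{ w \in \mathbb{R}^d : \|w\|_0 \le k,\ \|w\|_2 \le 1 \}\), which is the sense in which it is the tightest convex relaxation of sparsity combined with an L2 penalty.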

In addition, we propose a general-purpose algorithm for convex
optimization problems involving Lipschitz and nonsmooth functions.
This method requires only first-order computations, such as gradients,
and hence can scale well to large data sets. We demonstrate its
efficiency and wide applicability on problems such as compressed sensing,
matrix completion and robust PCA.
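
To make the flavour of such first-order computations concrete, here is a minimal generic sketch of a proximal-gradient iteration for an L1-regularized least-squares problem (as in compressed sensing). It is only an illustration of gradient-plus-proximal-step methods, not the specific algorithm presented in the talk, and all names and parameters are illustrative.

# Generic proximal-gradient sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
# Illustrative only; NOT the algorithm from the talk.
import numpy as np

def soft_threshold(x, t):
    """Proximity operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, step, iters=500):
    """Fixed-step proximal gradient (ISTA-style) iteration.

    A, b : data, e.g. a measurement matrix and observations
    lam  : regularization weight
    step : step size, e.g. 1 / ||A||_2^2 (reciprocal Lipschitz constant of the gradient)
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                         # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)  # proximal step on the nonsmooth part
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 200))
    x_true = np.zeros(200)
    x_true[:5] = rng.standard_normal(5)            # 5-sparse signal
    b = A @ x_true
    step = 1.0 / np.linalg.norm(A, 2) ** 2         # 1 / Lipschitz constant of the gradient
    x_hat = proximal_gradient(A, b, lam=0.1, step=step)
    print("support recovered:", np.nonzero(np.abs(x_hat) > 1e-3)[0])

Each iteration uses only a gradient evaluation and a cheap proximity operator, which is what allows such methods to scale to large data sets.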
