Daniel P. Robinson

Effectively Using Negative Curvature in Optimization and Scalable Methods for Subspace Clustering

Abstract:
This talk will discuss two recent research topics. The first concerns the role played by negative curvature in the design of optimization algorithms. The recent surge of interest in nonconvex modeling formulations (e.g., in deep learning, subspace clustering, and dictionary learning) underscores the need for a fresh look at nonconvex optimization algorithms with provable convergence guarantees. A major factor in the design of such methods is how negative curvature is handled. I present recent work that supports the following claims: (i) commonly employed strategies for using negative curvature directions usually hurt algorithm performance; (ii) a new strategy based on upper-bounding models allows directions of negative curvature to be used while improving performance in the deterministic setting; and (iii) the upper-bounding-model strategy is readily adapted to stochastic optimization, making it an attractive approach for large-scale “big data” problems.

In the second half of the talk, I will discuss recent work on developing scalable optimization algorithms for the unsupervised task of clustering data drawn from a union of subspaces. Data with such union-of-subspaces structure arises in various applications, including face and image clustering, motion segmentation, hybrid system identification, and multiple-view geometry.
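To make claim (ii) concrete, here is a minimal sketch of one way an upper-bounding model can set the step size along a negative-curvature direction: the step length minimizes a cubic model that bounds the objective from above. This is an illustrative toy (the function `neg_curvature_step`, the regularization weight `M`, and the quadratic test problem are assumptions for the example, not the speaker's exact algorithm):

```python
import numpy as np

def neg_curvature_step(g, H, M=1.0):
    """If H has negative curvature, return a descent-oriented eigenvector d
    and the step size minimizing the cubic upper-bounding model
    m(a) = f(x) + a*g'd + (a^2/2)*d'Hd + (M/6)*a^3*||d||^3."""
    eigvals, eigvecs = np.linalg.eigh(H)
    if eigvals[0] >= 0:
        return None, 0.0              # no negative curvature available
    d = eigvecs[:, 0]                 # eigenvector of most negative eigenvalue
    if g @ d > 0:                     # orient d so it is a descent direction
        d = -d
    gd, dHd, dn3 = g @ d, d @ H @ d, np.linalg.norm(d) ** 3
    # Positive root of m'(a) = gd + a*dHd + (M/2)*a^2*dn3 = 0
    a = (-dHd + np.sqrt(dHd**2 - 2.0 * M * dn3 * gd)) / (M * dn3)
    return d, a

# Toy saddle: f(x) = 0.5 x'Hx has one direction of negative curvature.
H = np.array([[2.0, 0.0], [0.0, -1.0]])
x = np.array([1.0, 0.5])
f = lambda z: 0.5 * z @ H @ z
d, a = neg_curvature_step(H @ x, H)
assert f(x + a * d) < f(x)            # the step strictly decreases f
```

Unlike a plain line search, the cubic term keeps the step finite even though the quadratic model is unbounded below along `d`.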
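For readers unfamiliar with the union-of-subspaces setting, the following toy sketch clusters points drawn from two lines (1-D subspaces) in R^3 by spectral clustering on a simple angle-based affinity. This is only an assumed illustration of the data model, not the scalable algorithms of the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
# Sample 5 points from each of two 1-D subspaces (lines) of R^3.
b1 = np.array([1.0, 0.0, 0.0])
b2 = np.array([0.1, 1.0, 0.0]); b2 /= np.linalg.norm(b2)
coeffs = lambda: rng.uniform(0.5, 2.0, 5) * rng.choice([-1.0, 1.0], 5)
X = np.hstack([np.outer(b1, coeffs()), np.outer(b2, coeffs())])

X = X / np.linalg.norm(X, axis=0)      # normalize each data point
A = np.abs(X.T @ X)                    # angle-based affinity matrix
L = np.diag(A.sum(axis=1)) - A         # graph Laplacian of the affinity
_, vecs = np.linalg.eigh(L)
labels = (vecs[:, 1] > 0).astype(int)  # sign of the Fiedler vector splits the clusters

assert labels[0] != labels[5]          # the two subspaces are separated
```

Points on the same line are (anti)parallel after normalization, so within-subspace affinities dominate and the Fiedler vector of the Laplacian recovers the two groups.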