Oregon State University

Colloquium: First-order Optimization Methods for Large-scale Problems


Monday, January 8, 2018 4:00 PM - 4:50 PM

Hoi-To Wai, Postdoctoral Fellow
Arizona State University

Large-scale problems, in which the problem dimension is high or the amount of data involved is huge, are frequently encountered in today's machine learning and information processing applications. This talk presents two recent first-order methods for large-scale problems under different settings.

In the first part, I will introduce a decentralized Frank-Wolfe (DeFW) algorithm that runs on a network of machines in a master-less setting. The algorithm is suited to scattered-data scenarios, where data are stored at agents/computers that cooperate to solve a common problem. The DeFW algorithm is "projection-free": it handles high-dimensional constraints without costly projection steps, resulting in lower runtime complexity than state-of-the-art projection-based methods. In terms of convergence rate, we show that the proposed method converges at least sublinearly, both for convex problems and for non-convex problems (to a stationary point), at a rate that depends on the connectivity of the underlying network. As an aside, I will also discuss a simple online Frank-Wolfe algorithm for stochastic optimization.
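To illustrate the projection-free idea, here is a minimal sketch of a classic (centralized) Frank-Wolfe iteration over an $\ell_1$-ball constraint; the decentralized consensus steps of DeFW are not reproduced here, and the function names and least-squares objective below are illustrative assumptions, not the talk's code.

```python
import numpy as np

def lmo_l1_ball(grad, radius):
    """Linear minimization oracle for the l1-ball {x : ||x||_1 <= radius}.

    Returns argmin over the ball of <grad, s>: a signed, scaled basis
    vector, i.e., a single-coordinate update -- no projection is computed.
    """
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -radius * np.sign(grad[i])
    return s

def frank_wolfe(grad_f, x0, radius, num_iters=100):
    """Frank-Wolfe with the standard step size gamma_k = 2 / (k + 2)."""
    x = x0.copy()
    for k in range(num_iters):
        s = lmo_l1_ball(grad_f(x), radius)    # cheap linear oracle call
        gamma = 2.0 / (k + 2.0)
        x = (1.0 - gamma) * x + gamma * s     # convex combination stays feasible
    return x

# Usage (hypothetical): sparse least squares,
# minimize ||A x - b||^2 / 2 subject to ||x||_1 <= 10.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 200)), rng.standard_normal(50)
x_hat = frank_wolfe(lambda x: A.T @ (A @ x - b), np.zeros(200), radius=10.0)
```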

In the second part, I will present a curvature-aided incremental aggregated gradient (CIAG) method for finite-sum optimization. This algorithm is suited to the "big-data" setting, where the objective function consists of $m$ component functions with $m \gg 1$. Such a problem is tackled with an incremental optimization scheme, i.e., by considering one component function at a time. Here, the novelty is to improve the gradient tracking performance with the aid of curvature (Hessian) information. We show that for strongly convex problems, the CIAG method converges linearly at a rate comparable to one iteration of the full gradient method, and is $m$ times faster than the IAG method. Note that IAG is the deterministic counterpart of the popular SAG method.
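One reading of the curvature-aided idea: for each component, store the last evaluation point together with its gradient and Hessian, then approximate the full gradient by Taylor-correcting the stale gradients. The sketch below follows that reading of the abstract; the step size and the ridge-regression components are illustrative assumptions, not the talk's actual method or constants.

```python
import numpy as np

def ciag(grads, hessians, theta0, step, num_epochs=50):
    """Sketch of a curvature-aided incremental aggregated gradient loop.

    Visits one component per iteration (cyclically); each stored stale
    gradient g_j is corrected by its Hessian via H_j @ (theta - theta_j).
    """
    m = len(grads)
    theta = theta0.copy()
    pts = [theta.copy() for _ in range(m)]        # last evaluation points
    g = [grads[j](theta) for j in range(m)]       # stored gradients
    H = [hessians[j](theta) for j in range(m)]    # stored Hessians
    for t in range(num_epochs * m):
        i = t % m                                 # component visited now
        pts[i], g[i], H[i] = theta.copy(), grads[i](theta), hessians[i](theta)
        # Curvature-corrected aggregate of all m (possibly stale) gradients.
        agg = sum(g[j] + H[j] @ (theta - pts[j]) for j in range(m))
        theta = theta - step * agg
    return theta

# Usage (hypothetical): ridge regression split into m component functions,
# f_i(theta) = (a_i' theta - b_i)^2 / 2 + (lam / (2 m)) ||theta||^2.
rng = np.random.default_rng(1)
m, d, lam = 20, 5, 0.1
A, b = rng.standard_normal((m, d)), rng.standard_normal(m)
grads = [lambda th, a=A[i], bi=b[i]: a * (a @ th - bi) + (lam / m) * th
         for i in range(m)]
hessians = [lambda th, a=A[i]: np.outer(a, a) + (lam / m) * np.eye(d)
            for i in range(m)]
theta_hat = ciag(grads, hessians, np.zeros(d), step=1e-3)
```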

The efficacy of both methods is supported by a number of numerical experiments on synthetic and real data. This is joint work with Jean Lafond (Télécom ParisTech), Eric Moulines (École Polytechnique), Wei Shi, Angelia Nedich, and Anna Scaglione (ASU).

Additional information: http://eecs.oregonstate.edu/colloquium-series 


Location: Linus Pauling Science Center, Room 125
Phone: 1-541-737-3617
School of Electrical Engineering and Computer Science