
A Distributed, Asynchronous, and Incremental Algorithm for Nonconvex Optimization: An ADMM Approach



Abstract:

The alternating direction method of multipliers (ADMM) has been popular for solving many signal processing problems, convex or nonconvex. In this paper, we study an asynchronous implementation of ADMM for solving a nonconvex nonsmooth optimization problem, whose objective is the sum of a number of component functions. The proposed algorithm allows the problem to be solved in a distributed, asynchronous, and incremental manner. First, the component functions can be distributed to different computing nodes, which perform the updates asynchronously without coordinating with each other. Two sources of asynchrony are covered by our algorithm: One is caused by the heterogeneity of the computational nodes and the other arises from unreliable communication links. Second, the algorithm can be viewed as implementing an incremental algorithm where at each step the (possibly delayed) gradients of only a subset of component functions are updated. We show that when certain bounds are imposed on the level of asynchrony, the proposed algorithm converges to the set of stationary solutions (resp. optimal solutions) for the nonconvex (resp. convex) problem, with a global sublinear rate.
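For context, the problem class is the minimization of a sum of component functions, min_x f_1(x) + ... + f_N(x), with the components split across computing nodes through a shared consensus variable. The sketch below is a minimal, synchronous consensus-ADMM baseline on quadratic components f_i(x) = 0.5*||A_i x - b_i||^2; all symbols (A, b, rho, N, etc.) are illustrative and not taken from the paper, and the paper's actual contribution, namely asynchronous, delayed, and incremental updates with convergence guarantees for nonconvex objectives, is not modeled here.

# Minimal sketch (not the paper's method): synchronous consensus ADMM for
# min_x sum_i f_i(x) with quadratic components f_i(x) = 0.5*||A_i x - b_i||^2.
# Asynchrony, communication delays, and incremental (subset) gradient updates
# studied in the paper are omitted.
import numpy as np

rng = np.random.default_rng(0)
N, d, m = 4, 5, 20                      # components, dimension, rows per block
A = [rng.standard_normal((m, d)) for _ in range(N)]
b = [rng.standard_normal(m) for _ in range(N)]
rho = 1.0                               # penalty parameter

x = [np.zeros(d) for _ in range(N)]     # local copies held by the N nodes
y = [np.zeros(d) for _ in range(N)]     # dual variables (multipliers)
z = np.zeros(d)                         # consensus (master) variable

for k in range(200):
    # Local x-update: argmin_x f_i(x) + y_i^T (x - z) + (rho/2)*||x - z||^2,
    # which for quadratic f_i reduces to a linear system.
    for i in range(N):
        lhs = A[i].T @ A[i] + rho * np.eye(d)
        rhs = A[i].T @ b[i] + rho * z - y[i]
        x[i] = np.linalg.solve(lhs, rhs)
    # Consensus z-update: average of local estimates plus scaled duals.
    z = np.mean([x[i] + y[i] / rho for i in range(N)], axis=0)
    # Dual ascent step on each multiplier.
    for i in range(N):
        y[i] = y[i] + rho * (x[i] - z)

# Sanity check against the centralized least-squares solution.
A_full, b_full = np.vstack(A), np.concatenate(b)
x_star, *_ = np.linalg.lstsq(A_full, b_full, rcond=None)
print(np.linalg.norm(z - x_star))       # should be small after convergence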
Published in: IEEE Transactions on Control of Network Systems (Volume 5, Issue 3, September 2018)
Page(s): 935–945
Date of Publication: 23 January 2017
