
Random block coordinate descent methods for linearly constrained optimization over networks

Bibliographic reference: Necoara, Ion; Nesterov, Yurii; Glineur, François. Random block coordinate descent methods for linearly constrained optimization over networks. In: Journal of Optimization Theory and Applications, Vol. 173, no. 1, p. 227–254 (2017)
References
  1. Necoara, I.: Random Coordinate Descent Algorithms for Multi-Agent Convex Optimization Over Networks. DOI: 10.1109/tac.2013.2250071
  2. Xiao, L., Boyd, S.: Optimal Scaling of a Gradient Method for Distributed Resource Allocation. DOI: 10.1007/s10957-006-9080-1
  3. Ishii, H., Tempo, R., Bai, E.-W.: A Web Aggregation Approach for Distributed Randomized PageRank Algorithms. DOI: 10.1109/tac.2012.2190161
  4. You, K., Xie, L.: Network Topology and Communication Data Rate for Consensusability of Discrete-Time Multi-Agent Systems. DOI: 10.1109/tac.2011.2164017
  5. Bauschke, H.H., Borwein, J.M.: On Projection Algorithms for Solving Convex Feasibility Problems. DOI: 10.1137/s0036144593251710
  6. Combettes, P.: The convex feasibility problem in image recovery. In: Hawkes, P. (ed.) Advances in Imaging and Electron Physics, pp. 155–270. Academic Press, Cambridge (1996)
  7. Wright, S.J.: Accelerated Block-coordinate Relaxation for Regularized Optimization. DOI: 10.1137/100808563
  8. Liu, J., Wright, S.J.: Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties. DOI: 10.1137/140961134
  9. Qin, Z., Scheinberg, K., Goldfarb, D.: Efficient block-coordinate descent algorithms for the Group Lasso. DOI: 10.1007/s12532-013-0051-x
  10. Richtárik, P., Takáč, M.: Parallel coordinate descent methods for big data optimization. DOI: 10.1007/s10107-015-0901-6
  11. Beck, A., Tetruashvili, L.: On the Convergence of Block Coordinate Descent Type Methods. DOI: 10.1137/120887679
  12. Tseng, P., Yun, S.: Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization. DOI: 10.1007/s10957-008-9458-3
  13. Nesterov, Y.: Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems. DOI: 10.1137/100802001
  14. Patrascu, A., Necoara, I.: Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization. DOI: 10.1007/s10898-014-0151-9
  15. Richtárik, P., Takáč, M.: Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function. DOI: 10.1007/s10107-012-0614-z
  16. Necoara, I., Clipici, D.: Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds. DOI: 10.1137/130950288
  17. Beck, A.: The 2-Coordinate Descent Method for Solving Double-Sided Simplex Constrained Minimization Problems. DOI: 10.1007/s10957-013-0491-5
  18. Necoara, I., Patrascu, A.: A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints. DOI: 10.1007/s10589-013-9598-8
  19. Reddi, S., Hefny, A., Downey, C., Dubey, A., Sra, S.: Large-scale randomized-coordinate descent methods with non-separable linear constraints. Tech. rep. (2014)
  20. Necoara, I., Nesterov, Y., Glineur, F.: A random coordinate descent method on large optimization problems with linear constraints. Tech. rep. (2011)
  21. Hong, M., Luo, Z.-Q.: On the linear convergence of the alternating direction method of multipliers. DOI: 10.1007/s10107-016-1034-2
  22. Wei, E., Ozdaglar, A., Jadbabaie, A.: A Distributed Newton Method for Network Utility Maximization—Part II: Convergence. DOI: 10.1109/tac.2013.2253223
  23. Nesterov, Y.: Introductory Lectures on Convex Optimization. ISBN: 9781461346913. DOI: 10.1007/978-1-4419-8853-9
  24. Godsil, C., Royle, G.: Algebraic Graph Theory. ISBN: 9780387952208. DOI: 10.1007/978-1-4613-0163-9