On the rate of convergence of the proximal alternating linearized minimization algorithm for convex problems
Abstract
We analyze the proximal alternating linearized minimization algorithm (PALM) for solving non-smooth convex minimization problems in which the objective function is the sum of a smooth convex function and block-separable non-smooth extended real-valued convex functions. We prove a global non-asymptotic sublinear rate of convergence for PALM. When the number of blocks is two and the smooth coupling function is quadratic, we present a fast version of PALM that is proven to enjoy a global sublinear efficiency estimate improved by a square-root factor. Some numerical examples illustrate the potential benefits of the proposed schemes.
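To make the block structure concrete, the following is a minimal illustrative sketch (not the paper's exact scheme) of a PALM-type two-block iteration in Python, assuming a quadratic coupling f(x, y) = ½‖Ax + By − c‖² with ℓ1 regularizers on each block; the function names, test data, and step-size choices are hypothetical and chosen only to show one proximal gradient step per block per sweep.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal map of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def palm_two_block(A, B, c, lam1, lam2, iters=500):
    """Illustrative PALM-style sweep for
        min_{x,y} 0.5*||Ax + By - c||^2 + lam1*||x||_1 + lam2*||y||_1,
    a sketch under the assumptions stated in the text, not the paper's method."""
    x = np.zeros(A.shape[1])
    y = np.zeros(B.shape[1])
    L1 = np.linalg.norm(A.T @ A, 2)   # Lipschitz constant of the x-block gradient
    L2 = np.linalg.norm(B.T @ B, 2)   # Lipschitz constant of the y-block gradient
    for _ in range(iters):
        r = A @ x + B @ y - c
        # Block 1: proximal gradient step with step size 1/L1
        x = soft_threshold(x - (A.T @ r) / L1, lam1 / L1)
        r = A @ x + B @ y - c          # refresh residual with the updated x
        # Block 2: proximal gradient step with step size 1/L2
        y = soft_threshold(y - (B.T @ r) / L2, lam2 / L2)
    return x, y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 30))
    B = rng.standard_normal((40, 20))
    c = rng.standard_normal(40)
    x, y = palm_two_block(A, B, c, lam1=0.1, lam2=0.1)
    obj = 0.5 * np.linalg.norm(A @ x + B @ y - c) ** 2 \
          + 0.1 * np.abs(x).sum() + 0.1 * np.abs(y).sum()
    print(f"final objective value: {obj:.4f}")
```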