The effect of deterministic noise in subgradient methods
References
Ben-Tal A., Margalit T., Nemirovski A.: The ordered subsets mirror descent optimization method and its use for the positron emission tomography reconstruction. SIAM J. Optim. 12(1), 79–108 (2001)
Bertsekas D.P.: Nonlinear Programming, 2nd edn. Athena Scientific, Belmont (1999)
Bertsekas D.P., Nedić A., Ozdaglar A.E.: Convex Analysis and Optimization. Athena Scientific, Belmont (2003)
Brännlund U.: On relaxation methods for nonsmooth convex optimization. Doctoral Thesis, Royal Institute of Technology, Stockholm (1993)
Burke J.V., Ferris M.C.: Weak sharp minima in mathematical programming. SIAM J. Control Optim. 31(5), 1340–1359 (1993)
Dem’yanov V.F., Vasil’ev L.V.: Nondifferentiable Optimization. Optimization Software Inc., New York (1985)
Ermoliev Yu.M.: On the stochastic quasi-gradient method and stochastic quasi-Feyer sequences. Kibernetika 2, 73–83 (1969)
Ermoliev Yu.M.: Stochastic Programming Methods. Nauka, Moscow (1976)
Ermoliev Yu.M.: Stochastic quasigradient methods and their application to system optimization. Stochastics 9, 1–36 (1983)
Ermoliev Yu.M.: Stochastic quasigradient methods. In: Ermoliev, Yu.M., Wets, R.J.-B. (eds.) Numerical Techniques for Stochastic Optimization, IIASA, pp. 141–185. Springer, Heidelberg (1988)
Gaudioso M., Giallombardo G., Miglionico G.: An incremental method for solving convex finite min–max problems. Math. Oper. Res. 31(1), 173–187 (2006)
Goffin J.L., Kiwiel K.C.: Convergence of a simple subgradient level method. Math. Program. 85, 207–211 (1999)
Kibardin V.M.: Decomposition into functions in the minimization problem. Autom. Remote Control 40, 1311–1323 (1980)
Kiwiel K.C.: Convergence of approximate and incremental subgradient methods for convex optimization. SIAM J. Optim. 14(3), 807–840 (2004)
Kiwiel K.C.: A proximal bundle method with approximate subgradient linearizations. SIAM J. Optim. 16(4), 1007–1023 (2006)
Nedić A., Bertsekas D.P., Borkar V.: Distributed asynchronous incremental subgradient methods. In: Butnariu, D., Censor, Y., Reich, S. (eds.) Inherently Parallel Algorithms in Feasibility and Optimization and their Applications. Stud. Comput. Math., Elsevier, Amsterdam (2001)
Nedić A., Bertsekas D.P.: Convergence rate of incremental subgradient algorithms. In: Uryasev, S., Pardalos, P.M. (eds.) Stochastic Optimization: Algorithms and Applications, pp. 263–304. Kluwer, Dordrecht (2000)
Nedić A., Bertsekas D.P.: Incremental subgradient methods for nondifferentiable optimization. SIAM J. Optim. 12, 109–138 (2001)
Nedić A.: Subgradient Methods for Convex Optimization. Ph.D. Thesis, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology (2002)
Nurminskii E.A.: Minimization of nondifferentiable functions in the presence of noise. Kibernetika 10(4), 59–61 (1974)
Pang J.-S.: Error bounds in mathematical programming. Math. Program. Ser. B 79, 299–332 (1997)
Polyak B.T.: Nonlinear programming methods in the presence of noise. Math. Program. 14, 87–97 (1978)
Polyak B.T.: Introduction to Optimization. Optimization Software Inc., New York (1987)
Rabbat M.G., Nowak R.D.: Quantized incremental algorithms for distributed optimization. IEEE J. Select. Areas Commun. 23(4), 798–808 (2005)
Solodov M.V., Zavriev S.K.: Error stability properties of generalized gradient-type algorithms. J. Optim. Theory Appl. 98(3), 663–680 (1998)