Abstract:
The paper is devoted to new modifications of recently proposed adaptive Mirror Descent methods for convex minimization problems with several convex functional constraints. Methods for problems of two types are considered. In problems of the first type, the objective functional is Lipschitz (generally, nonsmooth). In problems of the second type, the gradient of the objective functional is Lipschitz. We also consider the case of a nonsmooth objective functional equal to the maximum of smooth functionals with Lipschitz gradients. In all cases, the functional constraints are assumed to be Lipschitz and, generally, nonsmooth. The proposed modifications make it possible to reduce the running time of the algorithm by skipping some of the functional constraints at nonproductive steps. We derive bounds for the convergence rate, which show that the methods under consideration are optimal from the viewpoint of lower oracle estimates. Results of numerical experiments illustrating the advantages of the proposed procedures on several examples are presented.
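To illustrate the idea of skipping constraints at nonproductive steps, below is a minimal Python sketch of a Euclidean (subgradient-form) adaptive mirror descent iteration with functional constraints. The function and constraint names, the fixed iteration budget, and the step-size-weighted averaging of productive points are illustrative assumptions under a simplified setting, not the authors' exact algorithm.

import numpy as np

def adaptive_md_sketch(grad_f, gs, grad_gs, x0, eps, max_iter=10000):
    # Minimal sketch (assumed Euclidean prox-structure, unconstrained domain).
    # A step is "productive" if every constraint value is at most eps; then we
    # move along a subgradient of the objective with step eps / ||d||^2.
    # Otherwise the step is "nonproductive": we move along a subgradient of the
    # FIRST violated constraint and skip evaluating the remaining constraints.
    x = np.asarray(x0, dtype=float)
    prod_points, prod_steps = [], []
    for _ in range(max_iter):          # simplified budget instead of the paper's stopping rule
        violated_grad = None
        for g, dg in zip(gs, grad_gs):
            if g(x) > eps:             # first violated constraint found:
                violated_grad = dg     # remaining constraints are not checked
                break
        if violated_grad is None:      # productive step
            d = grad_f(x)
            h = eps / (np.dot(d, d) + 1e-15)
            prod_points.append(x.copy())
            prod_steps.append(h)
        else:                          # nonproductive step
            d = violated_grad(x)
            h = eps / (np.dot(d, d) + 1e-15)
        x = x - h * d                  # Euclidean mirror (subgradient) step
    if not prod_points:
        raise RuntimeError("no productive steps; decrease eps or increase max_iter")
    # illustrative output: step-size-weighted average of productive iterates
    return np.average(np.stack(prod_points), axis=0, weights=np.array(prod_steps))

On a nonproductive step only one violated constraint is used and the rest are not evaluated, which is the source of the running-time savings described in the abstract.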
Citation:
F. S. Stonyakin, M. S. Alkousa, A. N. Stepanov, M. A. Barinov, “Adaptive mirror descent algorithms in convex programming problems with Lipschitz constraints”, Trudy Inst. Mat. i Mekh. UrO RAN, 24, no. 2, 2018, 266–279
\Bibitem{StoAlkSte18}
\by F.~S.~Stonyakin, M.~S.~Alkousa, A.~N.~Stepanov, M.~A.~Barinov
\paper Adaptive mirror descent algorithms in convex programming problems with Lipschitz constraints
\serial Trudy Inst. Mat. i Mekh. UrO RAN
\yr 2018
\vol 24
\issue 2
\pages 266--279
\mathnet{http://mi.mathnet.ru/timm1541}
\crossref{https://doi.org/10.21538/0134-4889-2018-24-2-266-279}
\elib{https://elibrary.ru/item.asp?id=35060696}
Linking options:
https://www.mathnet.ru/eng/timm1541
https://www.mathnet.ru/eng/timm/v24/i2/p266
This publication is cited in the following 10 articles:
S. S. Ablaev, A. N. Beznosikov, A. V. Gasnikov, D. M. Dvinskikh, A. V. Lobanov, S. M. Puchinin, F. S. Stonyakin, “On Some Works of Boris Teodorovich Polyak on the Convergence of Gradient Methods and Their Development”, Žurnal vyčislitelʹnoj matematiki i matematičeskoj fiziki, 64:4 (2024), 587
O. S. Savchuk, M. S. Alkusa, F. S. Stonyakin, "On some mirror descent methods for strongly convex programming problems with Lipschitz functional constraints", Kompyuternye issledovaniya i modelirovanie, 16:7 (2024), 1727–1746
M. S. Alkousa, F. S. Stonyakin, A. M. Abdo, M. M. Alcheikh, “Mirror Descent Methods with a Weighting Scheme for Outputs for Optimization Problems with Functional Constraints”, Rus. J. Nonlin. Dyn., 20:5 (2024), 727–745
S. S. Ablaev, F. S. Stonyakin, M. S. Alkousa, A. V. Gasnikov, “Adaptive Subgradient Methods for Mathematical Programming Problems with Quasiconvex Functions”, Proc. Steklov Inst. Math. (Suppl.), 323, suppl. 1 (2023), S1–S18
Rashid Yarullin, Igor Zabotin, Communications in Computer and Information Science, 1881, Mathematical Optimization Theory and Operations Research: Recent Trends, 2023, 44
A. Ivanova, F. Stonyakin, D. Pasechnyuk, E. Vorontsova, A. Gasnikov, “Adaptive mirror descent for the network utility maximization problem”, IFAC PAPERSONLINE, 53:2 (2020), 7851–7856
Mohammad S. Alkousa, Forum for Interdisciplinary Mathematics, Computational Mathematics and Applications, 2020, 47
M. S. Alkousa, “On some stochastic mirror descent methods for constrained online optimization problems”, Kompyuternye issledovaniya i modelirovanie, 11:2 (2019), 205–217
F. S. Stonyakin, M. Alkousa, A. N. Stepanov, A. A. Titov, “Adaptive mirror descent algorithms for convex and strongly convex optimization problems with functional constraints”, J. Appl. Industr. Math., 13:3 (2019), 557–574
Alexander A. Titov, Fedor S. Stonyakin, Alexander V. Gasnikov, Mohammad S. Alkousa, Communications in Computer and Information Science, 974, Optimization and Applications, 2019, 64