NONLINEAR SYSTEMS IN ROBOTICS
On Quasi-Convex Smooth Optimization Problems by a Comparison Oracle
A. V. Gasnikov (a, b, c, d), M. S. Alkousa (a, b, d), A. V. Lobanov (e, d, f), Y. V. Dorn (d, g), F. S. Stonyakin (h, b, d), I. A. Kuruzov (b), S. R. Singh (i)
a Steklov Mathematical Institute of Russian Academy of Sciences,
ul. Gubkina 8, Moscow, 119991 Russia
b Innopolis University,
ul. Universitetskaya 1, Innopolis, 420500 Russia
c Caucasus Mathematical Center, Adyghe State University,
ul. Pervomaiskaya 208, Maykop, 385000 Russia
d Moscow Institute of Physics and Technology,
Institutskiy per. 9, Dolgoprudny, 141701 Russia
e Skolkovo Institute of Science and Technology,
Bolshoy Boulevard 30, bld. 1, Moscow, 121205 Russia
f ISP RAS Research Center for Trusted Artificial Intelligence,
ul. Alexandra Solzhenitsyna 25, Moscow, 109004 Russia
g Institute for Artificial Intelligence, Lomonosov Moscow State University,
Lomonosovsky pr. 27/1, Moscow, 119192 Russia
h V. I. Vernadsky Crimean Federal University,
Av. Academician Vernadsky 4, Simferopol, 295007 Russia
i Times School of Media, Bennett University,
Plot Nos 8, 11, Greater Noida, Uttar Pradesh, 201310 India
Abstract:
Frequently, when dealing with machine learning models, optimization problems are challenging because the structure and properties of the objective functions are only partially understood. Major complications therefore arise for first-order algorithms, since gradient computations are difficult or even impossible in various scenarios. For this reason, we resort to derivative-free (zeroth-order) methods. This paper is devoted to an approach to minimizing quasi-convex functions using only a comparison oracle, recently proposed in [56]. This oracle compares the function values at two points and reports which is larger; thus, under the proposed approach, comparisons are all we need to solve the optimization problem under consideration. The proposed algorithm is based on comparison-based gradient direction estimation and comparison-based approximate normalized gradient descent. Normalized gradient descent is an adaptation of gradient descent that updates according to the direction of the gradient rather than the gradient itself. We prove a convergence rate for the proposed algorithm when the objective function is smooth and strictly quasi-convex on R^n: the algorithm needs O((nD^2/ε^2) log(nD/ε)) comparison queries to find an ε-approximate solution, where D is an upper bound on the distance between all generated iteration points and an optimal solution.
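The two building blocks named in the abstract can be illustrated with a minimal sketch. The code below is not the authors' method from [56]; it is an illustrative assumption of how a comparison oracle (which only reports which of two function values is larger) can drive normalized gradient descent: the gradient direction is estimated by averaging random unit vectors signed by comparison outcomes, and each step moves along that normalized direction with a fixed step size. All names (`compare`, `estimate_direction`, `normalized_gd`) and parameter choices are hypothetical.

```python
import numpy as np

def compare(f, x, y):
    """Comparison oracle: returns +1 if f(x) >= f(y), else -1.
    Only the comparison result is used, never the function values themselves."""
    return 1.0 if f(x) >= f(y) else -1.0

def estimate_direction(f, x, num_queries=50, delta=1e-3, rng=None):
    """Comparison-based gradient direction estimate (illustrative):
    average sign(f(x + delta*u) - f(x - delta*u)) * u over random unit vectors u,
    then normalize. Each term needs one comparison query."""
    rng = np.random.default_rng() if rng is None else rng
    g = np.zeros_like(x)
    for _ in range(num_queries):
        u = rng.standard_normal(x.size)
        u /= np.linalg.norm(u)
        g += compare(f, x + delta * u, x - delta * u) * u
    norm = np.linalg.norm(g)
    return g / norm if norm > 0 else g

def normalized_gd(f, x0, step=0.1, iters=300, **kw):
    """Normalized gradient descent: step along the estimated gradient
    *direction* only, ignoring its magnitude."""
    x = x0.astype(float).copy()
    for _ in range(iters):
        x -= step * estimate_direction(f, x, **kw)
    return x
```

With a constant step size the iterates only reach a neighborhood of the minimizer whose radius is of order `step`; the query bound quoted above corresponds to a careful choice of the step size and the number of comparison queries per iteration.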
Keywords:
quasi-convex function, gradient-free algorithm, smooth function, comparison oracle, normalized gradient descent
Received: 13.11.2024. Accepted: 26.11.2024
Citation:
A. V. Gasnikov, M. S. Alkousa, A. V. Lobanov, Y. V. Dorn, F. S. Stonyakin, I. A. Kuruzov, S. R. Singh, “On Quasi-Convex Smooth Optimization Problems by a Comparison Oracle”, Rus. J. Nonlin. Dyn., 20:5 (2024), 813–825
Links to this page:
https://www.mathnet.ru/rus/nd924 https://www.mathnet.ru/rus/nd/v20/i5/p813