Proceedings of International Conference on Applied Innovation in IT
2020/03/10, Volume 8, Issue 1, pp.49-53
The Steepest Descent Method Using the Empirical Mode Gradient Decomposition
Vasiliy Esaulov, Roman Sinetsky
Abstract: The aim of this article is to study the possibility of improving gradient-based optimization methods. The approach rests on a feature-based description of the gradient, which sets the direction of the search for a solution. A modification of the steepest descent method for global optimization based on the Hilbert-Huang transform is proposed. The proposed solution decomposes the gradient of the objective function into empirical modes. The main results of the work are iterative optimization methods in which, in addition to the gradient itself, its empirical modes are taken into account. New estimates of the descent step are obtained that could not be derived in the classical formulation of the steepest descent method. Their correctness follows from the fact that, when the gradient cannot be decomposed, they reduce to the existing estimates for the steepest descent method. The theoretical significance of the results lies in extending existing gradient methods with a previously unused description of the gradient. The practical significance is that the proposed recommendations can help accelerate the convergence of gradient methods and improve the accuracy of their results. Computational experiments carried out in Python confirmed the adequacy and robustness of the proposed method.
Keywords: Steepest Descent Method, Gradient Descent Method, Empirical Mode Decomposition, Optimization Problem
Copyright © 2013-2020 Leonid Mylnikov. All rights reserved.
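As context for the modification summarized in the abstract, the following is a minimal sketch of the classical steepest descent baseline that the proposed method extends, written in Python with an Armijo backtracking line search. The quadratic objective, parameter values, and function names are illustrative assumptions; the empirical-mode decomposition of the gradient described in the paper is not reproduced here.

```python
import numpy as np

def steepest_descent(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4,
                     tol=1e-8, max_iter=1000):
    """Classical steepest descent with Armijo backtracking.

    The paper's method would additionally decompose grad(x) into
    empirical modes to refine the step estimate; that step is
    omitted in this illustrative sketch.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        alpha = alpha0
        # Backtrack until the sufficient-decrease (Armijo) condition holds.
        while f(x - alpha * g) > f(x) - c * alpha * (g @ g):
            alpha *= beta
        x = x - alpha * g
    return x

# Illustrative quadratic objective: f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x_star = steepest_descent(f, grad, np.zeros(2))
```

In the classical formulation above, the step length comes only from the line search on the full gradient; the paper's contribution is to derive additional step estimates from the gradient's empirical modes, which reduce to the classical ones when no decomposition is possible.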