Development of The Steepest Descent Method for Unconstrained Optimization of Nonlinear Function

Authors

  • Zakiah Evada, Universitas Sumatera Utara
  • Tulus, Universitas Sumatera Utara
  • Elvina Herawati, Universitas Sumatera Utara

DOI:

10.33395/sinkron.v7i3.11596

Abstract

The q-gradient method in this study uses a Yuan step size on odd iterations and a geometric-recursion step size on even iterations (q-GY). The study aimed to accelerate convergence to the minimum point by reducing the number of iterations, applying a dilation with parameter q to the independent variable, and comparing the results with three algorithms: the classical steepest descent (SD) method, the steepest descent method with Yuan step sizes (SDY), and the q-gradient method with geometric recursion (q-G). The numerical results are presented in tables and graphs. The study used the Rosenbrock function f(x) = (1 - x_1)^2 + 100(x_2 - x_1^2)^2 with parameters μ = 1, σ_0 = 0.5, and β = 0.999. Starting points x_0 were drawn from a uniform distribution on the interval (-2.048, 2.048) in R^2, and 49 starting points were run using an online Python compiler on a 64-bit Core i3 laptop. The maximum number of iterations was 58,679. A tolerance limit of 10^-4 and the inequality f(x^*) > f were used as stopping criteria to obtain the numerical results. The downward movement of the q-GY method toward the minimum point was better than that of the SD and SDY methods, and the numerical results on the Rosenbrock function showed sufficiently good performance in accelerating convergence to the minimum point.
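
As an illustration of the experimental setup described above, the Python sketch below implements the Rosenbrock function and a classical steepest descent (SD) baseline on the stated search domain (-2.048, 2.048)^2 with 49 random starting points and the 10^-4 tolerance. It is a minimal sketch, not the authors' code: the gradient-norm stopping test, the backtracking step-size rule, and the iteration cap of 60,000 are assumptions introduced here, and the q-GY components (Yuan step sizes on odd iterations, geometric recursion on even iterations) are not reproduced because their formulas appear only in the full text.

    import numpy as np

    def rosenbrock(x):
        # f(x) = (1 - x1)^2 + 100 * (x2 - x1^2)^2  (function used in the study)
        return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

    def rosenbrock_grad(x):
        # Analytic gradient of the Rosenbrock function
        return np.array([
            -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
            200 * (x[1] - x[0] ** 2),
        ])

    def steepest_descent(x0, tol=1e-4, max_iter=60_000):
        # Classical SD baseline; the gradient-norm stopping test and the
        # backtracking line search are assumed here, not taken from the paper.
        x = np.asarray(x0, dtype=float)
        for k in range(max_iter):
            g = rosenbrock_grad(x)
            if np.linalg.norm(g) < tol:   # assumed stopping rule with the 1e-4 tolerance
                return x, k
            step = 1.0
            # Halve the step until the function value decreases (or a step floor is hit)
            while step > 1e-12 and rosenbrock(x - step * g) >= rosenbrock(x):
                step *= 0.5
            x = x - step * g
        return x, max_iter

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # 49 starting points drawn uniformly from (-2.048, 2.048)^2, as in the abstract
        starts = rng.uniform(-2.048, 2.048, size=(49, 2))
        iteration_counts = [steepest_descent(x0)[1] for x0 in starts]
        print("median SD iterations over 49 starts:", int(np.median(iteration_counts)))

Replacing the plain SD update with the Yuan step size on odd iterations and the geometric recursion for the dilation parameter q on even iterations would turn this baseline into the q-GY scheme compared in the paper.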

How to Cite

Evada, Z., Tulus, & Herawati, E. (2022). Development of The Steepest Descent Method for Unconstrained Optimization of Nonlinear Function. Sinkron: Jurnal Dan Penelitian Teknik Informatika, 6(3), 2052-2060. https://doi.org/10.33395/sinkron.v7i3.11596