Summary: As society advances, machine learning holds increasing significance. Optimization, a crucial aspect of machine learning, has garnered considerable research attention. Addressing optimization challenges has become pivotal as models grow in complexity alongside the exponential rise in data volume. In existing algorithms such as stochastic gradient descent (SGD), a common practice is to use diminishing step sizes or to tune step sizes by hand, which is often unsuitable and time-consuming. To address this issue, researchers have made significant efforts, such as adopting the Barzilai-Borwein (BB) method. However, the BB method has its drawbacks: the denominator of its step size can approach zero or even become negative. To address this problem, this study uses the Positive Defined Stabilized Barzilai-Borwein (PDSBB) method and combines it with the SGD algorithm to create a new algorithm, SGD-PDSBB. The algorithm's convergence is then analyzed, and its effectiveness is confirmed through numerical experiments, where it is compared with the original SGD algorithm as well as SGD-BB in terms of step size, sub-optimality, and classification accuracy. The numerical experiments indicate that the new algorithm exhibits numerical performance similar to SGD or SGD-BB on some datasets, and on others it even performs better. © (2025), (International Association of Engineers). All rights reserved.
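For context, a minimal sketch of the classical (BB1) step size that the abstract refers to, written here as an assumption about the standard formulation; the exact PDSBB stabilization is defined in the paper body and is not reproduced.

% Classical Barzilai-Borwein (BB1) step size at iteration k,
% with s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}:
\[
  \eta_k = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}}
\]
% In the stochastic setting the gradient difference y_{k-1} is noisy,
% so the denominator s_{k-1}^T y_{k-1} can approach zero or become
% negative -- the drawback that the PDSBB variant is intended to remove.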