An Accelerated Convex Optimization Algorithm with Line Search and Applications in Machine Learning

dc.contributor.authorDawan Chumpungam
dc.contributor.authorPanitarn Sarnmeta
dc.contributor.authorSuthep Suantai
dc.date.accessioned2025-07-21T06:06:57Z
dc.date.issued2022-04-30
dc.description.abstractIn this paper, we introduce a new line search technique and employ it to construct a novel accelerated forward–backward algorithm for solving convex minimization problems of the form of a sum of two convex functions, one of which is smooth, in a real Hilbert space. We establish weak convergence of the proposed algorithm to a solution without the Lipschitz assumption on the gradient of the objective function. Furthermore, we analyze its performance by applying it to classification problems on various data sets and compare it with other line search algorithms. Based on the experiments, the proposed algorithm outperforms the other line search algorithms.
dc.identifier.doi10.3390/math10091491
dc.identifier.urihttps://dspace.kmitl.ac.th/handle/123456789/11277
dc.subjectLine search
dc.subjectLine (geometry)
dc.subjectFrank–Wolfe algorithm
dc.subjectMinification
dc.subjectProximal Gradient Methods
dc.subject.classificationSparse and Compressive Sensing Techniques
dc.titleAn Accelerated Convex Optimization Algorithm with Line Search and Applications in Machine Learning
dc.typeArticle
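
The abstract describes a forward–backward (proximal gradient) method whose step size is chosen by a line search, so no global Lipschitz constant for the gradient is needed. The sketch below illustrates that general idea, not the paper's specific line search or acceleration: a proximal gradient iteration with a standard backtracking sufficient-decrease test, applied to an L1-regularized least-squares problem. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 -- the "backward" step.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward_ls(A, b, lam, x0, delta=0.5, max_iter=200):
    """Proximal gradient (forward-backward) iteration with backtracking
    line search for  min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    The step size sigma is shrunk by the factor delta until a
    sufficient-decrease condition on the smooth part holds, so no
    Lipschitz constant of the gradient is required in advance.
    (A sketch of the general technique, not the paper's algorithm.)
    """
    f = lambda z: 0.5 * np.sum((A @ z - b) ** 2)      # smooth part
    grad = lambda z: A.T @ (A @ z - b)                # its gradient
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        g = grad(x)
        sigma = 1.0
        while True:
            # Forward (gradient) step, then backward (proximal) step.
            y = soft_threshold(x - sigma * g, sigma * lam)
            d = y - x
            # Backtracking test: quadratic upper bound on f at y.
            if f(y) <= f(x) + g @ d + (0.5 / sigma) * (d @ d):
                break
            sigma *= delta
        x = y
    return x
```

For instance, with `A` the identity the problem decouples and the minimizer is `soft_threshold(b, lam)`, which the iteration reaches quickly; that makes a convenient sanity check.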