[1]王梅,任怡果*,刘勇,等.求解多核学习的自适应随机递归梯度下降法[J].计算机技术与发展,2025,(07):93-99.[doi:10.20165/j.cnki.ISSN1673-629X.2025.0038]
 WANG Mei, REN Yi-guo*, LIU Yong, et al. Adaptive Stochastic Recursive Gradient Algorithm for Multiple Kernel Learning[J]. Computer Technology and Development, 2025, (07): 93-99. [doi:10.20165/j.cnki.ISSN1673-629X.2025.0038]

求解多核学习的自适应随机递归梯度下降法

《计算机技术与发展》[ISSN:1006-6977/CN:61-1281/TN]

卷/Volume:
期数/Issue:
2025年07期 (2025, No. 07)
页码/Pages:
93-99
栏目/Section:
人工智能 (Artificial Intelligence)
出版日期/Publication Date:
2025-07-10

文章信息/Info

Title:
Adaptive Stochastic Recursive Gradient Algorithm for Multiple Kernel Learning
文章编号/Article No.:
1673-629X(2025)07-0093-07
作者:
王梅1,2,任怡果1*,刘勇3,王志宝1,4
1. 东北石油大学 计算机与信息技术学院,黑龙江 大庆 163318;
2. 黑龙江省石油大数据与智能分析重点实验室,黑龙江 大庆 163318;
3. 中国人民大学 高瓴人工智能学院,北京 100049;
4. 东北石油大学 环渤海能源研究院,河北 秦皇岛 066004
Author(s):
WANG Mei1,2, REN Yi-guo1*, LIU Yong3, WANG Zhi-bao1,4
1. School of Computer and Information Technology,Northeast Petroleum University,Daqing 163318,China;
2. Heilongjiang Key Laboratory of Petroleum Big Data and Intelligent Analysis,Daqing 163318,China;
3. Gaoling School of Artificial Intelligence,Renmin University of China,Beijing 100049,China;
4. Bohai Rim Energy Research Institute,Northeast Petroleum University,Qinhuangdao 066004,China
关键词:
多核学习;随机递归梯度下降法;随机Polyak步长;小批量;凸优化
Keywords:
multiple kernel learning; stochastic recursive gradient algorithm; stochastic Polyak step-size; mini-batch; convex optimization
分类号/CLC Number:
TP391
DOI:
10.20165/j.cnki.ISSN1673-629X.2025.0038
摘要:
针对随机递归梯度法(SARAH)求解多核学习(MKL)时收敛速度缓慢、计算成本高等不足,该文提出一种改进算法——基于随机Polyak步长(SPS)的小批量随机递归梯度下降算法(SPS-MSARAH)来求解多核学习优化问题。首先将小批量方法引入随机方差缩减类算法,选取一个固定大小的样本集代替单个训练样本计算SARAH的梯度,降低传统随机梯度下降算法因使用单个样本计算梯度产生较大波动和不稳定性所带来的方差。在此基础上,使用随机Polyak步长自适应地更新小批量SARAH的步长,使优化过程更加灵活和鲁棒,从而解决随机优化算法中步长选取的难题。为验证该算法的有效性,在标准数据集上进行了详细的数值实验。实验结果显示,在求解大规模多核学习优化问题时,SPS-MSARAH算法不仅显著提高了收敛速度,还有效降低了计算复杂度;此外,算法对初始参数的敏感性问题也得到了很好的克服,展现出良好的鲁棒性。
Abstract:
Aiming at the shortcomings of the stochastic recursive gradient algorithm (SARAH) for solving multiple kernel learning (MKL), such as slow convergence and high computational cost, we propose an improved algorithm, the mini-batch stochastic recursive gradient algorithm with stochastic Polyak step size (SPS-MSARAH), to solve the MKL optimization problem. First, the mini-batch method is introduced into the stochastic variance-reduction class of algorithms: a fixed-size sample set, rather than a single training sample, is used to compute the SARAH gradient, which reduces the variance caused by the large fluctuations and instability of single-sample gradients in traditional stochastic gradient descent. On this basis, the step size of mini-batch SARAH is updated adaptively with the stochastic Polyak step size, which makes the optimization process more flexible and robust and thus addresses the difficulty of step-size selection in stochastic optimization. To verify the effectiveness of the proposed algorithm, detailed numerical experiments are carried out on standard data sets. The experimental results show that, when solving large-scale multiple kernel learning optimization problems, SPS-MSARAH not only significantly improves the convergence speed but also effectively reduces the computational complexity. In addition, sensitivity to the initial parameters is largely overcome, showing good robustness.
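The paper's own pseudocode is not reproduced on this page, but the mechanism the abstract describes — SARAH's recursive gradient estimate computed over mini-batches, with a stochastic Polyak step size replacing a fixed step — can be sketched as follows. This is a minimal illustration on a toy least-squares problem, not the authors' implementation: the constant `c`, the cap `gamma_max`, and the assumption that the optimal mini-batch loss is approximately zero are all illustrative choices.

```python
import numpy as np

def sps_msarah(loss, grad, X, y, w0, *, epochs=5, batch=32,
               c=0.5, gamma_max=1.0, seed=0):
    """Sketch of mini-batch SARAH with a stochastic Polyak step size."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    w = np.asarray(w0, dtype=float).copy()

    def sps(fval, v):
        # Stochastic Polyak step: gamma = f_B(w) / (c * ||v||^2),
        # assuming the per-batch optimal loss is ~0 (nonnegative loss);
        # capped at gamma_max for stability.
        return min(gamma_max, fval / (c * (v @ v) + 1e-12))

    for _ in range(epochs):
        # Outer iteration: full-gradient anchor (SARAH snapshot).
        v = grad(w, X, y)
        w_prev, w = w, w - sps(loss(w, X, y), v) * v
        for _ in range(max(1, n // batch)):
            idx = rng.choice(n, size=batch, replace=False)
            Xb, yb = X[idx], y[idx]
            # SARAH recursive gradient estimate on the mini-batch:
            # v_t = grad_B(w_t) - grad_B(w_{t-1}) + v_{t-1}.
            v = grad(w, Xb, yb) - grad(w_prev, Xb, yb) + v
            w_prev, w = w, w - sps(loss(w, Xb, yb), v) * v
    return w

# Toy problem: least squares, f(w) = ||Xw - y||^2 / (2n). The loss is
# nonnegative and the data are noiseless, so f* = 0 holds exactly and
# the SPS rule above is well suited.
def ls_loss(w, X, y):
    r = X @ w - y
    return 0.5 * float(r @ r) / len(y)

def ls_grad(w, X, y):
    return X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(1)
X = rng.normal(size=(512, 10))
w_true = rng.normal(size=10)
y = X @ w_true
w_hat = sps_msarah(ls_loss, ls_grad, X, y, np.zeros(10))
```

In MKL the decision variable would instead be the kernel-combination weights and the loss the regularized MKL objective; the update structure, however, is the same.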

更新日期/Last Update: 2025-07-10