Similar Literature / References:
[1] WU Qing-qi, LIN Jiang-yun. A Study on GMM Optimization with Clustering for Improving Speaker Recognition [J]. Computer Technology and Development, 2009, (04): 35.
[2] DAN Zhi-ping, ZHENG Sheng. Application of LS-SVM in Speaker Recognition [J]. Computer Technology and Development, 2007, (05): 30.
[3] ZHANG Hua, QIU Xue-hong. On the Importance of Components of LPCCEP in Speaker Recognition [J]. Computer Technology and Development, 2006, (04): 67.
[4] YU Yun, ZHOU Wei-dong. Robust Speaker Recognition System Based on Sparse Representation [J]. Computer Technology and Development, 2015, 25(12): 41.
[5] LI Yan-ping, TAO Ding-yuan, LIN Le. Study on Electronic Disguised Voice Speaker Recognition Based on DTW Model Compensation [J]. Computer Technology and Development, 2017, 27(01): 93.
[6] LI Shan, XU Long-ting. Research on an Emotion Recognition Algorithm Based on Bottleneck Features Extracted from Spectrograms [J]. Computer Technology and Development, 2017, 27(05): 82.
[7] LIN Shu-du, SHAO Xi. Speaker Recognition with i-vector and Deep Learning [J]. Computer Technology and Development, 2017, 27(06): 66.
[8] QIU Dong, LIU De-yu. A Pedestrian Detection Method Based on a Fuzzy Deep Learning Network [J]. Computer Technology and Development, 2018, 28(10): 22. [doi:10.3969/j.issn.1673-629X.2018.10.005]
[9] QIANG Han, GUO Ya-lan, TIAN Li-ming. Research on Malicious Code Detection Based on Deep Belief Networks [J]. Computer Technology and Development, 2019, 29(07): 93. [doi:10.3969/j.issn.1673-629X.2019.07.019]
[10] SHAN Xin-wen, LI Meng, TAO Ye-bo, et al. Research on an Electricity Consumption Forecasting Method Based on a Deep Belief Network [J]. Computer Technology and Development, 2021, 31(Suppl.): 177. [doi:10.3969/j.issn.1673-629X.2021.S.036]