[1] JIANG Yu-yan, LYU Wei, LI Ping, et al. Unsupervised Asymmetric Metric Learning for Person Re-identification Optimization [J]. Computer Technology and Development, 2022, 32(09): 126-133. [doi:10.3969/j.issn.1673-629X.2022.09.020]

Unsupervised Asymmetric Metric Learning for Person Re-identification Optimization

《计算机技术与发展》 (Computer Technology and Development) [ISSN:1006-6977/CN:61-1281/TN]

Volume:
32
Issue:
2022, No. 09
Pages:
126-133
Column:
Artificial Intelligence
Publication Date:
2022-09-10

Article Info

Title:
Unsupervised Asymmetric Metric Learning for Person Re-identification Optimization
Article Number:
1673-629X(2022)09-0126-08
Author(s):
JIANG Yu-yan 1, LYU Wei 1, LI Ping 2, SHAO Jin 1
1. School of Management Science and Engineering, Anhui University of Technology, Ma'anshan 243032, China
2. School of Computer Science, Nanjing University of Posts and Telecommunications, Nanjing 210023, China
Keywords:
unsupervised learning; asymmetric metric learning; Kullback-Leibler (KL) divergence; person re-identification
CLC Number:
TP391
DOI:
10.3969/j.issn.1673-629X.2022.09.020
Abstract:
Person re-identification is a technique for similarity matching between different views of the same pedestrian. During matching, factors such as different cameras, camera angles, pedestrian pose, illumination and occlusion are handled by combining them with metric learning to measure similarity. As external conditions change, views may exhibit different distributions, and ignoring view-specific features easily causes part of the information to be lost. The unsupervised asymmetric metric learning (UAML) method divides the view distribution into shared views and specific views: shared views extract shared features, while specific views extract view-related features, and both are projected into a common subspace. K-means clustering and Kullback-Leibler (KL) divergence are then used to optimize the view distribution, so that the resulting subspace samples share the same distribution and a comprehensive representation. The method was tested on the VIPeR, CUHK01 and Market-1501 data sets, and its performance was measured with the cumulative match characteristic (CMC) curve, Rank-1 accuracy and the average precision curve. Rank-1 accuracy reaches 40.25%, 56.6%/58.09% and 61.67% on the respective data sets. The approach further optimizes person re-identification by combining unsupervised asymmetric metric learning with KL divergence; its effectiveness is verified through experimental comparison, and the results show better recognition accuracy in person re-identification applications.
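
The pipeline summarized in the abstract (view-specific projections into a common subspace, K-means clustering, KL-divergence-based optimization) can be pictured with a small stand-alone sketch. The Python snippet below is not the authors' implementation: it uses random toy features, fixed random projection matrices U_a and U_b in place of the learned asymmetric transforms, and a plain K-means plus a KL divergence between per-view cluster-assignment histograms, simply to show how the pieces fit together.

# Hypothetical illustration (not the paper's code): two camera views get
# view-specific linear projections into a shared subspace, the pooled
# projections are clustered with K-means, and the KL divergence between
# the per-view cluster-assignment histograms gives a rough measure of how
# well the two view distributions are aligned.
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=50):
    """Plain K-means: returns cluster labels for the rows of X."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def kl_divergence(p, q, eps=1e-8):
    """KL(p || q) between two discrete distributions (smoothed, renormalized)."""
    p, q = p + eps, q + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Toy stand-ins for the real data: 500 feature vectors per camera view.
d_feat, d_sub, k = 64, 16, 8
view_a = rng.normal(0.0, 1.0, size=(500, d_feat))
view_b = rng.normal(0.3, 1.2, size=(500, d_feat))   # deliberately shifted distribution

# View-specific projections into a common subspace (random here; in UAML
# these transforms would be learned asymmetrically, one per view).
U_a = rng.normal(size=(d_feat, d_sub))
U_b = rng.normal(size=(d_feat, d_sub))
Z = np.vstack([view_a @ U_a, view_b @ U_b])

# Cluster the pooled subspace samples, then compare how each view
# distributes its samples over the clusters.
labels = kmeans(Z, k)
hist_a = np.bincount(labels[:500], minlength=k).astype(float)
hist_b = np.bincount(labels[500:], minlength=k).astype(float)
print("KL(view A || view B) over cluster assignments:", kl_divergence(hist_a, hist_b))

In the method described in the abstract, the projections themselves are optimized so that this kind of cross-view mismatch is reduced while a comprehensive shared representation is kept; the sketch only mimics the measurement step on synthetic data.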

Last Update: 2022-09-10