Similar Articles/References:
[1]黄艳,赵越.3D靶标的摄像机三步标定算法与实现[J].计算机技术与发展,2010,(01):135.
HUANG Yan,ZHAO Yue.Algorithm and Realization of Three-step Camera Calibration Based on 3D-Target[J].Computer Technology and Development,2010,(01):135.
[2]付海洋,牛连强,刘守琳.一种基于平面模板的单应矩阵求解方法[J].计算机技术与发展,2010,(04):69.
FU Hai-yang,NIU Lian-qiang,LIU Shou-lin.A Solving Homography Matrix Method Based on Planar Pattern[J].Computer Technology and Development,2010,(04):69.
[3]张铖伟,王彪,徐贵力.摄像机标定方法研究[J].计算机技术与发展,2010,(11):174.
ZHANG Cheng-wei,WANG Biao,XU Gui-li.A Study on Classification of Camera Calibration Methods[J].Computer Technology and Development,2010,(11):174.
[4]杨晟,李学军,王珏,等.连续尺度复合分析核线重排列影像准稠密匹配[J].计算机技术与发展,2013,(04):111.
YANG Sheng,LI Xue-jun,WANG Jue,et al.Continuous Scale Multi-change Detecting Quasi-dense Matching for Epipolar Resample Images[J].Computer Technology and Development,2013,(04):111.
[5]卢振宇,郭星,魏赛,等.基于计算机视觉的虚拟安全空间预警技术[J].计算机技术与发展,2014,24(02):237.
LU Zhen-yu,GUO Xing,WEI Sai,et al.A Surveillance Technology for Virtual Security Space Based on Computer Vision[J].Computer Technology and Development,2014,24(02):237.
[6]李孟,周波,孟正大,等.三目立体相机的标定研究[J].计算机技术与发展,2015,25(02):69.
LI Meng,ZHOU Bo,MENG Zheng-da,et al.Study on Trinocular Stereo Camera Calibration[J].Computer Technology and Development,2015,25(02):69.
[7]施泽浩,赵启军.基于全卷积网络的目标检测算法[J].计算机技术与发展,2018,28(05):55.[doi:10.3969/j.issn.1673-629X.2018.05.013]
SHI Ze-hao,ZHAO Qi-jun.Object Detection Algorithm Based on Fully Convolutional Neural Network[J].Computer Technology and Development,2018,28(05):55.[doi:10.3969/j.issn.1673-629X.2018.05.013]
[8]程龙乐,许金林,李皙茹,等.基于图像处理的跑步机速度自适应技术研究[J].计算机技术与发展,2016,26(10):92.
CHENG Long-le,XU Jin-lin,LI Xi-ru,et al.Research on Speed-adaptive Technology of Treadmill Based on Image Processing[J].Computer Technology and Development,2016,26(10):92.
[9]严一鸣,郭星.基于计算机视觉的交互式电子沙盘系统研究[J].计算机技术与发展,2017,27(06):195.
YAN Yi-ming,GUO Xing.Investigation on Interactive Electronic Sand Table System with Computer Vision[J].Computer Technology and Development,2017,27(06):195.
[10]汪涛,成孝刚,李德志,等.基于霍夫变换与角点检测的叶脉特征提取算法[J].计算机技术与发展,2019,29(11):159.[doi:10.3969/j.issn.1673-629X.2019.11.032]
WANG Tao,CHENG Xiao-gang,LI De-zhi,et al.A Feature Extraction Algorithm for Leaf Vein Based on Hough Transform and Corner Detection[J].Computer Technology and Development,2019,29(11):159.[doi:10.3969/j.issn.1673-629X.2019.11.032]