Zhang Weizhong, Zhang Liyan, Pan Zhenkuan, et al. Close-range photogrammetric system based on reference points[J]. Journal of Southeast University (Natural Science Edition), 2006, 36(5): 741-745. [doi:10.3969/j.issn.1001-0505.2006.05.012]

Close-range photogrammetric system based on reference points

Journal of Southeast University (Natural Science Edition) [ISSN: 1001-0505 / CN: 32-1178/N]

Volume:
36
Issue:
2006, No. 5
Pages:
741-745
Section:
Computer Science and Engineering
Publication date:
2006-09-20

Article Info

Title:
Close-range photogrammetric system based on reference points
Authors (Chinese):
张维中 1,2; 张丽艳 2; 潘振宽 1; 王小平 2; 周玲 2
Author(s):
Zhang Weizhong 1,2; Zhang Liyan 2; Pan Zhenkuan 1; Wang Xiaoping 2; Zhou Ling 2
1 College of Information Engineering, Qingdao University, Qingdao 266071, China
2 Research Center of CAD/CAM Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
Keywords:
reference point matching; coded and non-coded points; subpixel; weighted iterative eigen algorithm; 3D reconstruction
CLC number:
TP391
DOI:
10.3969/j.issn.1001-0505.2006.05.012
Abstract:
A robust photogrammetric system for 3D reconstruction based on reference points is described. The system uses two kinds of reference points: coded and non-coded. To minimize the probability of mismatching non-coded points between different images, a new matching approach based on coded points is proposed: the matching of non-coded points between images starts from the coded points, and mismatches are eliminated by similarity, ambiguity, and distance-constraint-error criteria, yielding a very high correct-matching ratio. A new weighted iterative eigen algorithm based on reference points is adopted: the projection matrix of each camera is recovered from the coded points, which determines the camera's external pose parameters, and the 3D coordinates are then recovered from the non-coded points. Compared with the existing weighted iterative eigen algorithm, this algorithm avoids using all points to compute the camera projection matrix and is therefore faster. The 3D reconstruction accuracy is further improved by a subpixel method for locating the centers of the reference points. Experimental results show that the system is robust and accurate in 3D reconstruction.
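The abstract attributes part of the accuracy gain to subpixel location of the reference-point centers. The paper's exact operator is not reproduced here; a common choice for bright circular markers is the intensity-weighted centroid. The sketch below is a minimal illustration of that idea, not the authors' implementation; the function name and the use of NumPy are assumptions.

```python
import numpy as np

def subpixel_center(patch):
    """Estimate the center of a bright circular marker in `patch`
    (a 2D grayscale array) with subpixel accuracy, using the
    intensity-weighted centroid of the pixel values."""
    patch = patch.astype(float)
    patch = patch - patch.min()          # suppress the background offset
    total = patch.sum()
    ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    cx = (xs * patch).sum() / total      # x center in pixel coordinates
    cy = (ys * patch).sum() / total      # y center in pixel coordinates
    return cx, cy
```

In practice the patch would be a small window cut around a detected marker, after thresholding or background removal, so that the centroid is not biased by neighboring image content.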

References:

[1] Stefano L D,Marchionni M,Mattoccia S.A fast area-based stereo matching algorithm[J].Image and Vision Computing,2004,22(12):983-1005.
[2] Carcassoni M,Hancock E R.Correspondence matching with modal clusters[J]. IEEE Trans on Pattern Analysis and Machine Intelligence,2003,25(12):1609-1615.
[3] Galo M,Tozzi C L.Feature-point based matching:a sequential approach based on relaxation labeling and relative orientation[J]. Journal of WSCG,Science Press,2004,12(1):1-8.
[4] Chen G Q,Medioni G G.Practical algorithms for stratified structure-from-motion[J]. Image and Vision Computing,2002,20(2):103-123.
[5] Xu J,Fang Z P,Malcolm A A,et al.A robust close-range photogrammetric system for industrial metrology[C] //The 7th International Conference on Control,Automation,Robotics and Vision.Singapore:IEEE Computer Society Press,2002:114-119.
[6] Faugeras O.Three-dimensional computer vision-a geometric viewpoint[M].Cambridge,MA,USA:Massachusetts Institute of Technology Press,1993.
[7] Zhang Zhengyou,Deriche R,Faugeras O,et al.A robust technique for matching two uncalibrated images through the recovery of the unknown epipolar geometry[J].Artificial Intelligence,1995,78(1,2):87-119.
[8] Hartley R,Sturm P.Triangulation[J].Computer Vision and Image Understanding,1997,68(2):146-157.
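Reference [8] (Hartley and Sturm) concerns triangulation, the step in which a matched image point is lifted to 3D coordinates once the camera projection matrices are known. As a hedged illustration of that step (not the specific algorithm of this paper), the standard linear (DLT) triangulation from two views can be sketched as follows; the function name and NumPy usage are assumptions.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 camera projection matrices.
    x1, x2: (u, v) pixel coordinates of the matched point in each image.
    Each observation gives two rows of a homogeneous system A X = 0."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest
    # singular value, i.e. the (approximate) null space of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]      # dehomogenize to (X, Y, Z)
```

With noisy observations this linear solution is only a starting point; [8] discusses optimal refinements that minimize reprojection error.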

Memo:
Fund: Supported by the National Natural Science Foundation of China (50475041).
Biographies: Zhang Weizhong (1963—), male, doctoral candidate, professor, zhangwz_01@163.com; Zhang Liyan (corresponding author), female, professor, doctoral supervisor, zhangly@nuaa.edu.cn.
Last update: 2006-09-20