[1] CADENA C, CARLONE L, CARRILLO H, et al. Past, present, and future of simultaneous localization and mapping: toward the robust-perception age[J]. IEEE Transactions on Robotics, 2016, 32(6): 1309-1332.
[2] HUANG B C, ZHAO J, LIU J B. A survey of simultaneous localization and mapping[EB/OL]. [2021-07-01]. https://www.xueshufan.com/publication/2972879745.
[3] WEISS S, SIEGWART R. Real-time metric state estimation for modular vision-inertial systems[C]//Proceedings of IEEE International Conference on Robotics and Automation. Washington D.C., USA: IEEE Press, 2011: 4531-4537.
[4] KNEIP L, WEISS S, SIEGWART R. Deterministic initialization of metric state estimation filters for loosely-coupled monocular vision-inertial systems[C]//Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems. Washington D.C., USA: IEEE Press, 2011: 2235-2241.
[5] BLOESCH M, BURRI M, OMARI S, et al. Iterated extended Kalman filter based visual-inertial odometry using direct photometric feedback[J]. The International Journal of Robotics Research, 2017, 36(10): 1053-1072.
[6] LEUTENEGGER S, LYNEN S, BOSSE M, et al. Keyframe-based visual-inertial odometry using nonlinear optimization[J]. The International Journal of Robotics Research, 2015, 34(3): 314-334.
[7] PUMAROLA A, VAKHITOV A, AGUDO A, et al. PL-SLAM: real-time monocular visual SLAM with points and lines[C]//Proceedings of IEEE International Conference on Robotics and Automation. Washington D.C., USA: IEEE Press, 2017: 4503-4508.
[8] WEI X Y, HUANG J, MA X Y. Real-time monocular visual SLAM by combining points and lines[C]//Proceedings of IEEE International Conference on Multimedia and Expo. Washington D.C., USA: IEEE Press, 2019: 103-108.
[9] ZHANG F K, RUI T, YANG C S, et al. LAP-SLAM: a line-assisted point-based monocular VSLAM[J]. Electronics, 2019, 8(2): 243-247.
[10] CHENG S, YANG J, KANG Z, et al. A scene-assisted point-line feature based visual SLAM method for autonomous flight in unknown indoor environments[J]. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 2019, 13: 777-783.
[11] ZUO X X, XIE X J, LIU Y, et al. Robust visual SLAM with point and line features[C]//Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems. Washington D.C., USA: IEEE Press, 2017: 1775-1782.
[12] GOMEZ-OJEDA R, MORENO F A, ZUÑIGA-NOËL D, et al. PL-SLAM: a stereo SLAM system through the combination of points and line segments[J]. IEEE Transactions on Robotics, 2019, 35(3): 734-746.
[13] MUR-ARTAL R, MONTIEL J M M, TARDÓS J D. ORB-SLAM: a versatile and accurate monocular SLAM system[J]. IEEE Transactions on Robotics, 2015, 31(5): 1147-1163.
[14] MUR-ARTAL R, TARDÓS J D. ORB-SLAM2: an open-source SLAM system for monocular, stereo, and RGB-D cameras[J]. IEEE Transactions on Robotics, 2017, 33(5): 1255-1262.
[15] HE Y J, ZHAO J, GUO Y, et al. PL-VIO: tightly-coupled monocular visual-inertial odometry using point and line features[J]. Sensors, 2018, 18(4): 1159.
[16] FU Q, WANG J L, YU H S, et al. PL-VINS: real-time monocular visual-inertial SLAM with point and line features[EB/OL]. [2021-07-01]. https://arxiv.org/abs/2009.07462.
[17] ZHAO S B, FANG Z, LI H L, et al. A robust laser-inertial odometry and mapping method for large-scale highway environments[C]//Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems. Washington D.C., USA: IEEE Press, 2019: 1285-1292.
[18] SHAN T X, ENGLOT B, MEYERS D, et al. LIO-SAM: tightly-coupled lidar inertial odometry via smoothing and mapping[C]//Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems. Washington D.C., USA: IEEE Press, 2020: 5135-5142.
[19] SHAN T X, ENGLOT B, RATTI C, et al. LVI-SAM: tightly-coupled lidar-visual-inertial odometry via smoothing and mapping[C]//Proceedings of IEEE International Conference on Robotics and Automation. Washington D.C., USA: IEEE Press, 2021: 5692-5698.
[20] QIN T, LI P L, SHEN S J. VINS-Mono: a robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020.
[21] GÁLVEZ-LÓPEZ D, TARDÓS J D. Bags of binary words for fast place recognition in image sequences[J]. IEEE Transactions on Robotics, 2012, 28(5): 1188-1197.
[22] LUCAS B D, KANADE T. An iterative image registration technique with an application to stereo vision[EB/OL]. [2021-07-01]. https://www.semanticscholar.org/paper/An-Iterative-Image-Registration-Technique-with-an-Lucas-Kanade/a06547951c97b2a32f23a6c2b5f79c8c75c9b9bd.
[23] HARTLEY R, ZISSERMAN A. Multiple view geometry in computer vision[M]. Cambridge, UK: Cambridge University Press, 2003.
[24] ZHANG L L, KOCH R. An efficient and robust line segment matching approach based on LBD descriptor and pairwise geometric consistency[J]. Journal of Visual Communication and Image Representation, 2013, 24(7): 794-805.
[25] DAI Z C, LI X N, CHEN Z Z, et al. Variable-weight indoor fingerprinting localization algorithm based on KNN algorithm[J]. Computer Engineering, 2019, 45(6): 310-314. (in Chinese)
[26] ZHANG G X, LEE J H, LIM J, et al. Building a 3D line-based map using stereo SLAM[J]. IEEE Transactions on Robotics, 2015, 31(6): 1364-1377.
[27] FORSTER C, CARLONE L, DELLAERT F, et al. On-manifold preintegration for real-time visual-inertial odometry[J]. IEEE Transactions on Robotics, 2016, 33(1): 1-21.
[28] CALONDER M, LEPETIT V, STRECHA C, et al. BRIEF: binary robust independent elementary features[C]//Proceedings of European Conference on Computer Vision. Berlin, Germany: Springer, 2010: 778-792.
[29] KAESS M, JOHANNSSON H, ROBERTS R, et al. iSAM2: incremental smoothing and mapping using the Bayes tree[J]. The International Journal of Robotics Research, 2012, 31(2): 216-235.
[30] ZHANG J, KAESS M, SINGH S. On degeneracy of optimization-based state estimation problems[C]//Proceedings of IEEE International Conference on Robotics and Automation. Washington D.C., USA: IEEE Press, 2016: 809-816.