[1] ZHOU Z G, CAO J W, DI S F. Overview of 3D lidar SLAM algorithms[J]. Chinese Journal of Scientific Instrument, 2021, 42(9): 13-27. (in Chinese)
[2] ZENG Q H, LUO Y X, SUN K C, et al. Review on SLAM technology development for vision and its fusion of inertial information[J]. Journal of Nanjing University of Aeronautics & Astronautics, 2022, 54(6): 1007-1020. (in Chinese)
[3] ZHANG J Y, YAO Y X, DENG B L. Fast and robust iterative closest point[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022, 44(7): 3450-3466.
[4] SEGAL A, HAEHNEL D, THRUN S. Generalized-ICP[EB/OL]. [2023-04-05]. https://www.semanticscholar.org/paper/Generalized-ICP-Segal-H%C3%A4hnel/b352b3a7f1068b2d562ba12a446628397dfe8a77.
[5] ZHANG J, SINGH S. LOAM: lidar odometry and mapping in real-time[EB/OL]. [2023-04-05]. https://www.researchgate.net/publication/282704722_LOAM_Lidar_Odometry_and_Mapping_in_Real-time.
[6] WANG J K, ZUO X X, ZHAO X R, et al. Review of multi-source fusion SLAM: current status and challenges[J]. Journal of Image and Graphics, 2022, 27(2): 368-389. (in Chinese)
[7] SHAN T X, ENGLOT B. LeGO-LOAM: lightweight and ground-optimized lidar odometry and mapping on variable terrain[C]//Proceedings of 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems. Washington D.C., USA: IEEE Press, 2018: 4758-4765.
[8] YE H Y, CHEN Y Y, LIU M. Tightly coupled 3D lidar inertial odometry and mapping[C]//Proceedings of International Conference on Robotics and Automation. Washington D.C., USA: IEEE Press, 2019: 3144-3150.
[9] SHAN T X, ENGLOT B, MEYERS D, et al. LIO-SAM: tightly-coupled lidar inertial odometry via smoothing and mapping[C]//Proceedings of 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems. Washington D.C., USA: IEEE Press, 2020: 5135-5142.
[10] XU W, ZHANG F. FAST-LIO: a fast, robust LiDAR-inertial odometry package by tightly-coupled iterated Kalman filter[J]. IEEE Robotics and Automation Letters, 2021, 6(2): 3317-3324.
[11] ZHANG J F, SHI C X, WANG Y Q. SLAM method based on visual features in dynamic scene[J]. Computer Engineering, 2020, 46(10): 95-102. (in Chinese)
[12] SCHAUER J, NUCHTER A. The peopleremover-removing dynamic objects from 3-D point cloud data by traversing a voxel occupancy grid[J]. IEEE Robotics and Automation Letters, 2018, 3(3): 1679-1686.
[13] QIAN C, XIANG Z, WU Z, et al. RF-LIO: removal-first tightly-coupled lidar inertial odometry in high dynamic environments[EB/OL]. [2023-04-05]. https://arxiv.org/abs/2206.09463.
[14] PFREUNDSCHUH P, HENDRIKX H F C, REIJGWART V, et al. Dynamic object aware LiDAR SLAM based on automatic generation of training data[C]//Proceedings of IEEE International Conference on Robotics and Automation. Washington D.C., USA: IEEE Press, 2021: 11641-11647.
[15] THRUN S. Probabilistic robotics[J]. Communications of the ACM, 2002, 45(3): 52-57.
[16] HASELICH M, JOBGEN B, WOJKE N, et al. Confidence-based pedestrian tracking in unstructured environments using 3D laser distance measurements[C]//Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems. Washington D.C., USA: IEEE Press, 2014: 4118-4123.
[17] ARTHUR D, VASSILVITSKII S. K-means++: the advantages of careful seeding[C]//Proceedings of the 18th Annual ACM-SIAM Symposium on Discrete Algorithms. New York, USA: ACM Press, 2007: 1027-1035.
[18] PREMEBIDA C, LUDWIG O, NUNES U. Exploiting LIDAR-based features on pedestrian detection in urban scenarios[C]//Proceedings of the 12th IEEE International Conference on Intelligent Transportation Systems. Washington D.C., USA: IEEE Press, 2009: 1-6.
[19] KIDONO K, MIYASAKA T, WATANABE A, et al. Pedestrian recognition using high-definition LIDAR[C]//Proceedings of IEEE Intelligent Vehicles Symposium. Washington D.C., USA: IEEE Press, 2011: 405-410.
[20] SOLA J. Quaternion kinematics for the error-state Kalman filter[EB/OL]. [2023-04-05]. https://arxiv.org/abs/1711.02508.
[21] XU W, CAI Y X, HE D J, et al. FAST-LIO2: fast direct LiDAR-inertial odometry[J]. IEEE Transactions on Robotics, 2022, 38(4): 2053-2073.
[22] ZHANG L T, HELMBERGER M, FU L F T, et al. Hilti-Oxford dataset: a millimeter-accurate benchmark for simultaneous localization and mapping[J]. IEEE Robotics and Automation Letters, 2023, 8(1): 408-415.
[23] LIM H, HWANG S, MYUNG H. ERASOR: egocentric ratio of pseudo occupancy-based dynamic object removal for static 3D point cloud map building[J]. IEEE Robotics and Automation Letters, 2021, 6(2): 2272-2279.
[24] KIM G, KIM A. Remove, then revert: static point cloud map construction using multiresolution range images[C]//Proceedings of 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems. Washington D.C., USA: IEEE Press, 2020: 10758-10765.
[25] SCHMID L, ANDERSSON O, SULSER A, et al. Dynablox: real-time detection of diverse dynamic objects in complex environments[EB/OL]. [2023-04-05]. https://arxiv.org/abs/2304.10049.
[26] FAN T X, SHEN B W, CHEN H, et al. DynamicFilter: an online dynamic objects removal framework for highly dynamic environments[C]//Proceedings of International Conference on Robotics and Automation. Washington D.C., USA: IEEE Press, 2022: 7988-7994.