[1] Ministry of Water Resources of the People's Republic of China.The first national water resources census bulletin[J].China Water Resources,2013(7):1-3.(in Chinese)
[2] BLANCO A,PARDO B F,CAVALARO S,et al.Lessons learned about the diagnosis of pathologies in concrete dams:30 years of research and practice[J].Construction and Building Materials,2019,197(2):356-368.
[3] LI Zongkun,GE Wei,WANG Juan,et al.Risk criteria and application on reservoir dams in China[J].Journal of Hydraulic Engineering,2015,46(5):567-583.(in Chinese)
[4] MONTERO R,VICTORES J G,MARTINEZ S,et al.Past,present and future of robotic tunnel inspection[J].Automation in Construction,2015,59(1):99-112.
[5] HUANG Zhen,FU Helin,CHEN Wei,et al.Damage detection and quantitative analysis of shield tunnel structure[J].Automation in Construction,2018,94(10):303-316.
[6] TAO Xian,HOU Wei,XU De.A survey of surface defect detection methods based on deep learning[EB/OL].[2020-01-30].http://kns.cnki.net/kcms/detail/11.2109.TP.20200402.1101.002.html.(in Chinese)
[7] ZHANG Lei,YANG Fan,ZHU Ying,et al.Road crack detection using deep convolutional neural network[C]//Proceedings of 2016 IEEE International Conference on Image Processing.Washington D.C.,USA:IEEE Press,2016:3708-3712.
[8] CHA Y J,CHOI W,SUH G,et al.Autonomous structural visual inspection using region-based deep learning for detecting multiple damage types[J].Computer-Aided Civil and Infrastructure Engineering,2018,33(9):731-747.
[9] HUANG Hongwei,LI Qingting,ZHANG Dongming.Deep learning based image recognition for crack and leakage defects of metro shield tunnel[J].Tunnelling and Underground Space Technology,2018,77(7):166-176.
[10] HAO Huaying,ZHAO Kun,SU Pan,et al.A corneal nerve segmentation algorithm based on improved ResU-Net[J].Computer Engineering,2021,47(1):217-223.(in Chinese)
[11] WU Z F,SHEN C H,HENGEL A V D.Wider or deeper:revisiting the ResNet model for visual recognition[J].Pattern Recognition,2019,90(11):119-133.
[12] GHOLAMI A,KWON K,WU B C,et al.SqueezeNext:hardware-aware neural network design[C]//Proceedings of 2018 IEEE Conference on Computer Vision and Pattern Recognition Workshops.Washington D.C.,USA:IEEE Press,2018:52-59.
[13] ARSALAN M,KIM D S,LEE M B,et al.FRED-Net:fully residual encoder-decoder network for accurate iris segmentation[J].Expert Systems with Applications,2019,122(1):217-241.
[14] IBTEHAZ N,RAHMAN M S.MultiResUNet:rethinking the U-Net architecture for multimodal biomedical image segmentation[J].Neural Networks,2020,121(1):74-87.
[15] LIN T Y,GOYAL P,GIRSHICK R,et al.Focal loss for dense object detection[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2018,99(7):2999-3007.
[16] WEN Yandong,ZHANG Kaipeng,LI Zhifeng,et al.A discriminative feature learning approach for deep face recognition[C]//Proceedings of 2016 European Conference on Computer Vision.Berlin,Germany:Springer,2016:121-129.
[17] WANG Runhan,LI Bing,TENG Qizhi.Core FIB-SEM image segmentation method based on convolutional neural network[J].Computer Engineering,2021,47(1):264-274.(in Chinese)
[18] BADRINARAYANAN V,KENDALL A,CIPOLLA R.SegNet:a deep convolutional encoder-decoder architecture for image segmentation[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2017,39(12):2481-2495.
[19] SUN Weiwei,WANG Ruisheng.Fully convolutional networks for semantic segmentation of very high resolution remotely sensed images combined with DSM[J].IEEE Geoscience and Remote Sensing Letters,2018,15(3):474-478.
[20] WANG Junqiang,LI Jiansheng,ZHOU Huachun,et al.Typical element extraction method of remote sensing image based on Deeplabv3+ and CRF[J].Computer Engineering,2019,45(10):260-265,271.(in Chinese)