[1] SINGH S, GYAOUROVA A, BEBIS G, et al. Infrared and visible image fusion for face recognition[EB/OL]. [2021-05-07]. http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=12F273C9531B20ED4AE12509A4F42504?doi=10.1.1.9.5885&rep=rep1&type=pdf.
[2] SIMONE G, FARINA A, MORABITO F C, et al. Image fusion techniques for remote sensing applications[J]. Information Fusion, 2002, 3(1): 3-15.
[3] MA J, MA Y, LI C. Infrared and visible image fusion methods and applications: a survey[J]. Information Fusion, 2019, 45: 153-178.
[4] CHEN J, LI X J, LUO L B, et al. Infrared and visible image fusion based on target-enhanced multiscale transform decomposition[J]. Information Sciences, 2020, 508: 64-78.
[5] 沈瑜, 陈小朋, 刘成, 等. 基于混合模型驱动的红外与可见光图像融合[J]. 控制与决策, 2021, 36(9): 2143-2151.
SHEN Y, CHEN X P, LIU C, et al. Infrared and visible image fusion based on hybrid model driving[J]. Control and Decision, 2021, 36(9): 2143-2151. (in Chinese)
[6] LIU Y, CHEN X, CHENG J, et al. Infrared and visible image fusion with convolutional neural networks[J]. International Journal of Wavelets, Multiresolution and Information Processing, 2018, 16(3): 1-10.
[7] MA J Y, YU W, LIANG P W, et al. FusionGAN: a generative adversarial network for infrared and visible image fusion[J]. Information Fusion, 2019, 48: 11-26.
[8] LI J, HUO H T, LIU K J, et al. Infrared and visible image fusion using dual discriminators generative adversarial networks with Wasserstein distance[J]. Information Sciences, 2020, 529: 28-41.
[9] HU J, SHEN L, SUN G. Squeeze-and-excitation networks[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. Washington D.C., USA: IEEE Press, 2018: 7132-7141.
[10] WANG F, JIANG M Q, QIAN C, et al. Residual attention network for image classification[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Washington D.C., USA: IEEE Press, 2017: 6450-6458.
[11] 陈一鸣, 周登文. 基于自适应级联的注意力网络的超分辨重建[J/OL]. 自动化学报: 1-11 [2021-07-30]. https://doi.org/10.16383/j.aas.c200035.
CHEN Y M, ZHOU D W. Super-resolution reconstruction of attention network based on adaptive cascade[J/OL]. Acta Automatica Sinica: 1-11 [2021-07-30]. https://doi.org/10.16383/j.aas.c200035. (in Chinese)
[12] ARJOVSKY M, CHINTALA S, BOTTOU L. Wasserstein generative adversarial networks[C]//Proceedings of the 34th International Conference on Machine Learning. New York, USA: ACM Press, 2017: 214-223.
[13] MIYATO T, KATAOKA T, KOYAMA M, et al. Spectral normalization for generative adversarial networks[EB/OL]. [2021-05-07]. https://arxiv.org/pdf/1802.05957.pdf.
[14] LI H, WU X J, DURRANI T. NestFuse: an infrared and visible image fusion architecture based on nest connection and spatial/channel attention models[J]. IEEE Transactions on Instrumentation and Measurement, 2020, 69(12): 9645-9656.
[15] 蔡体健, 彭潇雨, 石亚鹏, 等. 通道注意力与残差级联的图像超分辨率重建[J]. 光学精密工程, 2021, 29(1): 142-151.
CAI T J, PENG X Y, SHI Y P, et al. Channel attention and residual concatenation network for image super-resolution[J]. Optics and Precision Engineering, 2021, 29(1): 142-151. (in Chinese)
[16] 陈卓, 方明, 柴旭, 等. 红外与可见光图像融合的U-GAN模型[J]. 西北工业大学学报, 2020, 38(4): 904-912.
CHEN Z, FANG M, CHAI X, et al. U-GAN model for infrared and visible images fusion[J]. Journal of Northwestern Polytechnical University, 2020, 38(4): 904-912. (in Chinese)
[17] ALEXANDER T. TNO image fusion dataset[EB/OL]. [2021-05-07]. https://figshare.com/articles/TNO-Image-Fusion-Dataset/1008029.
[18] MA J Y, TANG L F, XU M L, et al. STDFusionNet: an infrared and visible image fusion network based on salient target detection[J]. IEEE Transactions on Instrumentation and Measurement, 2021, 70: 1-13.
[19] 刘佳, 李登峰. 马氏距离与引导滤波加权的红外与可见光图像融合[J]. 红外技术, 2021, 43(2): 162-169.
LIU J, LI D F. Infrared and visible light image fusion based on Mahalanobis distance and guided filter weighting[J]. Infrared Technology, 2021, 43(2): 162-169. (in Chinese)
[20] BURT P J, ADELSON E H. The Laplacian pyramid as a compact image code[J]. IEEE Transactions on Communications, 1983, 31(4): 532-540.
[21] MA J Y, ZHANG H, SHAO Z F, et al. GANMcC: a generative adversarial network with multiclassification constraints for infrared and visible image fusion[J]. IEEE Transactions on Instrumentation and Measurement, 2021, 70: 1-14.