[1] MULLENBACH J, WIEGREFFE S, DUKE J, et al. Explainable prediction of medical codes from clinical text[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers). Stroudsburg, USA: ACL, 2018: 1101-1111.
[2] XIE X C, XIONG Y, YU P S, et al. EHR coding with multi-scale feature attention and structured knowledge graph propagation[C]//Proceedings of the 28th ACM International Conference on Information and Knowledge Management. New York, USA: ACM Press, 2019: 649-658.
[3] CHALKIDIS I, FERGADIOTIS E, MALAKASIOTIS P, et al. Large-scale multi-label text classification on EU legislation[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: ACL, 2019: 6314-6322.
[4] 马慧芳, 贾美惠子, 李晓红, 等. 一种基于标签关联关系的微博推荐方法[J]. 计算机工程, 2016, 42(4): 197-201, 208.
MA H F, JIA M H Z, LI X J, et al. A microblog recommendation method based on label correlation relationship[J]. Computer Engineering, 2016, 42(4): 197-201, 208. (in Chinese)
[5] RIOS A, KAVULURU R. Few-shot and zero-shot multi-label learning for structured label spaces[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, USA: ACL, 2018: 3132-3142.
[6] BRUNA J, ZAREMBA W, SZLAM A, et al. Spectral networks and locally connected networks on graphs[C]//Proceedings of the 2014 International Conference on Learning Representations. Banff, Canada: [s.n.], 2014: 1-14.
[7] NIEPERT M, AHMED M, KUTZKOV K. Learning convolutional neural networks for graphs[C]//Proceedings of the 33rd International Conference on Machine Learning. New York, USA: JMLR, 2016: 2014-2023.
[8] VELICKOVIC P, CUCURULL G, CASANOVA A, et al. Graph attention networks[C]//Proceedings of the 2018 International Conference on Learning Representations. Vancouver, Canada: [s.n.], 2018: 1-12.
[9] HAMILTON W L, YING R, LESKOVEC J. Inductive representation learning on large graphs[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. New York, USA: ACM Press, 2017: 1025-1035.
[10] GUO S N, LIN Y F, FENG Y, et al. Attention based spatial-temporal graph convolutional networks for traffic flow forecasting[C]//Proceedings of 2019 AAAI Conference on Artificial Intelligence. Hawaii, USA: AAAI, 2019: 922-929.
[11] 刘月, 翟东海, 任庆宁. 基于注意力CNLSTM模型的新闻文本分类[J]. 计算机工程, 2019, 45(7): 303-308, 314.
LIU Y, ZHAI D H, REN Q N. News text classification based on CNLSTM model with attention mechanism[J]. Computer Engineering, 2019, 45(7): 303-308, 314. (in Chinese)
[12] WANG D X, LIN J B, CUI P, et al. A semi-supervised graph attentive network for financial fraud detection[C]//Proceedings of 2019 IEEE International Conference on Data Mining. Washington D.C., USA: IEEE Press, 2019: 598-607.
[13] XIE P T, XING E. A neural architecture for automated ICD coding[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, USA: ACL, 2018: 1066-1076.
[14] BEN-BARUCH E, RIDNIK T, ZAMIR N, et al. Asymmetric loss for multi-label classification[EB/OL]. [2021-04-01]. https://arxiv.org/abs/2009.14119.
[15] JOHNSON R, ZHANG T. Convolutional neural networks for text categorization: shallow word-level vs. deep character-level[EB/OL]. [2021-04-01]. https://arxiv.org/abs/1609.00718.
[16] LE Q V, MIKOLOV T. Distributed representations of sentences and documents[C]//Proceedings of the 31st International Conference on Machine Learning. Beijing, China: JMLR, 2014: 1188-1196.
[17] LI Y J, TARLOW D, BROCKSCHMIDT M, et al. Gated graph sequence neural networks[C]//Proceedings of the 2016 International Conference on Learning Representations. San Juan, USA: [s.n.], 2016: 273-283.
[18] SRIVASTAVA N, HINTON G E, KRIZHEVSKY A, et al. Dropout: a simple way to prevent neural networks from overfitting[J]. Journal of Machine Learning Research, 2014, 15(1): 1929-1958.
[19] KINGMA D P, BA J. Adam: a method for stochastic optimization[C]//Proceedings of the 2015 International Conference on Learning Representations. San Diego, USA: [s.n.], 2015: 1-15.
[20] KIPF T N, WELLING M. Semi-supervised classification with graph convolutional networks[C]//Proceedings of the 2017 International Conference on Learning Representations. Toulon, France: [s.n.], 2017: 1-14.
[21] KIRYO R, NIU G, DU PLESSIS M C, et al. Positive-unlabeled learning with non-negative risk estimator[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. [S.l.]: Curran Associates, 2017: 1674-1684.
[22] SHU S, LIN Z, YAN Y, et al. Learning from multi-class positive and unlabeled data[C]//Proceedings of 2020 IEEE International Conference on Data Mining. Washington D.C., USA: IEEE Press, 2020: 1256-1261.