1. BAO Z S, SONG B Y, ZHANG W B, et al. Research on named entity recognition of traditional Chinese medicine ancient books based on the combination of semi-supervised learning and rules. Journal of Chinese Information Processing, 2022, 36(6): 90-100.
2. LI N. Automatic extraction of alias in ancient local chronicles based on conditional random fields. Journal of Chinese Information Processing, 2018, 32(11): 41-48, 61. doi: 10.3969/j.issn.1003-0077.2018.11.006
3. ZHENG Y N, TIAN D G. Research on text classification based on GloVe and SVM. Software Guide, 2018, 17(6): 45-48, 52. doi: 10.11907/rjdk.172991
4. GUI T, MA R T, ZHANG Q, et al. CNN-based Chinese NER with lexicon rethinking[C]//Proceedings of the 28th International Joint Conference on Artificial Intelligence. [S.l.]: International Joint Conferences on Artificial Intelligence Organization, 2019: 4982-4988.
5. COLLOBERT R, WESTON J, BOTTOU L, et al. Natural language processing (almost) from scratch. Journal of Machine Learning Research, 2011, 12: 2493-2537.
6. LIN J C W, SHAO Y N, DJENOURI Y, et al. ASRNN: a recurrent neural network with an attention model for sequence labeling. Knowledge-Based Systems, 2021, 212: 106548. doi: 10.1016/j.knosys.2020.106548
7.
8. VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. [S.l.]: Curran Associates Inc., 2017: 6000-6010.
9. LI X N, YAN H, QIU X P, et al. FLAT: Chinese NER using flat-lattice Transformer[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2020: 6836-6842.
10. ZHANG Y, YANG J. Chinese NER using lattice LSTM[C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, USA: Association for Computational Linguistics, 2018: 1554-1564.
11. DU J F, GRAVE E, GUNEL B, et al. Self-training improves pre-training for natural language understanding[C]//Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg, USA: Association for Computational Linguistics, 2021: 4171-4186.
12. LAN Z Z, CHEN M D, GOODMAN S, et al. ALBERT: a lite BERT for self-supervised learning of language representations[EB/OL]. [2023-07-14]. http://arxiv.org/abs/1909.11942.
13. LIU W, FU X Y, ZHANG Y, et al. Lexicon enhanced Chinese sequence labeling using BERT adapter[C]//Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Stroudsburg, USA: Association for Computational Linguistics, 2021: 5847-5858.
14. SUI D B, CHEN Y B, LIU K, et al. Leverage lexical knowledge for Chinese named entity recognition via collaborative graph network[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Stroudsburg, USA: Association for Computational Linguistics, 2019: 3828-3838.
15.
16. MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[EB/OL]. [2023-07-14]. http://arxiv.org/abs/1301.3781.
17. ZHANG S, WANG L J, SUN K, et al. A practical Chinese dependency parser based on a large-scale dataset[EB/OL]. [2023-07-14]. http://arxiv.org/abs/2009.00901.
18. ZENG Y, YANG H H, FENG Y S, et al. A convolution BiLSTM neural network model for Chinese event extraction[M]. Berlin, Germany: Springer International Publishing, 2016.
19.
20.
21.
22. LAMPLE G, BALLESTEROS M, SUBRAMANIAN S, et al. Neural architectures for named entity recognition[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg, USA: Association for Computational Linguistics, 2016: 260-270.
23. XUE M G, YU B W, LIU T W, et al. Porous lattice Transformer encoder for Chinese NER[C]//Proceedings of the 28th International Conference on Computational Linguistics. Stroudsburg, USA: International Committee on Computational Linguistics, 2020: 3831-3841.
24. LU N J, ZHENG J, WU W, et al. Chinese clinical named entity recognition with word-level information incorporating dictionaries[C]//Proceedings of the International Joint Conference on Neural Networks (IJCNN). Washington D.C., USA: IEEE Press, 2019: 1-8.
25. QIU J H, ZHOU Y M, WANG Q, et al. Chinese clinical named entity recognition using residual dilated convolutional neural network with conditional random field. IEEE Transactions on NanoBioscience, 2019, 18(3): 306-315. doi: 10.1109/TNB.2019.2908678
26. TANG G Q, GAO D Q, RUAN T, et al. Clinical electronic medical record named entity recognition incorporating language model and attention mechanism. Computer Science, 2020, 47(3): 211-216. doi: 10.11896/jsjkx.190200259
27. LI X Y, ZHANG H, ZHOU X H. Chinese clinical named entity recognition with variant neural structures based on BERT methods. Journal of Biomedical Informatics, 2020, 107: 103422. doi: 10.1016/j.jbi.2020.103422
28. LUO L, YANG Z H, SONG Y W, et al. Chinese clinical named entity recognition based on stroke ELMo and multi-task learning. Chinese Journal of Computers, 2020, 43(10): 1943-1957. doi: 10.11897/SP.J.1016.2020.01943
29. SONG Y, SHI S M, LI J, et al. Directional skip-gram: explicitly distinguishing left and right context for word embeddings[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers). Stroudsburg, USA: Association for Computational Linguistics, 2018: 175-180.