Peer-Reviewed

Chinese NER with Softlexion and Residual Gated CNNs

Received: 20 April 2023    Accepted: 18 May 2023    Published: 29 May 2023
Abstract

Improving the accuracy and speed of Named Entity Recognition (NER), a key task in natural language processing, can further enhance downstream tasks. We propose a method combining residual gated convolutions with an attention mechanism to address the poor recognition of nested and ambiguous entities by convolutional layers that lack context. The stacked convolutional layers fuse local continuous features into global ones to better capture contextual semantic information. In addition, the embedding layer is optimized to fuse character and lexical information by introducing a dictionary, and is combined with a pre-trained BERT model that carries prior semantic knowledge; the decoding layer operates at the entity level to alleviate nested and ambiguous entities in long-sequence text. To avoid training the abundant parameters of the BERT model, the BERT layer is frozen and only the residual gated convolutional layer is updated during training. In experiments on the MSRA corpus, the BERT-softlexion-RGCNN-GP model outperforms the other models on the entity recognition task, with an F1 value of 94.96%, and it also trains faster than the bidirectional LSTM model. Our model not only maintains a more efficient training speed but also recognizes Chinese entities more precisely, which is of practical value in fields that require both accuracy and speed.
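To make the two core ideas in the abstract concrete, the following is a minimal sketch of (1) a residual gated 1-D convolution block and (2) entity-level span decoding in the GlobalPointer style. All shapes, kernel sizes, weights, and scores here are illustrative assumptions, not the paper's actual configuration; the real model computes span scores from BERT-derived features.

```python
import numpy as np

def conv1d(x, w):
    """'Same'-padded 1-D convolution over a (seq_len, dim) sequence.
    w has shape (kernel, dim, dim); a toy stand-in for a conv layer."""
    k, _, out_dim = w.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    out = np.zeros((x.shape[0], out_dim))
    for t in range(x.shape[0]):
        window = xp[t:t + k]                      # (kernel, dim)
        out[t] = np.einsum('kd,kde->e', window, w)
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def residual_gated_conv(x, w_feat, w_gate):
    """y = x + conv(x) * sigmoid(conv_gate(x)): the gate decides how much
    local convolutional evidence to mix into the residual stream, so the
    input is preserved when the gate (or feature conv) contributes nothing."""
    return x + conv1d(x, w_feat) * sigmoid(conv1d(x, w_gate))

def decode_spans(scores, threshold=0.0):
    """Entity-level decoding: every (start, end) span with start <= end is
    scored, and spans above the threshold are emitted as entities. Because
    spans are scored independently, a nested inner span and the longer span
    containing it can both fire."""
    seq_len = scores.shape[0]
    spans = []
    for s in range(seq_len):
        for e in range(s, seq_len):
            if scores[s, e] > threshold:
                spans.append((s, e))
    return spans

rng = np.random.default_rng(0)
seq_len, dim, kernel = 8, 4, 3
x = rng.normal(size=(seq_len, dim))
w_feat = rng.normal(scale=0.1, size=(kernel, dim, dim))
w_gate = rng.normal(scale=0.1, size=(kernel, dim, dim))
y = residual_gated_conv(x, w_feat, w_gate)
print(y.shape)  # the residual output keeps the input shape

scores = np.full((5, 5), -1.0)  # made-up span scores for illustration
scores[1, 2] = 2.5              # an inner span...
scores[1, 3] = 1.0              # ...and the longer span containing it
print(decode_spans(scores))     # → [(1, 2), (1, 3)]
```

The gated residual form means zero convolution weights reduce the block to the identity, which is what lets such layers stack deeply; the span-scoring decoder is what allows nested entities to coexist, unlike token-level BIO tagging.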

Published in American Journal of Computer Science and Technology (Volume 6, Issue 2)
DOI 10.11648/j.ajcst.20230602.13
Page(s) 67-73
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2024. Published by Science Publishing Group

Keywords

NER, BERT, Lexicon, Residual Gated CNNs

Cite This Article
  • APA Style

    Zhang Yinglin, Liu Changhui, Huang Shufen. (2023). Chinese NER with Softlexion and Residual Gated CNNs. American Journal of Computer Science and Technology, 6(2), 67-73. https://doi.org/10.11648/j.ajcst.20230602.13


Author Information
  • College of Computer Science and Engineering, Wuhan Institute of Technology, Wuhan, China

